NVIDIA GeForce GTX 650 Ti BOOST
Reviews - Featured Reviews: Video Cards
Written by Olin Coles   
Tuesday, 26 March 2013

NVIDIA GeForce GTX 650 Ti BOOST Video Card

Manufacturer: NVIDIA Corporation
Product Name: GeForce GTX 650 Ti BOOST
Retail Price: Starting at $169.99 (Amazon | Newegg)

Full Disclosure: The product sample used in this article has been provided by NVIDIA.

With the economy on the rebound, gamers are coming out of hibernation with a hunger for modern DirectX 11 graphics and realism. With AMD all but absent from the scene, NVIDIA has timed its affordable mainstream video card launch perfectly. Based on the NVIDIA Kepler GK106 architecture, the GeForce GTX 650 Ti BOOST delivers 2GB of 1502 MHz GDDR5 memory and 768 CUDA Cores operating at 980 MHz, boosting to 1033 MHz and beyond with NVIDIA GPU Boost technology. In this article, Benchmark Reviews tests the NVIDIA GeForce GTX 650 Ti BOOST graphics card using several highly demanding DX11 video games.

By tradition, NVIDIA's GeForce GTX series offers enthusiast-level performance with features like multi-card SLI pairing. More recently, the GTX family has added GPU Boost, an application-driven variable overclocking technology. The GeForce GTX 650 Ti BOOST keeps with that tradition, offering the capable GK106 GPU with 768 CUDA cores clocked to 980 MHz and 2GB of GDDR5 vRAM. Next month the GTX 650 Ti BOOST will also become available in a less-expensive ($150) 1GB GDDR5 version. Of course, NVIDIA's Kepler GPU architecture adds proprietary features to both versions, such as 3D Vision, Adaptive Vertical Sync, multi-display Surround, PhysX, and TXAA antialiasing.

The NVIDIA GeForce GTX 650 Ti BOOST features a reference design that includes a 28nm Kepler GK106 GPU, which houses four SMX units and offers 768 CUDA Cores operating at a fixed base clock speed of 980 MHz with 64 texture units. There's 2048 MB of GDDR5 video memory good for 144.2 GB/s bandwidth over a 192-bit interface, all clocked to 1502 MHz (6008 MHz data rate). In comparison to GeForce GTX 650, the new GTX 650 Ti BOOST offers twice the number of CUDA Cores and texture units, with faster core clocks and much more memory bandwidth. GTX 650 Ti BOOST adds faster speeds, more memory, and GPU Boost technology to the GTX 650 Ti.
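As a quick sanity check, the 144.2 GB/s bandwidth figure follows directly from these memory specifications. A minimal Python sketch (the 4x multiplier reflects GDDR5's quad-pumped signaling):

```python
# GDDR5 transfers four data words per clock cycle, so the 1502 MHz
# memory clock produces an effective 6008 MT/s data rate.
memory_clock_mhz = 1502
data_rate_mts = memory_clock_mhz * 4  # 6008 MT/s

# A 192-bit bus moves 192 / 8 = 24 bytes per transfer.
bus_width_bits = 192
bandwidth_gbs = data_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(f"{data_rate_mts} MT/s -> {bandwidth_gbs:.1f} GB/s")  # 6008 MT/s -> 144.2 GB/s
```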

NVIDIA-GeForce-GTX-650-Ti-BOOST-Angle.jpg

Of the many platforms available for gamers to enjoy video games, there's no question that the highest quality graphics come from the PC. While game developers might not consider PC gaming as lucrative as entertainment consoles, companies like NVIDIA use desktop graphics to set the benchmark for smaller, more compact designs that make it into notebooks, tablets, and smartphone devices. NVIDIA's Kepler GPU architecture is an example of this, delivering unprecedented performance while operating cooler and consuming far less power than previous-generation graphics cards. Gamers upgrading from the GeForce 9600 GT may see up to a 600% performance increase, or 200% over the GeForce GTX 550 Ti.

GeForce GTX 650 Ti BOOST offers all the same high-end features found on the top-end GTX video cards but with a much more affordable price tag. In addition to a new and improved Kepler GPU architecture with NVIDIA GPU Boost technology, the GeForce GTX 650 Ti BOOST video card delivers further refinement to the user experience. Smoother FXAA and adaptive vSync technology result in less chop, stutter, and tearing in on-screen motion. Adaptive vSync adjusts the monitor's refresh rate whenever the FPS rate becomes too low to properly sustain vertical sync, thereby reducing stutter and tearing artifacts. NVIDIA TXAA delivers a film-style anti-aliasing technique with a mix of hardware post-processing, custom CG film-style AA resolve, and an optional temporal component for better image quality.

NVIDIA's product stack includes support for the following graphics cards (as of March 2013):

  • GeForce GTX TITAN
  • GeForce GTX 690
  • GeForce GTX 680
  • GeForce GTX 670
  • GeForce GTX 660 Ti
  • GeForce GTX 660
  • GeForce GTX 650 Ti BOOST
  • GeForce GTX 650 Ti
  • GeForce GTX 650
  • GeForce GT 640
  • GeForce GT 630
  • GeForce GT 620
  • GeForce GT 610
  • GeForce 210

First Look: GeForce GTX 650 Ti BOOST

This review examines the NVIDIA GeForce GTX 650 Ti BOOST video card in its reference design, available for $169.99 (Amazon | Newegg). The GeForce GTX 650 Ti BOOST is a double-bay graphics card measuring 1.5" tall, 3.9" wide, and 9.5" long, and it will fit into nearly all mid-tower computer case enclosures with room to spare. GeForce GTX 650 Ti BOOST shares an identical profile with GeForce GTX 670, which makes it shorter than NVIDIA's GeForce GTX 570 and GeForce GTX 580, as well as AMD's Radeon HD 6970 and Radeon HD 7970 (each 10.5" long).

NVIDIA-GeForce-GTX-650-Ti-BOOST-Top.jpg

A rear-mounted 60mm (2.4") blower fan with a slight offset takes advantage of the chamfered depression to draw cool air into the angled fan shroud, allowing more air to reach the intake whenever two or more video cards are combined in close-proximity SLI configurations. NVIDIA's add-in card partners with engineering resources may incorporate their own cooling solutions into the GTX 650 Ti BOOST, but most brands are likely to adopt the cool-running reference design.

NVIDIA-GeForce-GTX-650-Ti-BOOST-Power.jpg

Specified at a 136W Thermal Design Power (TDP), the GeForce GTX 650 Ti BOOST requires less power than its predecessor and several other flagship products. Because TDP demands have been reduced, NVIDIA's GeForce GTX 650 Ti BOOST has also reduced its power supply requirements to a single six-pin PCI-E power connection, located on the side of the video card so it fits better into small enclosures (illustrated above).

GeForce GTX 650 Ti BOOST offers two simultaneously functional dual-link DVI (DL-DVI) connections, a full-size HDMI 1.4a output, and a DisplayPort 1.2 connection. Add-in partners may elect to remove or further extend any of these video interfaces, but most will likely retain the original engineering. A single card is enough to drive triple displays and NVIDIA 3D Vision Surround, using both DL-DVI connections plus either the HDMI or DisplayPort connection for the third output. All of these video interfaces consume exhaust-vent real estate, but this has very little impact on cooling, both because the 28nm Kepler GPU generates much less heat than past GeForce processors and because NVIDIA intentionally positions the heatsink far enough from these vents to equalize exhaust pressure.

NVIDIA-GeForce-GTX-650-Ti-BOOST-IO-Bracket.jpg

As with past-generation GeForce GTX series graphics cards, the GeForce GTX 650 Ti BOOST is capable of SLI - but limited to two-card configurations. Because GeForce GTX 650 Ti BOOST is a PCI-Express 3.0-compliant device, the added bandwidth could potentially come into demand as future games and applications make use of these resources. Most games work well using moderate settings on a single GeForce GTX 650 Ti BOOST graphics card, but multi-card SLI configurations are perfect for gamers wanting to experience high-performance video games played at their best quality settings with bells and whistles enabled.

NVIDIA-GeForce-GTX-650-Ti-BOOST-Angle.jpg

The memory subsystem on GeForce GTX 650 Ti BOOST delivers a 2GB GDDR5 video frame buffer that produces 144.2 GB/s total memory bandwidth at a noteworthy 6008 MHz data rate. Three memory controllers combine to form a 192-bit memory interface, which moves data more efficiently than previous designs and helps the GPU sustain its texture fill rate of 62.7 GigaTexels per second. GeForce GTX 650 Ti BOOST is a backwards-compatible, PCI-Express 3.0-compliant graphics device, although the card's 192-bit memory interface makes it unlikely the added bus bandwidth will ever be fully saturated by the demands of this video card.
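The quoted 62.7 GigaTexel/s texture fill rate is likewise just the product of the GPU's 64 texture units and its 980 MHz base clock, as a small Python sketch shows:

```python
# Peak texture fill rate: each of the GK106's texture units can
# sample one texel per clock at the fixed 980 MHz base clock.
texture_units = 64
base_clock_mhz = 980

fill_rate_gtexels = texture_units * base_clock_mhz / 1000  # MTexel/s -> GTexel/s
print(f"{fill_rate_gtexels:.1f} GTexel/s")  # 62.7 GTexel/s
```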

NVIDIA-GeForce-GTX-650-Ti-BOOST-PCB-Back.jpg

The card's exposed printed circuit board on the backside reveals an interesting discovery: GeForce GTX 650 Ti BOOST uses a much smaller PCB than the video card profile suggests. Past GeForce products generally use a shroud to cover the entire length of the circuit board, but on the GTX 650 Ti BOOST the PCB measures only 7.0" of this 9.5" card, with a 2.5" extension to support the cooling fan.

In the next section, we detail our test methodology and give specifications for all of the benchmarks and equipment used in our testing process...

VGA Testing Methodology

The Microsoft DirectX-11 graphics API is native to the Microsoft Windows 7 Operating System, which serves as the primary O/S for our test platform. DX11 is also available as a Microsoft Update for the Windows Vista O/S, so our test results apply to both versions of the Operating System. The majority of benchmark tests used in this article measure DX11 performance; however, some high-demand DX10 tests have also been included.

GPU-Z-NVIDIA-GeForce-GTX-650-Ti-BOOST-Video-Card.png

In each benchmark test there is one 'cache run' conducted, followed by five recorded test runs. Results are collected at each setting, with the highest and lowest results discarded. The remaining three results are averaged and displayed in the performance charts on the following pages.
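The run-averaging procedure described above amounts to a simple trimmed mean; a minimal Python sketch (the sample frame rates are hypothetical, for illustration only):

```python
def summarize_runs(fps_runs):
    """Average five recorded runs after discarding the highest and
    lowest result, per the methodology described above."""
    assert len(fps_runs) == 5
    trimmed = sorted(fps_runs)[1:-1]  # drop the min and max results
    return sum(trimmed) / len(trimmed)

# Hypothetical frame-rate samples from five recorded runs:
runs = [58.1, 61.4, 60.2, 59.7, 74.9]
print(round(summarize_runs(runs), 1))  # 60.4 -- the 74.9 outlier is discarded
```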

A combination of synthetic and video game benchmark tests have been used in this article to illustrate relative performance among graphics solutions. Our benchmark frame rate results are not intended to represent real-world graphics performance, as this experience would change based on supporting hardware and the perception of individuals playing the video game.

Intel X79 Express Test System

DirectX-10 Benchmark Applications

  • Crysis Warhead v1.1 with HOC Benchmark
    • Settings: Airfield Demo, Very High Quality, 4x AA, 16x AF

DirectX-11 Benchmark Applications

  • 3DMark11 Professional Edition by Futuremark
    • Settings: Performance Level Preset, 1280x720, 1x AA, Trilinear Filtering, Tessellation level 5
  • Aliens vs Predator Benchmark 1.0
    • Settings: Very High Quality, 4x AA, 16x AF, SSAO, Tessellation, Advanced Shadows
  • Batman: Arkham City
    • Settings: 8x AA, 16x AF, MVSS+HBAO, High Tessellation, Extreme Detail, PhysX Disabled
  • BattleField 3
    • Settings: Ultra Graphics Quality, FOV 90, 180-second Fraps Scene
  • Lost Planet 2 Benchmark 1.0
    • Settings: Benchmark B, 4x AA, Blur Off, High Shadow Detail, High Texture, High Render, High DirectX 11 Features
  • Metro 2033 Benchmark
    • Settings: Very-High Quality, 4x AA, 16x AF, Tessellation, PhysX Disabled
  • Unigine Heaven Benchmark 3.0
    • Settings: DirectX 11, High Quality, Extreme Tessellation, 16x AF, 4x AA

PCI-Express Graphics Cards

| Graphics Card | GPU Cores | Core Clock (MHz) | Boost/Shader Clock (MHz) | Memory Clock (MHz) | Memory Amount | Memory Interface |
|---|---|---|---|---|---|---|
| Radeon HD 7770 | 640 | 1000 | N/A | 1125 | 1024MB GDDR5 | 128-bit |
| GeForce GTX 650 Ti | 768 | 925 | N/A | 1350 | 1024MB GDDR5 | 128-bit |
| GeForce GTX 650 Ti BOOST | 768 | 980 | 1033 (Boost) | 1502 | 2048MB GDDR5 | 192-bit |
| Radeon HD 6970 | 1536 | 880 | N/A | 1375 | 2048MB GDDR5 | 256-bit |
| GeForce GTX 580 | 512 | 772 | 1544 (Shader) | 1002 | 1536MB GDDR5 | 384-bit |
| GeForce GTX 660 | 960 | 980 | 1033 (Boost) | 1502 | 2048MB GDDR5 | 192-bit |
| GeForce GTX 660 Ti | 1344 | 915 | 980 (Boost) | 1502 | 2048MB GDDR5 | 192-bit |
| Radeon HD 7970 | 2048 | 925 | N/A | 1375 | 3072MB GDDR5 | 384-bit |
| GeForce GTX 670 | 1344 | 915 | 980 (Boost) | 1502 | 2048MB GDDR5 | 256-bit |
  • AMD Radeon HD 7770 GHz (1000 MHz GPU/1125 MHz vRAM - AMD Catalyst 12.8)
  • NVIDIA GeForce GTX 650Ti (925 MHz GPU/1350 MHz vRAM - Forceware 306.38)
  • NVIDIA GeForce GTX 650Ti BOOST (980 MHz GPU/1033 MHz Boost/1502 MHz vRAM - Forceware 314.21)
  • AMD Radeon HD 6970 (880 MHz GPU/1375 MHz vRAM - AMD Catalyst 12.8)
  • NVIDIA GeForce GTX 580 (772 MHz GPU/1544 MHz Shader/1002 MHz vRAM - Forceware 304.38)
  • NVIDIA GeForce GTX 660 (980 MHz GPU/1033 MHz Boost/1502 MHz vRAM - Forceware 306.23)
  • ASUS GeForce GTX 660Ti (915 MHz GPU/980 MHz Boost/1502 MHz vRAM - Forceware 306.23)
  • AMD Radeon HD 7970 (925 MHz GPU/1375 MHz vRAM - AMD Catalyst 12.8)
  • NVIDIA GeForce GTX 670 (915 MHz GPU/980 MHz Boost/1502 MHz vRAM - Forceware 306.23)

DX10: Crysis Warhead

Crysis Warhead is an expansion pack based on the original Crysis video game. Crysis Warhead is based in the future, where an ancient alien spacecraft has been discovered beneath the Earth on an island east of the Philippines. Crysis Warhead uses a refined version of the CryENGINE2 graphics engine. Like Crysis, Warhead uses the Microsoft Direct3D 10 (DirectX-10) API for graphics rendering.

Benchmark Reviews uses the HOC Crysis Warhead benchmark tool to test and measure graphic performance using the Airfield 1 demo scene. This short test places a high amount of stress on a graphics card because of detailed terrain and textures, but also for the test settings used. Using the DirectX-10 test with Very High Quality settings, the Airfield 1 demo scene receives 4x anti-aliasing and 16x anisotropic filtering to create maximum graphic load and separate the products according to their performance.

Using the highest quality DirectX-10 settings with 4x AA and 16x AF, only the most powerful graphics cards are expected to perform well in our Crysis Warhead benchmark tests. DirectX-11 extensions are not supported in Crysis: Warhead, and SSAO is not an available option.

  • Crysis Warhead v1.1 with HOC Benchmark
    • Settings: Airfield Demo, Very High Quality, 4x AA, 16x AF

Crysis_Warhead_Benchmark.jpg

Crysis Warhead Benchmark Test Results


DX11: 3DMark11

Futuremark 3DMark11 is the latest addition to the 3DMark benchmark series built by Futuremark Corporation. 3DMark11 is a PC benchmark suite designed to test DirectX-11 graphics card performance without vendor preference. Although 3DMark11 includes the unbiased Bullet open-source physics library instead of NVIDIA PhysX for the CPU/physics tests, Benchmark Reviews concentrates on the four graphics-only tests in 3DMark11 and runs them with the medium-level 'Performance' preset.

The 'Performance' level setting applies 1x multi-sample anti-aliasing and trilinear texture filtering to a 1280x720p resolution. The tessellation detail, when called upon by a test, is preset to level 5, with a maximum tessellation factor of 10. The shadow map size is limited to 5 and the shadow cascade count is set to 4, while the surface shadow sample count is at the maximum value of 16. Ambient occlusion is enabled, and preset to a quality level of 5.

3DMark11-Performance-Test-Settings.png

  • Futuremark 3DMark11 Professional Edition
    • Settings: Performance Level Preset, 1280x720, 1x AA, Trilinear Filtering, Tessellation level 5

3dMark11_Performance_GT1-2_Benchmark.jpg

3dMark11_Performance_GT3-4_Benchmark.jpg

3DMark11 Benchmark Test Results


DX11: Aliens vs Predator

Aliens vs. Predator is a science fiction first-person shooter video game, developed by Rebellion, and published by Sega for Microsoft Windows, Sony PlayStation 3, and Microsoft Xbox 360. Aliens vs. Predator utilizes Rebellion's proprietary Asura game engine, which had previously found its way into Call of Duty: World at War and Rogue Warrior. The self-contained benchmark tool is used for our DirectX-11 tests, which push the Asura game engine to its limit.

In our benchmark tests, Aliens vs. Predator was configured to use the highest quality settings with 4x AA and 16x AF. DirectX-11 features such as Screen Space Ambient Occlusion (SSAO) and tessellation have also been included, along with advanced shadows.

  • Aliens vs Predator
    • Settings: Very High Quality, 4x AA, 16x AF, SSAO, Tessellation, Advanced Shadows

Aliens-vs-Predator_DX11_Benchmark.jpg

Aliens vs Predator Benchmark Test Results


DX11: Batman Arkham City

Batman: Arkham City is a third-person action game that adheres to the story line previously set forth in Batman: Arkham Asylum, which launched for game consoles and PC back in 2009. Based on an updated Unreal Engine 3 game engine, Batman: Arkham City enjoys DirectX 11 graphics, using multi-threaded rendering to produce life-like tessellation effects. While gaming console versions of Batman: Arkham City deliver high-definition graphics at either 720p or 1080i, you'll only get the highest-quality graphics and special effects on PC.

In an age when developers give game consoles priority over the PC, it's becoming difficult to find games that show off the stunning visual effects and lifelike quality possible from modern graphics cards. Fortunately, Batman: Arkham City does amazingly well on both platforms, while at the same time making it possible to cripple the most advanced graphics card on the planet by offering extremely demanding NVIDIA 32x CSAA and full PhysX capability. Also available to PC users (with NVIDIA graphics) is FXAA, a shader-based image filter that achieves similar results to MSAA yet requires less memory and processing power.

Batman: Arkham City offers varying levels of PhysX effects, each with its own set of hardware requirements. You can turn PhysX off, or enable the 'Normal' level, which introduces GPU-accelerated PhysX elements such as debris particles, volumetric smoke, and destructible environments into the game, while the 'High' setting adds real-time cloth and paper simulation. Particles exist everywhere in real life, and this PhysX effect is seen in many aspects of the game to add back that same sense of realism. PC gamers who are enthusiastic about graphics quality shouldn't skimp on PhysX: DirectX 11 makes it possible to enjoy many of these effects, and PhysX helps bring them to life in the game.

  • Batman: Arkham City
    • Settings: 8x AA, 16x AF, MVSS+HBAO, High Tessellation, Extreme Detail, PhysX Disabled

Batman-Arkham-City-Benchmark.jpg

Batman: Arkham City Benchmark Test Results


DX11: Battlefield 3

In Battlefield 3, players step into the role of the elite U.S. Marines. As the first boots on the ground, players will experience heart-pounding missions across diverse locations including Paris, Tehran, and New York. As a U.S. Marine in the field, periods of tension and anticipation are punctuated by moments of complete chaos. As bullets whiz by, walls crumble, and explosions force players to the ground, the battlefield feels more alive and interactive than ever before.

The graphics engine behind Battlefield 3 is called Frostbite 2, which delivers realistic global illumination lighting along with dynamic destructible environments. The game uses a hardware terrain tessellation method that allows a high number of detailed triangles to be rendered entirely on the GPU when near the terrain. This allows for a very low memory footprint and relies on the GPU alone to expand the low-resolution data into highly realistic detail.

Using Fraps to record frame rates, our Battlefield 3 benchmark test uses a three-minute capture on the 'Secure Parking Lot' stage of Operation Swordbreaker. Relative to the online multiplayer action, these frame rate results are nearly identical to daytime maps with the same video settings.

  • BattleField 3
    • Settings: Ultra Graphics Quality, FOV 90, 180-second Fraps Scene

Battlefield-3_Benchmark.jpg

Battlefield 3 Benchmark Test Results


DX11: Lost Planet 2

Lost Planet 2 is the second installment in the saga of the planet E.D.N. III, ten years after the story of Lost Planet: Extreme Condition. The snow has melted and the lush jungle life of the planet has emerged with angry and luscious flora and fauna. With the new environment comes the addition of DirectX-11 technology to the game.

Lost Planet 2 takes advantage of DX11 features including tessellation and displacement mapping on water, level bosses, and player characters. In addition, soft body compute shaders are used on 'Boss' characters, and wave simulation is performed using DirectCompute. These cutting edge features make for an excellent benchmark for top-of-the-line consumer GPUs.

The Lost Planet 2 benchmark offers two different tests, which serve different purposes. This article uses tests conducted on benchmark B, which is designed to be a deterministic and effective benchmark tool featuring DirectX 11 elements.

  • Lost Planet 2 Benchmark 1.0
    • Settings: Benchmark B, 4x AA, Blur Off, High Shadow Detail, High Texture, High Render, High DirectX 11 Features

Lost-Planet-2_DX11_Benchmark.jpg

Lost Planet 2 Benchmark Test Results


DX11: Metro 2033

Metro 2033 is an action-oriented video game with a combination of survival horror, and first-person shooter elements. The game is based on the novel Metro 2033 by Russian author Dmitry Glukhovsky. It was developed by 4A Games in Ukraine and released in March 2010 for Microsoft Windows. Metro 2033 uses the 4A game engine, developed by 4A Games. The 4A Engine supports DirectX-9, 10, and 11, along with NVIDIA PhysX and GeForce 3D Vision.

The 4A engine is multi-threaded, such that only PhysX has a dedicated thread; it uses a task model without any pre-conditioning or pre/post-synchronizing, allowing tasks to be done in parallel. The 4A game engine can utilize a deferred shading pipeline and uses tessellation for greater performance; it also offers HDR (complete with blue shift), real-time reflections, color correction, film grain and noise, and supports multi-core rendering.

Metro 2033 features superior volumetric fog, double PhysX precision, object blur, sub-surface scattering for skin shaders, parallax mapping on all surfaces, and greater geometric detail with less aggressive LODs. Using PhysX, the engine supports destructible environments, cloth and water simulations, and particles that can be fully affected by environmental factors.

NVIDIA has been diligently working to promote Metro 2033, and for good reason: it's one of the most demanding PC video games we've ever tested. When their flagship GeForce GTX 480 struggles to produce 27 FPS with DirectX-11 anti-aliasing turned down to its lowest setting, you know that only the strongest graphics processors will generate playable frame rates. All of our tests enable Advanced Depth of Field and tessellation effects, but disable advanced PhysX options.

  • Metro 2033 Benchmark
    • Settings: Very-High Quality, 4x AA, 16x AF, Tessellation, PhysX Disabled

Metro-2033_DX11_Benchmark.jpg

Metro 2033 Benchmark Test Results


DX11: Unigine Heaven 3.0

The Unigine Heaven benchmark is a free, publicly available tool that exercises DirectX-11 graphics capabilities on Windows 7 or updated Vista operating systems. It renders a scene of floating islands with a tiny village hidden in the cloudy skies, and an interactive mode lets users explore the intricate world at their own pace. Through its advanced renderer, Unigine was one of the first to showcase art assets with tessellation, utilizing the technology to its full extent and exhibiting the possibilities of enriched 3D gaming.

The distinguishing feature of the Unigine Heaven benchmark is hardware tessellation: a scalable technology for the automatic subdivision of polygons into smaller and finer pieces, letting developers give their games a more detailed look almost free of charge in terms of performance. Thanks to this procedure, the rendered image approaches the boundary of convincing visual realism.

Since only DX11-compliant video cards will properly test on the Heaven benchmark, only those products that meet the requirements have been included.

  • Unigine Heaven Benchmark 3.0
    • Settings: DirectX 11, High Quality, Extreme Tessellation, 16x AF, 4x AA

Unigine_Heaven_DX11_Benchmark.jpg

Heaven Benchmark Test Results


VGA Power Consumption

In this section, PCI-Express graphics cards are isolated for idle and loaded electrical power consumption. In our power consumption tests, Benchmark Reviews utilizes an 80-PLUS GOLD certified OCZ Z-Series Gold 850W PSU, model OCZZ850. This power supply unit has been tested to provide over 90% typical efficiency by Chroma System Solutions. To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4460) power meter made by P3 International. In this particular test, all power consumption results were verified with a second power meter for accuracy.

The power consumption statistics discussed in this section are absolute maximum values, and may not represent real-world power consumption created by video games or graphics applications.

A baseline measurement is taken without any video card installed on our test computer system, which is allowed to boot into Windows 7 and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen before taking the idle reading. Our final loaded power consumption reading is taken with the video card running a stress test using graphics test #4 on 3DMark11. Below is a chart with the isolated video card power consumption (system without video card subtracted from measured combined total) displayed in Watts for each specified test product:
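The baseline-subtraction step described above can be sketched in a few lines of Python (the meter readings here are hypothetical stand-ins chosen to match the 8 W idle and 164 W loaded figures reported for the GTX 650 Ti BOOST; actual baseline values vary by test system):

```python
def isolated_card_power(total_watts, baseline_watts):
    """Isolate the video card's draw by subtracting the no-card
    baseline reading from the measured combined total."""
    return total_watts - baseline_watts

# Hypothetical Kill-A-Watt readings (watts), for illustration only:
baseline = 98        # system booted to login screen, no video card installed
idle_total = 106     # card installed, system idle at the login screen
loaded_total = 262   # card running 3DMark11 graphics test #4

print(isolated_card_power(idle_total, baseline))    # 8
print(isolated_card_power(loaded_total, baseline))  # 164
```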

Video Card Power Consumption by Benchmark Reviews

| VGA Product Description (sorted by combined total power) | Idle Power | Loaded Power |
|---|---|---|
| NVIDIA GeForce GTX 480 SLI Set | 82 W | 655 W |
| NVIDIA GeForce GTX 590 Reference Design | 53 W | 396 W |
| ATI Radeon HD 4870 X2 Reference Design | 100 W | 320 W |
| AMD Radeon HD 6990 Reference Design | 46 W | 350 W |
| NVIDIA GeForce GTX 295 Reference Design | 74 W | 302 W |
| ASUS GeForce GTX 480 Reference Design | 39 W | 315 W |
| ATI Radeon HD 5970 Reference Design | 48 W | 299 W |
| NVIDIA GeForce GTX 690 Reference Design | 25 W | 321 W |
| ATI Radeon HD 4850 CrossFireX Set | 123 W | 210 W |
| ATI Radeon HD 4890 Reference Design | 65 W | 268 W |
| AMD Radeon HD 7970 Reference Design | 21 W | 311 W |
| NVIDIA GeForce GTX 470 Reference Design | 42 W | 278 W |
| NVIDIA GeForce GTX 580 Reference Design | 31 W | 246 W |
| NVIDIA GeForce GTX 570 Reference Design | 31 W | 241 W |
| ATI Radeon HD 5870 Reference Design | 25 W | 240 W |
| ATI Radeon HD 6970 Reference Design | 24 W | 233 W |
| NVIDIA GeForce GTX 465 Reference Design | 36 W | 219 W |
| NVIDIA GeForce GTX 680 Reference Design | 14 W | 243 W |
| Sapphire Radeon HD 4850 X2 11139-00-40R | 73 W | 180 W |
| NVIDIA GeForce 9800 GX2 Reference Design | 85 W | 186 W |
| NVIDIA GeForce GTX 780 Reference Design | 10 W | 275 W |
| NVIDIA GeForce GTX 770 Reference Design | 9 W | 256 W |
| NVIDIA GeForce GTX 280 Reference Design | 35 W | 225 W |
| NVIDIA GeForce GTX 260 (216) Reference Design | 42 W | 203 W |
| ATI Radeon HD 4870 Reference Design | 58 W | 166 W |
| NVIDIA GeForce GTX 560 Ti Reference Design | 17 W | 199 W |
| NVIDIA GeForce GTX 460 Reference Design | 18 W | 167 W |
| AMD Radeon HD 6870 Reference Design | 20 W | 162 W |
| NVIDIA GeForce GTX 670 Reference Design | 14 W | 167 W |
| ATI Radeon HD 5850 Reference Design | 24 W | 157 W |
| NVIDIA GeForce GTX 650 Ti BOOST Reference Design | 8 W | 164 W |
| AMD Radeon HD 6850 Reference Design | 20 W | 139 W |
| NVIDIA GeForce 8800 GT Reference Design | 31 W | 133 W |
| ATI Radeon HD 4770 RV740 GDDR5 Reference Design | 37 W | 120 W |
| ATI Radeon HD 5770 Reference Design | 16 W | 122 W |
| NVIDIA GeForce GTS 450 Reference Design | 22 W | 115 W |
| NVIDIA GeForce GTX 650 Ti Reference Design | 12 W | 112 W |
| ATI Radeon HD 4670 Reference Design | 9 W | 70 W |

* Results are accurate to within +/- 5W.

This section discusses power consumption for the NVIDIA GeForce GTX 650 Ti BOOST video card, which operated at reference clock speeds. Our power consumption results are not representative of the entire GTX 650 Ti BOOST product family, since some partners may offer modified designs. GeForce GTX 650 Ti BOOST requires a single 6-pin PCI-E power connection for normal operation, and will not activate the display unless proper power has been supplied. NVIDIA recommends a 450W power supply unit for stable operation with the GTX 650 Ti BOOST.

NVIDIA-GeForce-GTX-650-Ti-BOOST-Power.jpg

In our test results the GeForce GTX 650 Ti BOOST consumed a mere 8W at the lowest idle reading, and 164W under full load; NVIDIA specifies an average TDP of 136W. This positions the GTX 650 Ti BOOST among the least power-hungry video cards we've ever tested under load, which is all the more impressive coming from a GTX-series product. If you're familiar with electronics, it will come as no surprise that lower power consumption means less heat output, as evidenced by our thermal results below...
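As a quick sanity check on those two figures, a short sketch compares the measured peak against NVIDIA's specified average TDP (both numbers come from this review; the gap simply reflects a synthetic peak load versus an average specification):

```python
# Measured peak draw vs. NVIDIA's specified average TDP (figures from this review).
measured_peak_w = 164   # isolated card draw under the 3DMark11 load test
specified_tdp_w = 136   # NVIDIA's stated average TDP for the GTX 650 Ti BOOST

ratio = measured_peak_w / specified_tdp_w
print(f"Peak draw is {ratio:.2f}x the average TDP "
      f"({(ratio - 1) * 100:.0f}% above spec under a synthetic load).")
```

A peak reading above the average TDP is expected here, since TDP describes sustained typical draw rather than a worst-case synthetic workload.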

GeForce GTX 650 Ti BOOST Temperatures

This section reports our temperature results after subjecting the video card to maximum load conditions. During each test a 20°C ambient room temperature is maintained from start to finish, as measured by digital temperature sensors located outside the computer system. GPU-Z is used to record the temperature reported by the GPU, both at idle and under load.

Using a modified version of FurMark's "Torture Test" to generate maximum thermal load, we record peak GPU temperature in high-power 3D mode. FurMark does two things extremely well: it drives the thermal output of any graphics processor much higher than any video game realistically could, and it does so consistently every time. FurMark is therefore great for testing GPU stability as temperatures rise toward their highest possible output.
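The peak-reading procedure can be sketched as a small polling loop. This is a hedged illustration only: `sample_fn` stands in for a real sensor reader (GPU-Z logging, `nvidia-smi`, or similar), not the actual tooling used for this review.

```python
# Minimal peak-temperature logger sketching the procedure described above.
# `sample_fn` is a stand-in for a real sensor reader (e.g. GPU-Z logging or
# an nvidia-smi temperature query); any callable returning degrees C works.
def peak_temperature(sample_fn, samples):
    """Poll the sensor `samples` times and return the highest reading seen."""
    return max(sample_fn() for _ in range(samples))

# Simulated FurMark warm-up curve (hypothetical readings, not review data):
readings = iter([26, 41, 55, 64, 70, 72, 73, 73])
print(peak_temperature(lambda: next(readings), 8))  # -> 73
```

Logging the maximum rather than the final sample matters because fan controllers can pull temperatures back down after the initial ramp.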

The temperatures illustrated below are absolute maximum values, and do not represent real-world temperatures created by video games or graphics applications:

Video Card Ambient Idle Temp Loaded Temp Max Noise
ATI Radeon HD 5850 20°C 39°C 73°C 7/10
NVIDIA GeForce GTX 460 20°C 26°C 65°C 4/10
AMD Radeon HD 6850 20°C 42°C 77°C 7/10
AMD Radeon HD 6870 20°C 39°C 74°C 6/10
ATI Radeon HD 5870 20°C 33°C 78°C 7/10
NVIDIA GeForce GTX 560 Ti 20°C 27°C 78°C 5/10
NVIDIA GeForce GTX 570 20°C 32°C 82°C 7/10
ATI Radeon HD 6970 20°C 35°C 81°C 6/10
NVIDIA GeForce GTX 580 20°C 32°C 70°C 6/10
NVIDIA GeForce GTX 590 20°C 33°C 77°C 6/10
AMD Radeon HD 6990 20°C 40°C 84°C 8/10
NVIDIA GeForce GTX 650 Ti BOOST 20°C 26°C 73°C 4/10
NVIDIA GeForce GTX 650 Ti 20°C 26°C 62°C 3/10
NVIDIA GeForce GTX 670 20°C 26°C 71°C 3/10
NVIDIA GeForce GTX 680 20°C 26°C 75°C 3/10
NVIDIA GeForce GTX 690 20°C 30°C 81°C 4/10

As we've mentioned on the pages leading up to this section, NVIDIA's Kepler architecture yields a much more efficient GPU than previous designs. This becomes evident in the low idle temperature, and translates into modest full-load temperatures. While NVIDIA's reference design works exceptionally well at cooling the GK106 GPU, consumers should expect add-in card partners to market excessively over-cooled versions at a premium. 73°C after ten minutes at 100% load is modest, and nowhere close to this card's 98°C thermal threshold.

GeForce GTX 650 Ti BOOST Conclusion

IMPORTANT: Although the rating and final score mentioned in this conclusion are made to be as objective as possible, please be advised that every author perceives these factors differently at various points in time. While we each do our best to ensure that all aspects of the product are considered, there are oftentimes unforeseen market conditions and manufacturer changes which occur after publication that could render our rating obsolete. Please do not base any purchase solely on our conclusion, as it represents our product rating specifically for the product tested, which may differ from future versions of the same product. Benchmark Reviews begins our conclusion with a short summary for each of the areas that we rate.

First and foremost is our performance rating. While the prices might not reflect it, the GeForce GTX 650 Ti BOOST is poised to replace the aging GeForce GTX 570, since the two performed so similarly. The AMD Radeon HD 7770 GHz Edition was easily outperformed in every single benchmark test, so the closest competing product turned out to be the AMD Radeon HD 6970; the Radeon HD 7850 would likely match up well, too.

In the DirectX 10 game Crysis Warhead, the GeForce GTX 650 Ti BOOST easily surpassed the AMD Radeon HD 7770 GHz Edition, and matched the Radeon HD 6970 overall. DirectX 11 test results continued to keep the GeForce GTX 650 Ti BOOST consistently ahead of its competition in almost all tests. Ultra-demanding DX11 games such as Batman: Arkham Asylum made use of Kepler's optimized architecture, helping to deliver a very playable 59-FPS. Battlefield 3 gave the GeForce GTX 650 Ti BOOST a 5-FPS lead over the Radeon HD 6970, and enabled Ultra quality settings. Lost Planet 2 played well on all graphics cards when set to high quality with 4x AA, allowing GTX 650 Ti BOOST to maintain a 48-FPS frame rate. In Aliens vs Predator the GeForce GTX 650 Ti BOOST actually began to reach GTX 660 performance levels. Metro 2033 is another demanding game that requires high-end graphics to enjoy high quality visual settings, and although this benchmark favors Radeon products the GTX 650 Ti BOOST still kept up a 30-FPS frame rate at 1920x1080.

Synthetic benchmark tools offer an unbiased read on graphics products, measuring performance without game-specific optimizations or driver influence. Futuremark's 3DMark11 benchmark suite strained our high-end graphics cards with only mid-level settings displayed at 720p, and the GTX 650 Ti BOOST produced FPS results higher than AMD's Radeon HD 6970. Unigine Heaven 3.0 benchmark tests used maximum settings that strained the GTX 650 Ti BOOST's 192-bit memory bandwidth, yet the card still compared well to the 6970.

NVIDIA-GeForce-GTX-650-Ti-BOOST-Top.jpg

Appearance is a much more subjective matter, especially since this particular rating doesn't have any quantitative benchmark scores to fall back on. NVIDIA's GeForce GTX series has traditionally used a recognizable design over the past two years, and with the exception of more angular corners, the GeForce GTX 650 Ti BOOST looks very similar to the GTX 570 and 670 models. Because the GeForce GTX 650 Ti BOOST operates so efficiently, and allows nearly all of the heated air to exhaust outside of the computer case, the reference design does an excellent job in terms of function. While fashionable looks might mean a lot to some consumers, keep in mind that this product outperforms the competition while generating much less heat and producing very little noise.

Construction is the one area in which NVIDIA graphics cards continually shine, and thanks in part to extremely quiet operation paired with more efficient cores that consume less energy and emit less heat, the 650 Ti BOOST continues this tradition. Requiring a single 6-pin PCI-E power connection helps ensure this video card remains compatible with mainstream power supply units, while tweaked heatsink and fan placement that optimizes cooling performance proves there are still ways to improve on a commonplace technology. GeForce GTX 650 Ti BOOST has one of the shortest PCBs we've seen from a GTX-series model, which further reduces heat output and makes this a product suitable for more robust HTPC applications. Better yet, consumers now have a single-GPU solution capable of driving three monitors in 3D Vision Surround, thanks to two DL-DVI ports with supplementary HDMI and DisplayPort output.

As of launch day, the NVIDIA GeForce GTX 650 Ti BOOST video card sells for $169.99 (Amazon | Newegg). Please keep in mind that hardware manufacturers and retailers are constantly adjusting prices, so expect them to change a few times in the month ahead. There's still plenty of value beyond basic frame rate performance, and the added NVIDIA Kepler features push it off the charts. Only NVIDIA video cards offer automated GPU Boost technology, 3D Vision, Adaptive VSync, PhysX technology, FXAA, and now TXAA.

In summary, the NVIDIA GTX 650 Ti BOOST video card targets mainstream gamers with a solid-performing graphics solution that offers 2GB of GDDR5 memory and very fast clock speeds combined with GPU Boost technology. Ideally, this could become the go-to graphics card for under $170, giving enthusiasts enough overclocking headroom to push it into GTX 660 territory while retaining the efficient temperatures and power consumption of the GTX 650 series. Since SLI is supported, adding a second video card once prices settle could effectively double performance for under $300. For mainstream gamers wanting to upgrade an aging, hot-running, power-hungry graphics card, the GeForce GTX 650 Ti BOOST is a value-packed option worth the money. Additionally, a 1GB GDDR5 version will debut next month for around $149. Benchmark Reviews recommends the GTX 650 Ti BOOST graphics card, and predicts the series will earn popularity among gamers.

Pros:

+ Outperforms the AMD Radeon HD 6970 and GeForce GTX 570
+ Kepler GPU enables 3D Vision and PhysX functionality
+ Excellent performance with DX11 video games
+ Supports NVIDIA GPU Boost technology, Adaptive VSync, and TXAA
+ Short profile fits into standard size computer cases
+ Triple-display and 3D Vision Surround support
+ Cooling fan operates at very quiet acoustic levels
+ Features DisplayPort connectivity for future monitor technology
+ Very low power consumption and heat output
+ Upgradable into dual-card SLI set

Cons:

- No notable cons.

COMMENT QUESTION: Will the GeForce GTX 650 Ti BOOST graphics card series become as popular as we predict?


Comments 

 
# Compare Coming soon...?WhyNot 2013-03-26 05:38
Will this testing information be used if a future review of the AMD Radeon HD 7790 is given? I realize that the 7790 is half the GGDR5 of the GTX 650 Ti, but given they share a similar price point it would be interesting to see if the extra 1GB GGDR5 (as well as any other differences in power consumption, temp, etc.) is worth the extra $20-30. Thanks.
 
 
# RE: Compare Coming soon...?Olin Coles 2013-03-26 07:33
Each writer has their own set of graphics cards and computer system to test with, so nobody else would be using my results to test products they receive. That being said, our test sample is supposedly 'delayed' because of last-minute fixes in the design. Hopefully we'll have something soon for comparison.
 
 
# RE: RE: Compare Coming soon...?WhyNot 2013-03-28 03:47
Thanks Olin. I look forward to that review as well.
 
 
# Really...Kirk Martin 2013-03-26 05:44
I do not think I have read so much crap in a review before... AMD has disappeared from the Scene! Wow I never knew this site had so many ANTI AMD fans on board.

I think you best read on here... ##engadget.com/2013/03/22/amd-radeon-hd-7790/

If you are going to continue to produce sub standard articles and post them, looks like I am going to have to go somewhere else for decent reviews.
 
 
# RE: Really...cosminmcm 2013-03-26 06:00
Go in peace!
Try semiaccurate, you will feel at home.
 
 
# RE: Really...Steven Iglesias-Hearst 2013-04-01 13:19
Read some of our AMD Video Card reviews and tell us again that we favor NVIDIA...
 
 
# RE: NVIDIA GeForce GTX 650 Ti BOOSTKirk Martin 2013-03-26 05:59
To everyone wanting to know how fast this card is to the NEW AMD HD 7790, This new Nvidia card is 20% slower avg in all games tested here.

##engadget.com/2013/03/22/amd-radeon-hd-7790/

The AMD card is also cheaper and has cross fire.

Biased reviews do no one any favors, they give out the wrong information and damage the market in the long run.

I cannot believe this review was allowed to go live, its pretty disgusting and low.
 
 
# Uh, noBrett 2013-03-26 06:40
There are a couple of problems with your post there.

First, you're quoting an announcement from AMD there, which is therefore not remotely unbiased.

Second, and more importantly, AMD are claiming their card to be 20% faster that the 650 Ti at 1080p, not the 650 Ti BOOST. If you want to assume AMD is actually correct, then compare accordingly. In this article the Ti BOOST being reviewed appears to be substantially more than 20% faster than the base Ti model.

As for his "AMD has all but disappeared from the scene" quote, yeah, that's certainly worthy of criticism.
 
 
# RE: RE: NVIDIA GeForce GTX 650 Ti BOOSTOlin Coles 2013-03-26 07:38
Congratulations! The article you've referenced compares the 7790 to the non-BOOST version of this card, which costs $30 less and has half the memory with much lower clock speeds... something I went into great detail explaining in this very article. Kudos for glossing over the very first page.
 
 
# MrAlan Wakefield 2013-03-27 03:00
"AMD has all but disappeared from the scene"

Really? Really?

As a graphics card review site, you have lost my trust.
 
 
# RE: MrOlin Coles 2013-03-27 07:36
Did someone just make a statement about AMD that you didn't like? Oh NO! You had better go find a website that says only the things you want to hear about your favorite company, because that's what builds trust! Or... take a look at the share price of AMD stock over the past year, then take a look at the success (or lack of) their product launches in that period, and finish it all off with a look at their future road map. Make sure to come back and rant once you're done, so that we have a written record of your delusion.
 
 
# MrAlan Wakefield 2013-03-28 06:35
Just for the record, I have owned both Nvidia & AMD cards.

I'm looking for Best performance/value, no matter who makes that card. In other words 'none biased'. Something that a Card reviewing site should definitely be, otherwise their results/benchmarks will always be 'suspect'.

For you to say "AMD has all but disappeared from the scene" (especially them having recently released new cards) is arguably one of the most ridiculous things I've heard & only comes across has showing up your own bias towards Nvidia..
 
 
# RE: MrOlin Coles 2013-03-28 09:02
Just for the record, AMD hasn't offered a new GPU design in quite some time. Adjusting clock speeds and memory amounts on an existing platform does not make it a new platform, merely a new version of an old product.
 
 
# mrAlan Wakefield 2013-03-28 10:32
So what if it is merely a new version of an old product, if it beats the competition on performance/value, it's getting my money.

Did it ever occur to you, that if a company has a product ahead of the competition, it would make little sense to bring out a new platform.....just sayin!

Again I must stress, I'm not a fanboi, I came here to read up on the 650 boost with the possibility of this being my next card. But I do expect you to acknowledge the competition.
 
 
# RE: mrOlin Coles 2013-03-28 10:45
I included benchmark results for all of the AMD video cards we've received. Since they've become more reluctant to sample products like they used to, we don't get as many samples. Still, I think that I did more than enough to 'acknowledge the competition'.
 
 
# RE: RE: mrAlan Wakefield 2013-03-28 11:11
So begs the question... why include AMD cards in the benchmarks, if, as you put it "AMD has all but disappeared from the scene"

It wouldn't be because they are a viable alternative, & haven't left the scene at all, would it?

Anyhow I'm not here to argue, it was an interesting article (that statement aside) so good luck in future articles & keep an 'open mind'.

cheers
 
 
# RE: RE: RE: mrOlin Coles 2013-03-28 11:13
Do you know the difference between "has disappeared from the scene" and "has all but disappeared from the scene"?
 
 
# The Competition for the Boost...Tangldweb 2013-04-02 17:25
...would be the 7850 from all the few reveiws out on this awesome card beings how new it is. And now, being the Proud Owner of the 650ti Boost in my new HP h9 i7 computer, I can say words like Ultra, FXAA, AND 2x MSAA in the same sentence,.. and be talkin $150 for 2g. And though this was an Ebay deal, the price will come down even more in the near future. As for Value, PLEASE,.. DO NOT READ the other reveiw I mentioned here if the 7790 is in your future because it Will be AMD. "A"m "M"ighty "D"isappointed.
from hardocp,...
Price - On price, the new NVIDIA GeForce GTX 650 Ti Boost has the advantage, unless you can find AMD video cards on sale or with rebates. The GTX 650 Ti Boost 1GB models will be sold for $149, the 2GB models will be sold for $169. Currently, the base price on Radeon HD 7850's are at $179 for the 1GB models and $185 for the 2GB models, and upwards above $200. Unless cards are offered with a rebate, the GTX 650 Ti Boost has the advantage. Compared to the Radeon HD 7790, the price is exactly the same for 1GB models at $149. However, the performance is certainly not, as explained below.

Performance - On performance, the new NVIDIA GeForce GTX 650 Ti Boost offers the same gameplay experience as the Radeon HD 7850. We found that the gameplay experience was the same in every game, but the performance itself was actually slightly faster with the GTX 650 Ti Boost. Compared to the new Radeon HD 7790, the gameplay experience was superior on the GeForce GTX 650 Ti in every single game. Plus, the performance itself just smashed the HD 7790.

Value Summary - The new GeForce GTX 650 Ti Boost competes not with the Radeon HD 7790, which was just announced days ago, but in fact it competes with AMD's next model up, the Radeon HD 7850. However, the GTX 650 Ti is priced at the Radeon HD 7790's level. Therefore what you get is Radeon HD 7850-like experience, for less money. The clear value is the new GeForce GTX 650 Ti Boost.
Sounds like Olin isn't the only one who is in nVidia's corner with this one,.. I know I am. Off to Battlefeild 3 !!! Hav Fun !!!
 
 
# RE: The Competition for the Boost...Kirk Martin 2013-04-03 12:30
I hope you enjoy it, however I do prefer AMD cards because they work with all Monitors and have no issues on setup... I tried to connect a old Goodmans X Pro monitor to a customers machine, just so he could use it until his new monitor came. However no matter what I did, the nvidia card would not apply the correct refresh rate required by the monitor. I fitted an AMD card, it worked first time, no issues with setup.

You can stick with Nvidia, and you can keep the abuse you give out, you have proven with your responses that you cannot handle any criticism about reviews that you have written. It is a big shame... I have seen your reviews for AMD hardware in the past.


Also makes me wonder if you should be hosting a review web site!
 
 
# To Quote Mr. Martin...Tangldweb 2013-04-04 18:38
... I think you best read on here, #hardocp.com/article/2013/03/26/nvidia_geforce_gtx_6 50_ti_boost_video_ca rd_review/1 At this point I'm wondering why someone so happy with their chosen product would be even looking at the comptition. Is it because you were hoping for a favorable outcome your way??? Well other than making personal assaults on the people here, maybe I can see a review of yours. Which card do you own in this review? How many other reviews have you read on the 650boot? How extensive is your background of nVidia? Kinda have to know "How" they work to "Make" them work Right. As well, I'm sure your AMD power hungry, room heating computers work well with AMD gpu's. My HP i7 with 650boost ran fine on my RCA 35inch CRT for setup before going in the Man Cave to the 55inch Plasma,.. and until you mentioned compatability here, it never entered my mind. Hey, your team lost this match. And though from my view they lost pretty badly, don't take it out on the wife and kids,.. or the Dog for that matter. There's always tomorrow.
 
 
# And I'll man up...Tangldweb 2013-04-04 19:10
...and tell you one Big Problem you may find with an i7 system and a 650boost,.. getting to use it. Really!!! I'm on the laptop with the Xbox while the family is watching "How to train your Dragon". We DO have a Blueray player people. LOL
 
 
# Love itJustin Ashburn 2013-03-27 19:46
Love the review, love the site and most of all, I love your comments Olin. Nice to see someone be true to themselves and stick up for their work instead of backing down to try to please every visitor.
 
 
# Are you people Slow,..Tangldweb 2013-03-29 17:29
...or just not understand that this is a Test !!! And though AMD was not the prossesor used either by the way, you test with things in that particular items area, even if it is an old "Inferior" product. As well, the ti-Boost is so new, that a dozen different drivers have no doubt already come out in just the time I've been typing let alone the fact we havent even Seen the retail mutations! And not to dis this site At All, if you don't like the opinion of this reveiw, try this one I read just before reading one,( #hardocp.com/article/2013/03/26/nvidia_geforce_gtx_6 50_ti_boost_video_ca rd_review/1 ) .. which pretty much has the same opinion,.. Smack Down of AMD's cards,.. And nVidia's Too!!! Truth Hurt's and information Rule's, and all the reveiws available show the same trend. And I'll bet money, (do you except US currency :), you just watch, this card will be the one the majority of cards in it's class are tested against,.. and you can hold the Cash!!! Cheers !!!
 
 
# Can Console Gaming Save AMD From Collapse?Olin Coles 2013-04-09 14:16
I've written an editorial that AMD fan boys might find enlightening:
/index.php?option=com_content&task=view&id=22572

Let me know if you still think I'm biased after reading it. :)
 

Comments have been disabled by the administrator.
