ASUS ENGTX560 Ti DCII TOP Video Card
Reviews - Featured Reviews: Video Cards
Written by Servando Silva
Monday, 14 February 2011
ASUS ENGTX560 Ti DirectCU II TOP Review
Manufacturer: ASUSTeK Computer Inc.
Full Disclosure: The product sample used in this article has been provided by ASUS.

The GeForce GTX 560 Ti is the new boom in the mid-range sector. It replaces the highly overclockable GTX 460 and adds extra MHz and cores, increasing performance by 20-40% in many games. As expected, brands like ASUS, GIGABYTE, and MSI pair this GPU with a special heatsink/cooler and raise factory frequencies to sell it at a higher price. Benchmark Reviews tests the ASUS ENGTX560 Ti DCII TOP. DCII means this card comes with an improved version of the DirectCU cooler, now with two fans and added dissipation area. ASUS also tags this product as TOP because it sports a 900MHz GPU core instead of the 822MHz reference clock, and a 1050MHz memory clock instead of the stock 1000MHz. Beyond that, ASUS bundles its own overclocking/monitoring software to modify frequency and voltage values and add some extra performance for free. Let's have a look at the ASUS ENGTX560 Ti and see whether it's a good contender against the competition, including the highly acclaimed GeForce GTX 460.

It's difficult to say whether the GTX 560 Ti is good enough to replace the GTX 460, partly because of the conditions surrounding this launch. The GTX 460 is an excellent GPU, but the fact that the rest of the Fermi GTX 4x0 family was highly disappointing in terms of heat and power consumption helped push the GTX 460 to where it's currently positioned. It achieved great performance levels, surpassing the HD 5850 on many occasions, especially with tessellation enabled, and its superb overclocking headroom could turn it into a GPU faster than the HD 5870. SLI vs. CrossFireX scaling was also a big deal: many users preferred to buy a pair of GTX 460s and run them in SLI rather than pay more for a GTX 480 or a CrossFireX array of HD 5870s.
Right now the conditions are a little different. AMD launched its HD 6800 series, which competes closely both below and above the GTX 460. AMD also launched the 6900 series before the GTX 560 Ti, and even launched the HD 6950 1GB edition the same day NVIDIA announced the GTX 560 Ti. Even so, at a $249 MSRP the GTX 560 Ti performs better than the HD 6870, which now sits at $219, and competes very closely with the HD 6950 1GB Radeon video card ($270). The ENGTX560 Ti DCII TOP offers a modest 9% factory overclock, but I'm guessing there's more where that came from. Benchmark Reviews takes full advantage of ASUS hardware and software upgrades as we explore the full potential of NVIDIA's latest midrange marvel on our test bed.

NVIDIA GeForce GTX 560 Ti Features

NVIDIA's GeForce GTX 560 Ti introduces the new GF114 GPU, which is largely based on the GF104 Fermi chip that drove the GTX 460 to great success. The differences are fourfold: full utilization of the die (no disabled cores), architecture improvements, more widespread use of low-leakage transistors, and layout changes based on signal-traffic analysis. While the GF104 enabled only seven of eight possible Streaming Multiprocessors (SMs), the GF114 is able to use that last SM to make even more cores available: a total of 384, compared to 336 in the GTX 460. Each SM still offers 48 CUDA cores, four dispatch units, and eight texture/special-function units. The architecture improvements are the addition of full-speed FP16 texture filtering and new tile formats that improve Z-cull efficiency. These enhancements alone offer performance improvements ranging from 5% on tessellation-heavy benchmarks like Heaven 2.1 to a whopping 14% gain in 3DMark Vantage, where it's all about shader power. The last two improvements go hand in hand to improve both power usage and the maximum clock rates the GPU can support.
Low-leakage transistors run cooler, use less power, and can be driven faster due to their lower gate capacitance. NVIDIA increased the use of this more expensive device type primarily to reduce power consumption, but also to gain some overclocking headroom. NVIDIA also looked at signal flow across the various sections of the GPU and did some rearranging to shorten signal paths in the high-traffic areas. It's not that the GTX 460 was particularly bad in this regard, but the luxury of a second chance yielded some improvements. Taken together, these changes let NVIDIA increase the base core clock from 675 MHz to 822 MHz, a whopping 22% increase that supposedly doesn't eat into any overclocking headroom. We'll test that supposition later in the review. As for the rest of the capabilities of this very advanced graphics card, here is the complete list of GPU features, as supplied by NVIDIA:

NVIDIA GeForce GTX 5xx GPU Feature Summary:

3D Graphics
GPU Computing
NVIDIA Technology
GPU Interfaces
Advanced Display Functionality
Video
Digital Audio
Power Management Technology
NVIDIA GeForce GTX 560 Ti GPU Detail Specifications

GPU Engine Specs:
Memory Specs:
Display Support:
Graphics card Dimensions:
Thermal and Power Specs:
Source: NVIDIA.com

Closer Look: ASUS GTX 560 Ti

As usual, ASUS packages its GPUs in a nice box with a warrior on it. Technologies, features, and bundles are displayed on the front and back of the package. ASUS claims to use Super Alloy Power technology, a special alloy formula in critical power-delivery components, for a 15% performance boost, 35°C cooler operation, and a 2.5-times longer lifespan. It's hard not to notice the 900MHz GPU core clock, as it's printed as big as the ASUS logo.
Graphics Card | GeForce GTX460 | Radeon HD6850 | Radeon HD6870 | GeForce GTX560 Ti |
---|---|---|---|---|
GPU Cores | 336 | 960 | 1120 | 384 |
Core Clock (MHz) | 675 | 775 | 900 | 900 |
Shader Clock (MHz) | 1350 | N/A | N/A | 1800 |
Memory Clock (MHz) | 900 | 1000 | 1050 | 1050 |
Memory Amount | 1024MB GDDR5 | 1024MB GDDR5 | 1024MB GDDR5 | 1024MB GDDR5 |
Memory Interface | 256-bit | 256-bit | 256-bit | 256-bit |
3DMark Vantage Performance Tests
3DMark Vantage is a PC benchmark suite designed to test DirectX-10 graphics card performance. FutureMark 3DMark Vantage is the latest addition to the 3DMark benchmark series built by Futuremark Corporation. Although 3DMark Vantage requires NVIDIA PhysX to be installed for program operation, only the CPU/Physics test relies on this technology.
3DMark Vantage offers benchmark tests focusing on GPU, CPU, and Physics performance. Benchmark Reviews uses the two GPU-specific tests for grading video card performance: Jane Nash and New Calico. These tests isolate graphical performance, and remove processor dependence from the benchmark results.
- 3DMark Vantage v1.02
- Extreme Settings: (Extreme Quality, 8x Multisample Anti-Aliasing, 16x Anisotropic Filtering, 1:2 Scale)
3DMark Vantage GPU Test: Jane Nash
Of the two GPU tests 3DMark Vantage offers, the Jane Nash performance benchmark is slightly less demanding. In a short video scene the special agent escapes a secret lair by water, nearly losing her shirt in the process. Benchmark Reviews tests this DirectX-10 scene at 1680x1050 and 1920x1200 resolutions, and uses Extreme quality settings with 8x anti-aliasing and 16x anisotropic filtering. The 1:2 scale is utilized, and is the highest this test allows. By maximizing the processing levels of this test, the scene creates the highest level of graphical demand possible and sorts the strong from the weak.
Jane Nash Extreme Quality Settings
3DMark Vantage GPU Test: New Calico
New Calico is the second GPU test in the 3DMark Vantage test suite. Of the two GPU tests, New Calico is the most demanding. In a short video scene featuring a galactic battleground, there is a massive display of busy objects across the screen. Benchmark Reviews tests this DirectX-10 scene at 1680x1050 and 1920x1200 resolutions, and uses Extreme quality settings with 8x anti-aliasing and 16x anisotropic filtering. The 1:2 scale is utilized, and is the highest this test allows. Using the highest graphics processing level available allows our test products to separate themselves and stand out (if possible).
New Calico Extreme Quality Settings
DX10: Crysis Warhead
Crysis Warhead is an expansion pack based on the original Crysis video game. Crysis Warhead is based in the future, where an ancient alien spacecraft has been discovered beneath the Earth on an island east of the Philippines. Crysis Warhead uses a refined version of the CryENGINE2 graphics engine. Like Crysis, Warhead uses the Microsoft Direct3D 10 (DirectX-10) API for graphics rendering.
Benchmark Reviews uses the HOC Crysis Warhead benchmark tool to test and measure graphics performance using the Airfield 1 demo scene. This short test places a high amount of stress on a graphics card, not only because of the detailed terrain and textures, but also because of the test settings used. Using the DirectX-10 test with Very High quality settings, the Airfield 1 demo scene receives 4x anti-aliasing and 16x anisotropic filtering to create maximum graphic load and separate the products according to their performance.
Using the highest quality DirectX-10 settings with 4x AA and 16x AF, only the most powerful graphics cards are expected to perform well in our Crysis Warhead benchmark tests. DirectX-11 extensions are not supported in Crysis: Warhead, and SSAO is not an available option.
- Crysis Warhead v1.1 with HOC Benchmark
- Moderate Settings: (Very High Quality, 4x AA, 16x AF, Airfield Demo)
Crysis Warhead Moderate Quality Settings
Aliens vs. Predator Test Results
Aliens vs. Predator is a science fiction first-person shooter video game, developed by Rebellion, and published by Sega for Microsoft Windows, Sony PlayStation 3, and Microsoft Xbox 360. Aliens vs. Predator utilizes Rebellion's proprietary Asura game engine, which had previously been used in Rogue Warrior. The self-contained benchmark tool is used for our DirectX-11 tests, which push the Asura game engine to its limit.
In our benchmark tests, Aliens vs. Predator was configured to use the highest quality settings with 4x AA and 16x AF. DirectX-11 features such as Screen Space Ambient Occlusion (SSAO) and tessellation have also been included, along with advanced shadows.
- Aliens vs. Predator
- Extreme Settings: (Very High Quality, 4x AA, 16x AF, SSAO, Tessellation, Advanced Shadows)
Aliens vs. Predator Extreme Quality Settings
DX11: BattleForge Results
BattleForge is a free Massively Multiplayer Online Role-Playing Game (MMORPG) developed by EA Phenomic with DirectX-11 graphics capability. Combining strategic cooperative battles, the community of MMO games, and trading-card gameplay, BattleForge players are free to put their creatures, spells, and buildings into any combinations they see fit. These units are represented in the form of digital cards from which you build your own unique army. With minimal resources and a custom tech tree to manage, the gameplay is unbelievably accessible and action-packed.
Benchmark Reviews uses the built-in graphics benchmark to measure performance in BattleForge, using Very High quality settings (detail) and 8x anti-aliasing with auto multi-threading enabled. BattleForge is one of the first titles to take advantage of DirectX-11 in Windows 7, and offers a very robust color range throughout the busy battleground landscape. The charted results illustrate how performance measures up between video cards when Screen Space Ambient Occlusion (SSAO) is enabled.
- BattleForge v1.2
- Extreme Settings: (Very High Quality, 8x Anti-Aliasing, Auto Multi-Thread)
EDITOR'S NOTE: AMD is aware of performance concerns with BattleForge, and offered us an official response:
"We are aware that there are some abnormal performance results in BattleForge with our new AMD Radeon HD 6900 Series graphics card. Keep in mind this is a new VLIW4 shader architecture and we are still fine tuning the shader compilation. We will be able to post a hotfix for Battleforge shortly that will provide a noticeable increase in performance."
BattleForge Extreme Quality Settings
Just Cause 2 Performance Tests
"Just Cause 2 sets a new benchmark in free-roaming games with one of the most fun and entertaining sandboxes ever created," said Lee Singleton, General Manager of Square Enix London Studios. "It's the largest free-roaming action game yet with over 400 square miles of Panaun paradise to explore, and its 'go anywhere, do anything' attitude is unparalleled in the genre." In his interview with IGN, Peter Johansson, the lead designer on Just Cause 2 said, "The Avalanche Engine 2.0 is no longer held back by having to be compatible with last generation hardware. There are improvements all over - higher resolution textures, more detailed characters and vehicles, a new animation system and so on. Moving seamlessly between these different environments, without any delay for loading, is quite a unique feeling."Just Cause 2 is one of those rare instances where the real game play looks even better than the benchmark scenes. It's amazing to me how well the graphics engine copes with the demands of an open world style of play. One minute you are diving through the jungles, the next you're diving off a cliff, hooking yourself to a passing airplane, and parasailing onto the roof of a hi-rise building. The ability of the Avalanche Engine 2.0 to respond seamlessly to these kinds of dramatic switches is quite impressive. It's not DX11 and there's no tessellation, but the scenery goes by so fast there's no chance to study it in much detail anyway.
Although we didn't use the feature in our testing, in order to equalize the graphics environment between NVIDIA and ATI, the GPU water simulation is a standout visual feature that rivals DirectX 11 techniques for realism. There's a lot of water in the environment, which is based around an imaginary Southeast Asian island nation, and it always looks right. The simulation routines use the CUDA functions in the Fermi architecture to calculate all the water displacements, and those functions are obviously not available when using an ATI-based video card. The same goes for the Bokeh setting, which is an obscure Japanese term for out-of-focus rendering. Neither of these techniques uses PhysX, but they do use specific computing functions that are only supported by NVIDIA's proprietary CUDA architecture.
There are three scenes available for the in-game benchmark, and I used the last one, "Concrete Jungle" because it was the toughest and it also produced the most consistent results. That combination made it an easy choice for the test environment. All Advanced Display Settings were set to their highest level, and Motion Blur was turned on, as well.
Just Cause 2 Concrete Jungle Benchmark High Quality Settings
Lost Planet 2 DX11 Benchmark Results
Lost Planet 2 is the second installment in the saga of the planet E.D.N. III, ten years after the story of Lost Planet: Extreme Condition. The snow has melted and the lush jungle life of the planet has emerged with angry and luscious flora and fauna. With the new environment comes the addition of DirectX-11 technology to the game.
Lost Planet 2 takes advantage of DX11 features including tessellation and displacement mapping on water, level bosses, and player characters. In addition, soft body compute shaders are used on 'Boss' characters, and wave simulation is performed using DirectCompute. These cutting edge features make for an excellent benchmark for top-of-the-line consumer GPUs.
The Lost Planet 2 benchmark offers two different tests, which serve different purposes. This article uses tests conducted on benchmark B, which is designed to be a deterministic and effective benchmark tool featuring DirectX 11 elements.
- Lost Planet 2 Benchmark 1.0
- Moderate Settings: (2x AA, Low Shadow Detail, High Texture, High Render, High DirectX 11 Features)
Lost Planet 2 Moderate Quality Settings
DX11: Metro 2033
Metro 2033 is an action-oriented video game with a combination of survival horror, and first-person shooter elements. The game is based on the novel Metro 2033 by Russian author Dmitry Glukhovsky. It was developed by 4A Games in Ukraine and released in March 2010 for Microsoft Windows. Metro 2033 uses the 4A game engine, developed by 4A Games. The 4A Engine supports DirectX-9, 10, and 11, along with NVIDIA PhysX and GeForce 3D Vision.
The 4A engine is multi-threaded such that only PhysX has a dedicated thread; it uses a task model without any pre-conditioning or pre/post-synchronizing, allowing tasks to be done in parallel. The 4A game engine can utilize a deferred shading pipeline, uses tessellation for greater performance, and also offers HDR (complete with blue shift), real-time reflections, color correction, film grain and noise, and multi-core rendering.
Metro 2033 features superior volumetric fog, double PhysX precision, object blur, sub-surface scattering for skin shaders, parallax mapping on all surfaces, and greater geometric detail with less aggressive LODs. Using PhysX, the engine provides features such as destructible environments, cloth and water simulation, and particles that can be fully affected by environmental factors.
NVIDIA has been diligently working to promote Metro 2033, and for good reason: it's one of the most demanding PC video games we've ever tested. When their flagship GeForce GTX 480 struggles to produce 27 FPS with DirectX-11 anti-aliasing turned down to its lowest setting, you know that only the strongest graphics processors will generate playable frame rates. All of our tests enable Advanced Depth of Field and Tessellation effects, but disable advanced PhysX options.
- Metro 2033
- Moderate Settings: (Very-High Quality, AAA, 16x AF, Advanced DoF, Tessellation, Frontline Scene)
Metro 2033 Moderate Quality Settings
Unigine Heaven 2.1 Benchmark
The Unigine Heaven 2.1 benchmark is a free, publicly available tool that exercises DirectX-11 graphics capabilities on Windows 7 or updated Vista operating systems. It reveals the enchanting magic of floating islands with a tiny village hidden in the cloudy skies, and with the interactive mode, the experience of exploring this intricate world is within reach. Through its advanced renderer, Unigine was one of the first to showcase art assets with tessellation, bringing compelling visual finesse, utilizing the technology to its full extent, and exhibiting the possibilities of enriching 3D gaming.
The distinguishing feature of the Unigine Heaven benchmark is hardware tessellation: a scalable technology aimed at automatic subdivision of polygons into smaller and finer pieces, so that developers can give their games a more detailed look almost free of charge in terms of performance. Thanks to this procedure, the rendered image approaches the boundary of realistic visual perception.
Although Heaven-2.1 was recently released and used for our DirectX-11 tests, the benchmark results were extremely close to those obtained with Heaven-1.0 testing. Since only DX11-compliant video cards will properly test on the Heaven benchmark, only those products that meet the requirements have been included.
- Unigine Heaven Benchmark 2.1
- Extreme Settings: (High Quality, Normal Tessellation, 16x AF, 4x AA)
Heaven 2.1 Extreme Quality Settings
ASUS ENGTX560 Ti DCII TOP Temperatures
Benchmark tests are always nice, so long as you care about comparing one product to another. But when you're an overclocker, gamer, or merely a PC hardware enthusiast who likes to tweak things on occasion, there's no substitute for good information. Benchmark Reviews has a very popular guide written on Overclocking Video Cards, which gives detailed instruction on how to tweak a graphics card for better performance. Of course, not every video card has overclocking headroom. Some products run so hot that they can't suffer any higher temperatures than they already do. This is why we measure the operating temperature of the video card products we test.
To begin my testing, I use GPU-Z to measure the idle temperature as reported by the GPU. Next I use FurMark's "Torture Test" to generate maximum thermal load and record GPU temperatures in high-power 3D mode. FurMark does two things extremely well: it drives the thermal output of any graphics processor much higher than any video game realistically could, and it does so consistently every time. FurMark works great for testing the stability of a GPU as the temperature rises to its highest possible output. During all tests, the ambient room temperature remained stable at 20°C. The temperatures discussed below are absolute maximum values, and may not be representative of real-world temperatures while gaming:
Load | Fan Speed | GPU Temperature |
---|---|---|
Idle | 17% - AUTO (1380rpm) | 32°C |
FurMark | 50% - AUTO (3200rpm) | 83°C |
FurMark | 100% - Manual (4620rpm) | 73°C |
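The idle and FurMark figures above are point readings; in practice I watch several samples per state in GPU-Z. As a minimal sketch of how a handful of such readings could be summarized (the helper function and the hard-coded sample values are illustrative only, not the review's actual tooling):

```python
def summarize(samples_c):
    """Return (min, max, mean) for a list of GPU temperature readings in Celsius."""
    return min(samples_c), max(samples_c), sum(samples_c) / len(samples_c)

# Illustrative readings mirroring the two test states above.
idle = [32, 32, 33, 32]           # fans at 17% AUTO (1380rpm)
furmark_auto = [79, 82, 83, 83]   # fans at 50% AUTO (3200rpm)

for label, samples in (("Idle", idle), ("FurMark (auto fan)", furmark_auto)):
    lo, hi, avg = summarize(samples)
    print(f"{label}: min {lo}C, max {hi}C, mean {avg:.2f}C")
```

Logging min/max/mean rather than a single reading smooths out the momentary dips FurMark's fan ramp-up can cause.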
Even though the card carries two 80mm fans, fan speeds can be read with MSI Afterburner without problems. At idle, the GPU is nearly silent: the fans rotate at 1380rpm (17%), keeping the temperature at merely 32°C. However, once I launched FurMark, temperatures skyrocketed to 83°C in auto mode. This resulted in a very noisy setup, with both fans running at 3200rpm (50%, as Afterburner reports it). Manually setting the fans to 100% produced 4620rpm; at that speed, temperatures didn't pass 73°C, which is still quite high for a GTX 560 Ti, especially if you're paying extra money for a premium cooler.
When I started gaming with some demanding titles, the temperatures were much lower. After running Unigine's Heaven benchmark for 30 minutes, the GPU core barely reached 75°C, which still isn't great for overclocking. Other games like Metro 2033 and Crysis produced less heat, barely passing 70°C. At this point, I'm very disappointed with the ASUS DirectCU II cooler design. ASUS did manage to deliver a nearly silent GPU at idle, but the cooler falls short under load. I was expecting something more like the MSI GTX 560 Ti Twin Frozr design, which keeps the GPU core below 60°C. Instead, ASUS ships a card that barely does better than NVIDIA's reference cooler; it just looks nicer.
ASUS ENGTX560 Ti DCII TOP Overclocking
When it comes to overclocking, I usually get excited and try many things to achieve the best solid overclock the GPU can hold, especially if it's known to be a good overclocker, which is the case with the GTX 560 Ti. Now that we have voltage control over many GPUs via software applications like MSI Afterburner, the only thing we need to keep in mind is heat. Since this is an already overclocked card, I'm not expecting to see that much of a change. I quickly installed the latest version of MSI Afterburner (2.1.7) to start doing some 3DMark damage at the ORB.
It turns out that with a so-so cooler design, you can't achieve superb clocks. While I reached 970MHz fully stable in FurMark and the benchmarks with just 1075mV, I couldn't reach the magic 1GHz mark, not even by increasing core voltage up to 1150mV (the limit in MSI Afterburner). In fact, when I increased voltages, temperatures got worse and I had more trouble stabilizing the GPU at the same frequency. Keep in mind this is still an 18% overclock over NVIDIA's reference model, and it leads to better gaming performance. Temperatures also increased by 2-3 degrees, but that's nothing to be afraid of (no more than I already was after looking at those ugly results above).
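The overclock percentages quoted in this section follow directly from the clock ratios against NVIDIA's 822MHz reference core; a quick sketch of the arithmetic:

```python
def oc_percent(new_mhz, base_mhz):
    """Overclock expressed as a percentage over the base clock."""
    return (new_mhz / base_mhz - 1) * 100

REFERENCE_CORE = 822  # NVIDIA reference GTX 560 Ti core clock (MHz)

print(f"ASUS TOP factory OC: {oc_percent(900, REFERENCE_CORE):.0f}%")
print(f"Manual OC achieved:  {oc_percent(970, REFERENCE_CORE):.0f}%")
```

900MHz works out to roughly 9% over reference, and the 970MHz manual result to roughly 18%, matching the figures above.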
VGA Power Consumption
Life is not as affordable as it used to be, and items such as gasoline, natural gas, and electricity all top the list of resources whose prices have exploded over the past few years. Add to this the limit of non-renewable resources compared to current demands, and you can see that prices are only going to get worse. Planet Earth needs our help, and needs it badly. With forests becoming barren of vegetation and snow-capped poles quickly turning brown, the technology industry has a new attitude towards turning "green". I'll spare you the powerful marketing hype that gets sent from various manufacturers every day, and get right to the point: your computer hasn't been doing much to help save energy... at least up until now. Take a look at the idle clock rates that NVIDIA programmed into the BIOS for this GPU; no special power-saving software utilities are required.
The GTX 560 Ti runs at 50.6/101/67.5MHz in idle mode, while VDDC drops to 0.950 volts. At full load it increases frequencies to 900/1050MHz, and VDDC rises to 1.025 volts. The good part is that GTX 500 series cards can be overclocked while keeping these idle frequencies, saving some energy and keeping temperatures low. To be clear, non-overclocked GTX 560 models raise their load voltage only to 1.00V instead of 1.025V; for reference, my sample worked successfully at 1.00V load voltage while keeping GPU frequencies at 900/1050MHz.
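To put those idle clocks in perspective, dynamic power in CMOS logic scales roughly with frequency times voltage squared. This is a textbook approximation, not a figure NVIDIA or ASUS publishes, so treat the sketch below as a rough illustration only:

```python
def relative_dynamic_power(f_mhz, v_volts, f_ref_mhz, v_ref_volts):
    """Dynamic CMOS power approximation, P ~ f * V^2, relative to a reference state."""
    return (f_mhz / f_ref_mhz) * (v_volts / v_ref_volts) ** 2

# Idle state (50.6 MHz core @ 0.950 V) vs. full 3D state (900 MHz @ 1.025 V).
ratio = relative_dynamic_power(50.6, 0.950, 900, 1.025)
print(f"Idle dynamic power is roughly {ratio:.1%} of the 3D-load level")
```

Even as a crude estimate, it shows why the deep idle clocks matter far more than the modest voltage drop.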
To measure isolated video card power consumption, I used the Kill-A-Watt EZ (model P4460) power meter made by P3 International. A baseline test is taken without a video card installed inside our computer system, which is allowed to boot into Windows and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen. Our final loaded power consumption reading is taken with the video card running a stress test using FurMark. Below is a chart with the isolated video card power consumption (not system total) displayed in Watts for each specified test product:
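The baseline-subtraction method just described is simple arithmetic; a minimal sketch using this card's Kill-A-Watt readings (a 72W no-card baseline against 91W idle and 291W loaded system draws, the figures reported later in this section):

```python
def isolated_card_power(system_watts, baseline_watts):
    """Subtract the no-card baseline reading from the full-system reading
    to estimate the video card's own power draw."""
    return system_watts - baseline_watts

BASELINE = 72  # system reading with no video card installed, in watts

print(f"Idle card draw:   {isolated_card_power(91, BASELINE)} W")
print(f"Loaded card draw: {isolated_card_power(291, BASELINE)} W")
```

Note this is an estimate: PSU efficiency varies with load, so the wall-socket delta slightly overstates what the card actually draws from its rails.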
VGA Product Description (sorted by combined total power) | Idle Power | Loaded Power |
---|---|---|
NVIDIA GeForce GTX 480 SLI Set | 82 W | 655 W |
NVIDIA GeForce GTX 590 Reference Design | 53 W | 396 W |
ATI Radeon HD 4870 X2 Reference Design | 100 W | 320 W |
AMD Radeon HD 6990 Reference Design | 46 W | 350 W |
NVIDIA GeForce GTX 295 Reference Design | 74 W | 302 W |
ASUS GeForce GTX 480 Reference Design | 39 W | 315 W |
ATI Radeon HD 5970 Reference Design | 48 W | 299 W |
NVIDIA GeForce GTX 690 Reference Design | 25 W | 321 W |
ATI Radeon HD 4850 CrossFireX Set | 123 W | 210 W |
ATI Radeon HD 4890 Reference Design | 65 W | 268 W |
AMD Radeon HD 7970 Reference Design | 21 W | 311 W |
NVIDIA GeForce GTX 470 Reference Design | 42 W | 278 W |
NVIDIA GeForce GTX 580 Reference Design | 31 W | 246 W |
NVIDIA GeForce GTX 570 Reference Design | 31 W | 241 W |
ATI Radeon HD 5870 Reference Design | 25 W | 240 W |
ATI Radeon HD 6970 Reference Design | 24 W | 233 W |
NVIDIA GeForce GTX 465 Reference Design | 36 W | 219 W |
NVIDIA GeForce GTX 680 Reference Design | 14 W | 243 W |
Sapphire Radeon HD 4850 X2 11139-00-40R | 73 W | 180 W |
NVIDIA GeForce 9800 GX2 Reference Design | 85 W | 186 W |
NVIDIA GeForce GTX 780 Reference Design | 10 W | 275 W |
NVIDIA GeForce GTX 770 Reference Design | 9 W | 256 W |
NVIDIA GeForce GTX 280 Reference Design | 35 W | 225 W |
NVIDIA GeForce GTX 260 (216) Reference Design | 42 W | 203 W |
ATI Radeon HD 4870 Reference Design | 58 W | 166 W |
NVIDIA GeForce GTX 560 Ti Reference Design | 17 W | 199 W |
NVIDIA GeForce GTX 460 Reference Design | 18 W | 167 W |
AMD Radeon HD 6870 Reference Design | 20 W | 162 W |
NVIDIA GeForce GTX 670 Reference Design | 14 W | 167 W |
ATI Radeon HD 5850 Reference Design | 24 W | 157 W |
NVIDIA GeForce GTX 650 Ti BOOST Reference Design | 8 W | 164 W |
AMD Radeon HD 6850 Reference Design | 20 W | 139 W |
NVIDIA GeForce 8800 GT Reference Design | 31 W | 133 W |
ATI Radeon HD 4770 RV740 GDDR5 Reference Design | 37 W | 120 W |
ATI Radeon HD 5770 Reference Design | 16 W | 122 W |
NVIDIA GeForce GTS 450 Reference Design | 22 W | 115 W |
NVIDIA GeForce GTX 650 Ti Reference Design | 12 W | 112 W |
ATI Radeon HD 4670 Reference Design | 9 W | 70 W |
The ENGTX560 Ti DCII TOP pulled just 19 watts (91 W minus the 72 W baseline) at idle and 219 watts (291 − 72) when running full out, using the test method outlined above. The ASUS ENGTX560 Ti does consume more power than the reference board, but that's to be expected: it drives two 80mm fans and runs at 900MHz with 25 mV of extra core voltage to do the job. Factor PSU efficiency into the equation as well, since these are wall-socket readings taken through an 80 Plus Bronze power supply. We've become used to the low-power ways of the newest processors, and there's no turning back.
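The subtraction above is simple enough to express as a tiny helper. This is only an illustrative sketch of the arithmetic, not part of the review's tooling, and the ~85% efficiency figure assumed for an 80 Plus Bronze PSU near 50% load is a rough typical value rather than a measured one.

```python
# Illustrative sketch of the isolated-power arithmetic used above.
# The Kill-A-Watt measures wall-socket (AC) watts, so subtracting the
# no-card baseline isolates the video card's share of the AC draw.

def card_power(reading_with_card, baseline):
    """AC watts attributable to the video card alone."""
    return reading_with_card - baseline

def dc_estimate(ac_watts, psu_efficiency=0.85):
    """Rough DC-side draw; ~85% is a typical 80 Plus Bronze figure
    near 50% load (an assumption, not a measured value)."""
    return ac_watts * psu_efficiency

idle_w = card_power(91, 72)    # 19 W at idle, as reported above
load_w = card_power(291, 72)   # 219 W under FurMark
```

Because the meter sits on the AC side, the card's true DC draw is somewhat lower than the wall-socket difference suggests, which is why PSU efficiency matters when comparing against vendor TDP figures.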
I'll offer some final thoughts and my conclusion on the next pages...
ASUS ENGTX560 Ti DCII TOP Conclusion
IMPORTANT: Although the rating and final score mentioned in this conclusion are made to be as objective as possible, please be advised that every author perceives these factors differently at various points in time. While we each do our best to ensure that all aspects of the product are considered, there are often times unforeseen market conditions and manufacturer changes which occur after publication that could render our rating obsolete. Please do not base any purchase solely on our conclusion, as it represents our product rating specifically for the product tested which may differ from future versions. Benchmark Reviews begins our conclusion with a short summary for each of the areas that we rate.
From a performance standpoint, the GTX 560 Ti exceeds my expectations. I'm a little jaded, I guess, about manufacturers' claims, but NVIDIA didn't pull any punches with this update. The GTX 460 is already a widely accepted success in the marketplace, and the GTX 560 Ti is simply better. In many games there's a performance increase of up to 40-50% over the GTX 460. I'm sure the 78MHz factory overclock (900MHz versus the 822MHz reference clock) plays a part here, but it's still a very good solution. I'm just disappointed with the ASUS DCII cooling solution. I tried re-installing the heatsink and checking whether it was applying enough pressure against the GPU core, and I even re-applied the thermal paste, but the results were the same. How often do you get a great-looking cooler that performs no better than the plain reference design? On the other hand, power requirements are very modest: NVIDIA recommends a 500W PSU, which is actually below the minimum I would personally consider for any modern gaming rig.
The appearance of the ENGTX560 Ti DCII TOP video card is very attractive; the stylish fan shroud with its red stripes gives it a sporty look. The matte black PCB and rounded edges, along with the side bar that prevents PCB warping, help even more. There are also three ASUS logos on the card that are visible through any windowed PC chassis. ASUS did a nice job producing a subtle, business-like design that still manages to show off its muscles. Too bad the heatsink doesn't perform as well as it looks.
The build quality of the ENGTX560 Ti DCII TOP was quite good. Everything is assembled well, everything fit when I put it back together, and the overall impression of the card was very solid. The cooler, PCB, and small heatsinks add heft to the card and lend a good deal of rigidity to the package. The packaging was of the highest quality and very informative. The upgraded power supply arrangement with Super Alloy Power components gives a good impression, and the uncrowded layout on the back of the PCB shows ASUS has been working on its PCB designs lately. Even under overclocked conditions the GPU felt very solid, and it never stumbled or complained once.
The basic features of the ENGTX560 Ti DCII TOP are fully comparable with the latest offerings from both camps: Microsoft DirectX 11 support, PhysX technology, 3D Vision and 3D Vision Surround readiness, CUDA technology, SLI, 32x anti-aliasing, PureVideo HD, and HDMI 1.4a support. If PhysX and 3D Vision Surround matter to you, then you are already firmly anchored in the NVIDIA camp, and the GTX 560 Ti is just icing on the cake. Comparing that with AMD's offerings, I think NVIDIA wins this match. I still think NVIDIA should design a card that fully supports Surround mode in a single solution (without SLI), as AMD does with its Eyefinity GPUs.
As of late January 2011, the price for the ENGTX560 Ti is $249.99 at Newegg. There are currently no rebates available, and ASUS is not bundling any popular games at the moment, so factor that into your purchasing decision. The price-to-performance ratio of this GPU is so good that there's not much downside anywhere. This particular model offers the DirectCU II cooling system, which works great at idle but doesn't impress under load. Still, I think it looks better than NVIDIA's reference design. AMD has just issued its challenge in the form of new 1GB versions of the HD 6950 that are priced very aggressively, and I look forward to comparing that new competitor in the near future.
Almost any GTX 560 Ti card is going to get high marks at this stage of the game. NVIDIA has brought some pretty amazing performance improvements to a graphics platform that was already very competitive. AMD has responded with serious price cuts on the HD 6870 ($219) and released a value version of the HD 6950 ($269) that will bracket the GTX 560 Ti in price, but as of today I think this is the card to beat in the $250 price range. One reason is the continued presence of serious overclocking headroom in this upgraded GPU: I got 970 MHz on the core clock with very little effort.
Pros:
+ Lower temps than reference design at idle
+ Performance improvement over GTX 460 is impressive (20-50%)
+ Overclocking headroom is similar to the GTX 460
+ PhysX capability is a great feature with minimal FPS impact
+ Upgraded power supply design with high quality components
+ Low price penalty for enhanced performance and features
+ Manufacturing quality is close to the very top
+ Industry leading 3D support by NVIDIA
Cons:
- Hot air from GPU cooler stays inside case
- DirectCU II cooler's performance was way below expectations
Ratings:
- Performance: 9.25
- Appearance: 9.25
- Construction: 9.00
- Functionality: 9.50
- Value: 9.00
Final Score: 9.20 out of 10.
Excellence Achievement: Benchmark Reviews Golden Tachometer Award
What do you think of the ASUS ENGTX560 Ti DCII TOP Video Card? Leave your comment below or ask questions in our Discussion Forum.
Related Articles:
- OCZ Summit MLC SSD OCZSSD2-1SUM120G
- PNY XLR8 GeForce GTX 670 Video Card
- Sapphire Radeon HD 4870 Toxic Video Card
- OCZ Reaper OCZ2RPR10664GK DDR2 Memory Kit
- QNAP TS-119 Gigabit NAS Server
- Cooler Master SNA 95 RP-095-D19A-A1
- QNAP NVR-1012 Wireless Network Surveillance Kit
- Mad Catz Tritton Pro+ 5.1 Surround Headset
- Cyber Acoustics AC-401 Gaming Headset
- ASUS Crosshair IV Formula AMD-890FX Motherboard
Comments
(be back after I read this)
Shame, I like ASUS normally.
Hope this helps, I have this card and it's plain awesome :D
wtf do you really expect?
a new GPU and memory with the same name on the card?
it is the same card with a new BIOS and higher clocks; no TOP written anywhere but on the packaging
hmmm, how can we sell the leftover stock and still make a profit
My PC is amazing, bet it is better than yours.
I do have a major headache though. When I run CPU-Z or ASUS smart doctor, it says the GPU clock is only running at 830MHz and the shader at 1660MHz.
Anyone knows if it will underclock itself in idle mode?
I am getting desperate because it IS the "TOP" version of the card. And I can't find an answer anywhere.
I haven't had the time to test it with FurMark or anything like that.
I really hope that's the case with this card as well!
I haven't been able to find that info anywhere else...
I was really bummed out, because it seemed like I paid for the "TOP" version of the card, but only got the stock one.
But I don't get it, if that's the case. Why do they even make a "TOP" version of the card?
If the stock version will overclock itself as well.
So the card I received is the stock version, even though the box says "TOP"
In idle mode it should underclock far more than that.
35% fan speed: inaudible
40%: humming
50%: high pitch humming/whining
65-100%: HIGH PITCHED SCREAMING
Got a Gigabyte OC card instead, also at 900MHz. I don't hear it until 60% fan speed, and even then it's just air noise as opposed to screeching and whining
- Low temps at idle, but unimpressive temps under full load/stress; I expected lower given my Antec 900 case
- With my E8500 processor, I'm pleased with its performance
- It's relatively easy to clean the dust off
- As usual, the shroud of this video card cannot be removed for a more thorough cleanup
- The construction doesn't feel solid to me, because the heatpipes aren't fastened rigidly enough to the card where they join the upper cooler