XFX Radeon HD5750 Video Card HD-575X-ZNF7
Reviews - Featured Reviews: Video Cards
Written by Bruce Normann
Wednesday, 21 October 2009
The launch of the Radeon HD57xx Juniper-GPU series cards is going very smoothly. ATI learned some hard lessons when they launched the HD4850 back in 2008. All the partners seem to have their cards ready for distribution this time, and there's no price gouging, thanks to the stable supply. This is doubly important for the HD57xx, since they're in the middle of the pack, performance-wise, and there are lots of competitors. XFX is one of the premium retail partners in the video card industry, although they're a relative newcomer to the ATI camp, and they've supplied Benchmark Reviews with a model HD-575X-ZNF7 Radeon HD5750 to review. We recently looked at an early engineering sample of the HD5770; now we have the opportunity to take a look at a production version of the lower-priced companion card, the XFX Radeon HD5750. We already know it's not going to challenge the HD5770, but can it beat out its real competition at the lower price point?
These mid-range cards compete in a much more crowded market, with a lot more competitors overlapping into their performance and price zones. It's much more difficult to hit the bulls-eye in a market teeming with old standards and new stars, and of course, it's not a static target. Every day the market shifts; sometimes imperceptibly, sometimes radically. The target has been a bit jumpy these last few months, so let's see where this arrow lands.

About the company: XFX
| Product Series | Stream Processors | Core Clock (MHz) | Shader Clock (MHz) | Memory Clock (MHz) | Memory Amount | Memory Interface |
|---|---|---|---|---|---|---|
| MSI Radeon HD4830 (R4830 T2D512) | 640 | 585 | N/A | 900 | 512MB GDDR3 | 256-bit |
| ASUS Radeon HD4850 (EAH4850 TOP) | 800 | 680 | N/A | 1050 | 512MB GDDR3 | 256-bit |
| XFX Radeon HD5750 (HD-575X-ZN) | 720 | 700 | N/A | 1150 | 1024MB GDDR5 | 128-bit |
| ATI Radeon HD5770 (Engineering Sample) | 800 | 850 | N/A | 1200 | 1024MB GDDR5 | 128-bit |
| ASUS GeForce GTX 260 (ENGTX260 MATRIX) | 216 | 576 | 1242 | 999 | 896MB GDDR3 | 448-bit |
| MSI GeForce GTX 275 (N275GTX Twin Frozr OC) | 240 | 666 | 1476 | 1161 | 896MB GDDR3 | 448-bit |
- MSI Radeon HD4830 (R4830 T2D512 - Catalyst 8.66.6_Beta1)
- ASUS Radeon HD4850 (EAH4850 TOP - Catalyst 8.66.6_Beta1)
- XFX Radeon HD5750 (HD-575X-ZN - Catalyst 8.66.6_Beta1)
- ATI Radeon HD5770 (Engineering Sample - Catalyst 8.66.6_Beta1)
- ASUS GeForce GTX 260 (ENGTX260 MATRIX - Forceware v190.62)
- MSI GeForce GTX 275 (N275GTX Twin Frozr OC - Forceware v190.62)
Now we're ready to begin testing video game performance with these video cards, so please continue to the next page as we start with the 3DMark Vantage results.
3DMark Vantage Benchmark Results
3DMark Vantage is a computer benchmark by Futuremark (formerly named MadOnion) that measures the DirectX 10 gaming performance of graphics cards. A 3DMark score is an overall measure of your system's 3D gaming capabilities, based on comprehensive real-time 3D graphics and processor tests. By comparing your score with those submitted by millions of other gamers, you can see how your gaming rig performs, making it easier to choose the most effective upgrades or find other ways to optimize your system.
There are two graphics tests in 3DMark Vantage: Jane Nash (Graphics Test 1) and New Calico (Graphics Test 2). The Jane Nash test scene represents a large indoor game scene with complex character rigs, physical GPU simulations, multiple dynamic lights, and complex surface lighting models. It uses several hierarchical rendering steps, including for water reflection and refraction, and physics simulation collision map rendering. The New Calico test scene represents a vast space scene with lots of moving but rigid objects and special content like a huge planet and a dense asteroid belt.
At Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each test run, 3DMark is a reliable tool for comparing graphics cards against one another.
1680x1050 is rapidly becoming the new 1280x1024. More and more widescreen monitors are being sold with new systems or as upgrades to existing ones. Even in tough economic times, the tide cannot be turned back; screen resolution and size will continue to creep up. Using this resolution as a starting point, the maximum settings applied to 3DMark Vantage include 8x Anti-Aliasing, 16x Anisotropic Filtering, all quality levels at Extreme, and Post Processing Scale at 1:2.
The two test scenes in 3DMark Vantage provide a varied and modern set of challenges for the video cards and their subsystems, as described above. The results always produced higher frame rates for GT1 and so far, I haven't seen any curveball results like I used to see with 3DMark06. The XFX Radeon HD5750 basically equaled the performance of an overclocked (ASUS TOP series) HD4850 card in both GT1 and GT2. In both test cases, the HD5750 easily beat an HD4830. The HD5770 and GTX260 are in another league from the HD5750, though. There's no pretending that it's close; the extra stream processors in the HD5770 really do make a difference. The GTX275 pulls far away from the middle of the pack, as it should for the price difference.
At a higher screen resolution, 1920x1200, the story changes a bit, as the HD5750 pulls out a two-FPS lead on the HD4850. I know two FPS doesn't sound like much, but it's a 25% increase over the performance of the HD4850, so it's nothing to sneeze at. The HD5750 doesn't get any closer to the HD5770 or the GTX260, though. The 128-bit memory bus doesn't seem to hurt the card at higher resolutions. Just like the HD5770, the HD5750 beats the older, lower-spec HD48xx series cards, but it doesn't blow them out of the water, and wouldn't be as much of an upgrade for Radeon users with cards that are 1-2 years old. We need to look at actual gaming performance to verify that, so let's take a look in the next section at how these cards stack up in the standard bearer for gaming benchmarks, Crysis.
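The "two FPS is 25%" claim above is just percent-change arithmetic. As a minimal sketch (the FPS values below are illustrative, assuming the HD4850 averaged about 8 FPS at these extreme settings, and are not taken from the charts):

```python
def percent_gain(old_fps: float, new_fps: float) -> float:
    """Percentage improvement of new_fps over old_fps."""
    return (new_fps - old_fps) / old_fps * 100.0

# Illustrative values only: a 2 FPS lead over an 8 FPS baseline
print(percent_gain(8.0, 10.0))  # 25.0
```

At single-digit frame rates, even a tiny absolute lead is a large relative one, which is why small FPS deltas matter more at punishing settings.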
Crysis Benchmark Results
Crysis uses a new graphics engine: the CryENGINE2, which is the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX 10) framework, but it can also run using DirectX 9 on Vista, Windows XP, and the new Windows 7. As we'll see, there are significant frame rate reductions when running Crysis in DX10. It's not an operating system issue; DX9 works fine in Windows 7, but DX10 cuts the frame rates in half.
Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multicore processor architectures, CPU intensive subsystems of CryENGINE 2 such as physics, networking and sound, have been re-written to support multi-threading.
Crysis offers an in-game benchmark tool, which is similar to World in Conflict. This short test does place some high amounts of stress on a graphics card, since there are so many landscape features rendered. For benchmarking purposes, Crysis can mean trouble as it places a high demand on both GPU and CPU resources. Benchmark Reviews uses the Crysis Benchmark Tool by Mad Boris to test frame rates in batches, which allows the results of many tests to be averaged.
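The batch-averaging approach described above can be sketched in a few lines; the run data below is hypothetical, not actual results from this review:

```python
from statistics import mean

def average_runs(runs: list[list[float]]) -> float:
    """Average the mean FPS of several benchmark runs into one composite figure,
    smoothing out run-to-run variation the way a batch tool does."""
    return mean(mean(run) for run in runs)

# Hypothetical per-sample FPS readings from three batched Crysis runs
runs = [[28.0, 30.0, 32.0], [29.0, 31.0, 33.0], [27.0, 29.0, 31.0]]
print(round(average_runs(runs), 1))  # 30.0
```

Averaging several runs matters for a CPU- and GPU-heavy title like Crysis, where a single run can be skewed by background activity.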
Low-resolution testing allows the graphics processor to plateau at its maximum output performance, and shifts demand onto the other system components. At the lower resolutions, Crysis will reflect the GPU's top-end speed in the composite score, indicating full-throttle performance with little load. This makes for a less GPU-dependent test environment, but it is sometimes helpful in creating a baseline for measuring maximum output performance. At the 1280x1024 resolution used by 17" and 19" monitors, the CPU and memory have too much influence on the results for it to be useful in a video card test. At the widescreen resolutions of 1680x1050 and 1920x1200, the performance differences between the video cards under test are mostly down to the cards themselves.
In my review of the HD5770, I said I was shocked by these numbers, and nothing has changed. Running XP-based systems and DirectX 9, the latest generation of video cards was starting to get a handle on Crysis. Certainly, in this test, with no anti-aliasing dialed in, any of the tested cards running in DX9 provided a usable solution. Now only the highest performing boards get close to an average frame rate of 30FPS. It seems like we've gone back in time, back to when only two or three video cards could run Crysis with all the eye candy turned on. Now we'll have to wait until CryEngine3 comes out and is optimized for the current generation of graphics APIs.
The results here are a bit disheartening, in that the HD5750 actually gets owned by the older, DX9-era HD4850, albeit an overclocked version. We might be able to make the best of a bad situation by overclocking the HD5750 to even up the match a bit, but the fact is, they are roughly equal at stock settings. Compared to the HD4830, there's not a big enough jump to justify upgrading if you want to run this game in DirectX 10. This may not be a universal problem; we'll have to see, later on.
Once a decent amount of anti-aliasing is factored in, the HD5750 pulls up its bootstraps and moves ahead of the HD4850 a bit. All those little improvements ATI made to the rendering processor pay off here. It's especially noticeable at the higher resolution. Frame rates are still well below acceptable until you get to the high end cards. If you want to play this game in DX10, you are going to have to pay the man...
In our next section, Benchmark Reviews tests with Devil May Cry 4 Benchmark. Read on to see how a blended high-demand GPU test with low video frame buffer demand will impact our test products.
Devil May Cry 4 Benchmark
Devil May Cry 4 was released for the PC platform in 2008 as the fourth installment in the Devil May Cry video game series. DMC4's console versions are direct ports from the PC platform, operating at the native 720P game resolution with no other platform restrictions. Devil May Cry 4 uses the refined MT Framework game engine, which has been used for many popular Capcom game titles over the past several years.
MT Framework is an exclusive seventh-generation game engine built for games developed for the PlayStation 3, Xbox 360, and PC. MT stands for "Multi-Thread", "Meta Tools" and "Multi-Target". Capcom originally intended to use an outside engine, but none matched their specific requirements for performance and flexibility. Games using the MT Framework are originally developed on the PC and then ported to the two console platforms.
On the PC version, a special bonus called Turbo Mode is featured, giving the game a slightly faster speed, and a new difficulty called Legendary Dark Knight Mode is implemented. The PC version also has both DirectX 9 and DirectX 10 modes for the Windows XP, Vista, and Windows 7 operating systems.
It's always nice to be able to compare the results we produce here at Benchmark Reviews with the results you get on your own computer system. Usually this isn't possible, since differing settings and configurations make it nearly impossible to match one system to the next; plus, you have to own the game or benchmark tool we used.
Devil May Cry 4 fixes this by offering a free benchmark tool available for download. Because the DMC4 MT Framework game engine is rather low-demand for today's cutting-edge video cards, Benchmark Reviews uses the 1920x1200 resolution to test, with 8x AA (the highest AA setting available to Radeon HD video cards) and 16x AF.
Devil May Cry 4 is not as demanding a benchmark as it used to be. Only scene #2 and #4 are worth looking at from the standpoint of trying to separate the fastest video cards from the slower ones. Still, it represents a typical environment for many games that our readers still play on a regular basis, so it's good to see what works with it and what doesn't. Any of the tested cards will do a credible job in this application, and the performance scales in a pretty linear fashion. You get what you pay for when running this game, at least for benchmarks. This is one time where you can generally use the maximum available anti-aliasing settings, so NVIDIA users should feel free to crank it up to 16X. The DX10 "penalty" is of no consequence here.
The HD5750 once again loses out to the HD4850 and falls far behind the HD5770 and the GTX260. They all provide excellent frame rates, however, well above the recommended minimums. The surprise of this test is the excellent performance of both the HD4850 and the HD4830. There's something about those two old soldiers that just loves this game. Suffice it to say, if you are getting 60+ frames per second in all your video games, you don't need a video card upgrade.
Our next benchmark of the series is for a very popular FPS game that rivals Crysis for world-class graphics.
Far Cry 2 Benchmark Results
Ubisoft has developed Far Cry 2 as a sequel to the original, but with a very different approach to game play and story line. Far Cry 2 features a vast world built on Ubisoft's new game engine called Dunia, meaning "world", "earth" or "living" in Farsi. The setting in Far Cry 2 is a fictional Central African landscape, set in a modern-day timeline.
The Dunia engine was built specifically for Far Cry 2 by the Ubisoft Montreal development team. It delivers realistic semi-destructible environments, special effects such as dynamic fire propagation and storms, real-time night-and-day sunlight and moonlight cycles, a dynamic music system, and non-scripted enemy A.I. actions.
The Dunia game engine takes advantage of multi-core processors as well as multiple processors, and supports DirectX 9 as well as DirectX 10. Only 2 or 3 percent of the original CryEngine code is re-used, according to Michiel Verheijdt, Senior Product Manager for Ubisoft Netherlands. Additionally, the engine is less hardware-demanding than CryEngine 2, the engine used in Crysis. However, it should be noted that Crysis delivers greater character and object texture detail, as well as more destructible elements within the environment: for example, trees breaking into many smaller pieces and buildings breaking down to their component panels. Far Cry 2 also supports the amBX technology from Philips. With the proper hardware, this adds effects like vibrations, ambient colored lights, and fans that generate wind effects.
There is a benchmark tool in the PC version of Far Cry 2, which offers an excellent array of settings for performance testing. Benchmark Reviews used the maximum settings allowed for our tests, with the resolution set to 1920x1200. The performance settings were all set to 'Very High', Render Quality was set to 'Ultra High' overall quality level, 8x anti-aliasing was applied, and HDR and Bloom were enabled. Of course DX10 was used exclusively for this series of tests.
Although the Dunia engine in Far Cry 2 is slightly less demanding than the CryEngine 2 engine in Crysis, the load it places on the hardware appears to be extremely close. In Crysis we didn't dare test AA above 4x, whereas we use 8x AA and 'Ultra High' settings in Far Cry 2. Here we see the opposite effect when switching our testing to DirectX 10: Far Cry 2 seems to have been optimized for it, or at least written with a clear understanding of DX10 requirements.
Using the short 'Ranch Small' time demo (which yields the lowest FPS of the three tests available), not all products are capable of producing playable frame rates with the settings all turned up. The Radeon HD5750 actually hangs close to its big brother, the HD5770 in this game. Although the Dunia engine seems to be optimized for NVIDIA chips, the improvements ATI incorporated in their latest GPUs are just enough to allow this game to be played with a mid-range card. Older ATI products struggle with this benchmark, and if you've got one of those, either the HD5750 or HD5770 would be an upgrade for playing this game.
Our next benchmark of the series puts our collection of video cards against some fresh graphics in the newly released Resident Evil 5 benchmark.
Resident Evil 5 Benchmark Results
PC gamers get the ultimate Resident Evil package in this new PC version with exclusive features including NVIDIA's new GeForce 3D Vision technology (wireless 3D Vision glasses sold separately), new costumes and a new mercenaries mode with more enemies on screen. Delivering an infinite level of detail, realism and control, Resident Evil 5 is certain to bring new fans to the series. Incredible changes to game play and the world of Resident Evil make it a must-have game for gamers across the globe.
Years after surviving the events in Raccoon City, Chris Redfield has been fighting the scourge of bio-organic weapons all over the world. Now a member of the Bio-terrorism Security Assessment Alliance (BSAA), Chris is sent to Africa to investigate a biological agent that is transforming the populace into aggressive and disturbing creatures. New cooperatively-focused game play revolutionizes the way that Resident Evil is played. Chris and Sheva must work together to survive new challenges and fight dangerous hordes of enemies.
From a gaming performance perspective, Resident Evil 5 delivers its "Next Generation of Fear": ground-breaking graphics that utilize an advanced version of Capcom's proprietary game engine, MT Framework, which powered the hit titles Devil May Cry 4, Lost Planet and Dead Rising. The game uses a wider variety of lighting to enhance the challenge, asking players to "Fear Light as much as Shadow": lighting effects provide a new level of suspense as players attempt to survive in both harsh sunlight and extreme darkness. As usual, we maxed out the graphics settings on the benchmark version of this popular game, to put the hardware through its paces. Much like Devil May Cry 4, it's relatively easy to get good frame rates in this game, so take the opportunity to turn up all the knobs and maximize the visual experience.
The Resident Evil 5 benchmark tool provides a graph of continuous frame rates and averages for each of four distinct scenes. In addition, it calculates an overall average for the four scenes. The overall average is what we report here, as the scenes were pretty evenly matched and no scene had results so far above or below the average as to present a unique situation.
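The reporting logic just described, an overall average plus a sanity check that no single scene sits far from it, might look like this minimal sketch (the scene figures and the 25% tolerance are hypothetical, not values from the tool itself):

```python
from statistics import mean

def overall_average(scene_fps: list[float], tolerance: float = 0.25) -> float:
    """Average FPS across scenes, flagging any scene that deviates
    from the overall average by more than the given fraction."""
    avg = mean(scene_fps)
    outliers = [f for f in scene_fps if abs(f - avg) / avg > tolerance]
    if outliers:
        raise ValueError(f"Scenes deviate too far from average: {outliers}")
    return avg

# Hypothetical per-scene averages for the four benchmark scenes
print(overall_average([58.0, 62.0, 60.0, 64.0]))  # 61.0
```

When a scene does deviate wildly, it deserves its own chart rather than being buried in an average, which is the judgment call made above.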
The 1680x1050 test results from this game scale almost as linearly as a synthetic benchmark. In the case of the video card we're interested in, the HD5750 sits on the exact same rung as the HD4850 and 6-7 FPS behind the HD5770. The GTX260-216 and GTX275 do very well in this game, beating both new ATI offerings easily.
Our next benchmark of the series features a strategy game with photorealistic modern-day wartime graphics: World in Conflict.
World in Conflict Benchmark Results
The latest version of Massive's proprietary Masstech engine utilizes DX10 technology and features advanced lighting and physics effects, and allows for a full 360-degree range of camera control. Massive's Masstech engine scales down to accommodate a wide range of PC specifications; if you've played a modern PC game within the last two years, you'll be able to play World in Conflict.
World in Conflict's FPS-like control scheme and 360-degree camera make its action-strategy game play accessible to strategy fans and fans of other genres... if you love strategy, you'll love World in Conflict. If you've never played strategy, World in Conflict is the strategy game to try.
Based on the test results charted below, it's clear that WiC doesn't place a limit on the maximum frame rate (to prevent a waste of power), which is good for full-spectrum benchmarks like ours, but bad for electricity bills. The average frame rate is shown for each resolution in the chart below. World in Conflict only begins to place demands on the graphics processor at the 1680x1050 resolution, so we'll skip the low-res testing.
The GT200 series GPUs from NVIDIA seem to have a distinct advantage with the World In Conflict benchmark. Once again, the older HD4850 improves on the performance of the HD5750, even in the higher resolution testing this time, despite having only 512MB of memory to play with. Something is clearly not optimized in this benchmark for the latest ATI version of pixel processing hardware.
Our last benchmark of the series brings DirectX 11 into the mix, a situation that only two of the cards under test are capable of handling.
BattleForge - Renegade Benchmark Results
In anticipation of the release of DirectX 11 with Windows 7, and coinciding with the release of AMD's ATI HD 5870, BattleForge has been updated to run using DirectX 11 on supported hardware. What does all of this actually mean, you may ask? It gives us a sip of water from the Holy Grail of game design and computing in general: greater efficiency! What does this mean for you? It means that the game will demonstrate a higher level of performance for the same processing power, which in turn allows more to be done with the game graphically. In layman's terms, the game will have a higher frame rate and new ways of creating graphical effects, such as shadows and lighting. The culmination of all of this is a game that both runs and looks better. The game is running on a completely new graphics engine that was built for BattleForge.
BattleForge is a next-gen real time strategy game, in which you fight epic battles against evil along with your friends. What makes BattleForge special is that you can assemble your army yourself: the units, buildings and spells in BattleForge are represented by collectible cards that you can trade with other players. BattleForge is developed by EA Phenomic. The studio was founded by Volker Wertich, father of the classic "The Settlers" and the SpellForce series. Phenomic has been an EA studio since August 2006.
BattleForge was released on Windows in March 2009. On May 26, 2009, BattleForge became a Play 4 Free branded game with only 32 of the 200 cards available. In order to get additional cards, players will now need to buy points on the BattleForge website. The retail version comes with all of the starter decks and 3,000 BattleForge points.
Never mind the DX10 v. DX11 question, the real news here is that this game was almost certainly developed exclusively on ATI hardware, and it shows. The good news is that at both widescreen resolutions, the HD5750 trumps the GTX260, and comes within spitting distance of the GTX275, an almost unthinkable result. The bad news is that the old HD4850 does even better.
The BattleForge benchmark itself is a tough one, once all the settings are maxed out. In case you are wondering, these results are with SSAO "On" and set to the Very High setting. I know the NVIDIA cards do a little better when SSAO is set to "Off", and I will eventually get around to posting a full set of results with this setting. Personally though, I think the writing is on the wall as far as DirectX 11 goes, and if there isn't going to be a level playing field for 3-4 months, it's not ATI's fault. I mean, who DIDN'T know, more than a year ago, that Windows 7 and DX11 were coming?
In our next section, we investigate the thermal performance of the Radeon HD5750, and see if that half-size 40nm GPU die runs as cool as we think it will.
XFX Radeon HD5750 Temperature
It's hard to know exactly when the first video card got overclocked, and by whom. What we do know is that it's hard to imagine a computer enthusiast or gamer today that doesn't overclock their hardware. Of course, not every video card has the head room. Some products run so hot that they can't suffer any higher temperatures than they generate straight from the factory. This is why we measure the operating temperature of the video card products we test.
To begin testing, I use GPU-Z to measure the temperature at idle, as reported by the GPU. Next, I use FurMark 1.7.0 to generate maximum thermal load and record GPU temperatures in high-power 3D mode. The ambient room temperature remained stable at 21°C throughout testing (it cooled off this week in DC...). The XFX Radeon HD5750 video card recorded 33°C in idle 2D mode, and increased to 62°C after 20 minutes of stability testing in full 3D mode, at 1920x1200 resolution and the maximum MSAA setting of 8X. The fan was left on its stock settings for this test.
62°C is an excellent result for temperature stress testing, especially with stock fan settings. The built-in fan controller generally runs the fan at 1200 RPM during 2D or idle use. On most benchmarks, the temperature never got above 60°C and the fan stayed there. Once temps got above 60°C, the controller ramped the fan up to about 1400 RPM. With fewer stream processors and a lower GPU clock rate than the HD5770, it seems like you'd be hard pressed to push the thermal boundaries of this card. Overclockers are licking their chops about now....
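The stock fan behavior observed above acts roughly like a simple threshold curve. As an illustration of the observed behavior only, not ATI's actual fan-controller firmware:

```python
def fan_rpm(gpu_temp_c: float) -> int:
    """Approximate the observed stock fan behavior on this card:
    ~1200 RPM at idle/2D, stepping up to ~1400 RPM once the GPU
    passes 60°C under load."""
    return 1400 if gpu_temp_c > 60.0 else 1200

print(fan_rpm(33.0))  # 1200 (idle 2D reading)
print(fan_rpm(62.0))  # 1400 (FurMark full-load reading)
```

A real controller likely uses a smoother ramp and some hysteresis around the threshold, but the two observed operating points bracket the behavior well enough for this sketch.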
FurMark is an OpenGL benchmark that heavily stresses and overheats the graphics card with fur rendering. The benchmark offers several options that allow the user to tweak the rendering: fullscreen/windowed mode, MSAA selection, window size, and duration. The benchmark also includes a GPU Burner mode (stability test). FurMark requires an OpenGL 2.0 compliant graphics card with a lot of GPU power! As an oZone3D.net partner, Benchmark Reviews offers a free download of FurMark to our visitors.
FurMark does two things extremely well: it drives the thermal output of any graphics processor higher than any other application or video game, and it does so consistently every time. While FurMark is not a true benchmark tool for comparing different video cards, it works well for comparing one product against itself with different drivers or clock speeds, or for testing the stability of a GPU, since it raises temperatures higher than any other program. But in the end, it's a rather limited tool.
In our next section, we discuss electrical power consumption and learn how well (or poorly) each video card will impact your utility bill...
VGA Power Consumption
Life is not as affordable as it used to be, and items such as gasoline, natural gas, and electricity all top the list of resources which have exploded in price over the past few years. Add to this the limit of non-renewable resources compared to current demands, and you can see that the prices are only going to get worse. Planet Earth needs our help, and needs it badly. With forests becoming barren of vegetation and snow-capped poles quickly turning brown, the technology industry has a new attitude towards suddenly becoming "green". I'll spare you the powerful marketing hype that I get from various manufacturers every day, and get right to the point: your computer hasn't been doing much to help save energy... at least up until now.
To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4460) power meter made by P3 International. A baseline test is taken without a video card installed inside our computer system, which is allowed to boot into Windows and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen. Our final loaded power consumption reading is taken with the video card running a stress test using FurMark. Below is a chart with the isolated video card power consumption (not system total) displayed in Watts for each specified test product:
VGA Product Description (sorted by combined total power) | Idle Power | Loaded Power |
|---|---|---|
NVIDIA GeForce GTX 480 SLI Set | 82 W | 655 W |
NVIDIA GeForce GTX 590 Reference Design | 53 W | 396 W |
ATI Radeon HD 4870 X2 Reference Design | 100 W | 320 W |
AMD Radeon HD 6990 Reference Design | 46 W | 350 W |
NVIDIA GeForce GTX 295 Reference Design | 74 W | 302 W |
ASUS GeForce GTX 480 Reference Design | 39 W | 315 W |
ATI Radeon HD 5970 Reference Design | 48 W | 299 W |
NVIDIA GeForce GTX 690 Reference Design | 25 W | 321 W |
ATI Radeon HD 4850 CrossFireX Set | 123 W | 210 W |
ATI Radeon HD 4890 Reference Design | 65 W | 268 W |
AMD Radeon HD 7970 Reference Design | 21 W | 311 W |
NVIDIA GeForce GTX 470 Reference Design | 42 W | 278 W |
NVIDIA GeForce GTX 580 Reference Design | 31 W | 246 W |
NVIDIA GeForce GTX 570 Reference Design | 31 W | 241 W |
ATI Radeon HD 5870 Reference Design | 25 W | 240 W |
ATI Radeon HD 6970 Reference Design | 24 W | 233 W |
NVIDIA GeForce GTX 465 Reference Design | 36 W | 219 W |
NVIDIA GeForce GTX 680 Reference Design | 14 W | 243 W |
Sapphire Radeon HD 4850 X2 11139-00-40R | 73 W | 180 W |
NVIDIA GeForce 9800 GX2 Reference Design | 85 W | 186 W |
NVIDIA GeForce GTX 780 Reference Design | 10 W | 275 W |
NVIDIA GeForce GTX 770 Reference Design | 9 W | 256 W |
NVIDIA GeForce GTX 280 Reference Design | 35 W | 225 W |
NVIDIA GeForce GTX 260 (216) Reference Design | 42 W | 203 W |
ATI Radeon HD 4870 Reference Design | 58 W | 166 W |
NVIDIA GeForce GTX 560 Ti Reference Design | 17 W | 199 W |
NVIDIA GeForce GTX 460 Reference Design | 18 W | 167 W |
AMD Radeon HD 6870 Reference Design | 20 W | 162 W |
NVIDIA GeForce GTX 670 Reference Design | 14 W | 167 W |
ATI Radeon HD 5850 Reference Design | 24 W | 157 W |
NVIDIA GeForce GTX 650 Ti BOOST Reference Design | 8 W | 164 W |
AMD Radeon HD 6850 Reference Design | 20 W | 139 W |
NVIDIA GeForce 8800 GT Reference Design | 31 W | 133 W |
ATI Radeon HD 4770 RV740 GDDR5 Reference Design | 37 W | 120 W |
ATI Radeon HD 5770 Reference Design | 16 W | 122 W |
NVIDIA GeForce GTS 450 Reference Design | 22 W | 115 W |
NVIDIA GeForce GTX 650 Ti Reference Design | 12 W | 112 W |
ATI Radeon HD 4670 Reference Design | 9 W | 70 W |
The ATI Radeon HD5750 pulled 18 (96-78) watts at idle and 94 (172-78) watts when running full out, using the test method outlined above. These numbers are reasonably close to the factory numbers of 16W at idle and 86W under load. This is one area where this card excels. If you keep your computer running most of the day and/or night, this card could easily save you 1 kWh per day in electricity.
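The baseline-subtraction arithmetic above, and the 1 kWh/day savings estimate, can be sketched in a few lines of Python. The function names are mine and purely illustrative; the readings (78 W baseline, 96 W idle, 172 W loaded) are the ones from this test:

```python
def isolated_power(system_watts, baseline_watts):
    """Isolated card draw: wall-meter reading minus the no-card baseline."""
    return system_watts - baseline_watts

def daily_kwh(watts, hours_per_day=24):
    """Energy consumed per day, in kilowatt-hours."""
    return watts * hours_per_day / 1000.0

idle_w = isolated_power(96, 78)    # 18 W at idle
load_w = isolated_power(172, 78)   # 94 W under FurMark load

# A card drawing ~42 W less than an older one, running around the clock,
# saves about 1 kWh per day (42 W x 24 h = 1.008 kWh).
savings_kwh = daily_kwh(42)
```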
Radeon HD5750 Final Thoughts
The alternative title for this review could have been: "What Price DirectX 10?" or "Who Killed Crysis?" I know the big news is DirectX 11, and how it is a major advancement in both image quality and coding efficiency, but for the time being, we're stuck in a DirectX 10 world, for the most part. DX11 games won't be thick on the ground for at least a year, and some of us are going to continue playing our old favorites. So, with the switch to Windows 7, what's the impact on gaming performance? So far, it's a bit too random for my tastes.
We seem to be back to a situation where the software differences between games have a bigger influence on performance than hardware and raw pixel processing power. As the adoption rate for Windows 7 ramps up, more and more gamers are going to be wondering if DirectX 10 is a blessing or a curse. Crysis gets cut off at the knees, but Far Cry 2 gets a second wind with DX10. World In Conflict holds back its best game play for NVIDIA customers, but BattleForge swings the other way, with DX10 and DX11.
I have a feeling this is why gamers resolutely stuck with Windows XP, and never warmed up to Vista. It wasn't the operating system per se, as much as it was DirectX 10. And I want to clarify: there's probably nothing inherently wrong with DX10; it's just that so few games were designed to use it effectively. The other problem is that, unlike other image-enhancing features, DirectX has no sliding scale. I can't select 2x or 4x or 8x to optimize the experience; it's either all in or all out.
The good news is that the adoption rate for Windows 7 will probably set records, if anyone is keeping score. Combine that with the real-world benefit to software coders that DirectX 11 brings, and there is a good probability that we won't be stuck in DX10 land for very long. New graphics hardware from both camps, a new operating system, a new graphics API, and maybe an economic recovery in the works? It's going to be an interesting holiday season, this year!
XFX Radeon HD5750 Conclusion
The performance of the HD5750 is pretty good, considering the modest-looking hardware resources that make it all possible. One way of showing this quantitatively is to look at the power required to deliver the performance. The HD5750 offers roughly the same performance as an HD4850 for half the power at full load, and only one third the power at idle. The difference could easily equal a savings of 1 kWh per day. That's a nice perk for new users, but for those who may already have a mid-range card that's 1-2 years old, getting the equivalent performance of an HD4850 in late 2009 may not be enough. It's OK to want more, even in a world barely recovering from a global recession. We'll talk value in a minute, but the performance is what it is; it's competitive, not a giant killer. The presence of 1 GB of GDDR5 memory really helps at higher resolutions, so the card won't hold you back if you pick up a new monitor.
Performance is more than just frames-per-second, though; the ability to run 2-3 monitors with Full ATI EyeFinity Support counts, too. Plus, we've been measuring performance with Beta drivers. If you've read some of my recent video card reviews, you've got a better understanding of why driver performance on launch day is not a good measure of the final product. So, while the raw performance numbers are good enough for the target price point today, I predict even better things to come for both price and performance.
The appearance of the product itself is a mixed bag. The card uses the reference cooler designed by ATI, and early pictures of the unadorned black fan shroud looked pretty goofy. Once XFX got their graphics artists to work up a product label, they improved the appearance by a large margin. The full cover and red hood scoops from the rest of the HD5xxx family were too expensive for this product, and the cheaper cooling system helped pay for the premium memory system; an excellent trade-off, I'd say. The reference design has plenty of cooling capacity for the tiny Juniper GPU, especially with the lower GPU clock and 80 disabled stream processors.
The build quality of the XFX Radeon HD5750 is much better than the engineering sample I received before the launch date. The retail version XFX is putting out had no quality issues I could detect. The parts were all high quality, and the PC board was manufactured and assembled with care and precision.
The features of the HD5750 are amazing, having been carried over in full measure from the HD5800 series: DirectX 11, Full ATI Eyefinity Support, ATI Stream Technology Support, DirectCompute 11 and OpenCL Support, HDMI 1.3 with Dolby True HD and DTS Master Audio. We've barely scratched the surface in this review of all the capability on offer, by focusing almost exclusively on gaming performance, but the card has other uses as well.
As of late October, Newegg is selling the XFX Radeon HD5750 at $139.99, which is $10 higher than several other vendors. XFX has always commanded a premium for their cards, because of their enthusiast-based support model. They offer a double lifetime warranty, which is quite useful for enthusiasts who buy and sell the latest hardware on a regular basis: the second owner gets a second full lifetime warranty. That's a very nice benefit when the guy who owned the card before you ran it 24/7, highly overclocked at full load, loading up on points in Folding@Home. I think this is less likely to be an issue with a card in this price range, but it does explain the price premium a bit. I feel somewhat disappointed that inflation seems to have eaten up the cost advantage I had hoped to see over the existing HD4850, but progress is measured more in the feature set of this card than in its raw graphics processing power.
The XFX Radeon HD5750 earns a Silver Tachometer Award, because it fills an important slot in the graphics card middle ground. With the launch of Windows 7 and its DirectX 11 interface, anyone wanting to take advantage of the advanced features becoming more prevalent in the next 4-6 months needs new hardware. If you're shopping in this price range, this is the only card to get; every other choice is going to cost more or do less. I was a little disappointed that XFX didn't include a coupon for DiRT 2, as some other ATI partners are doing; it's one of the DirectX 11 titles I'm most looking forward to.
Pros:
+ Unmatched feature set
+ Extremely low power consumption
+ 1GB of GDDR5 memory
+ HDMI and DisplayPort interfaces
+ Cool, quiet operation
+ Requires only one 6-pin power connector
+ Sleek, modern looks
+ Lowest cost DirectX 11 graphics card
Cons:
- Can't quite beat old-faithful; the HD4850
- Premium pricing at launch
Ratings:
- Performance: 8.50
- Appearance: 9.00
- Construction: 9.00
- Functionality: 9.50
- Value: 8.75
Final Score: 8.95 out of 10.
Quality Recognition: Benchmark Reviews Silver Tachometer Award.
Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.