ATI Radeon HD5450 HTPC Video Card
Reviews - Featured Reviews: Video Cards
Written by Bruce Normann
Thursday, 04 February 2010
Just when I thought they had finished cutting halves, ATI has taken the 40nm Cypress architecture to a new low. Low power, that is. In a brand new design, unlike anything they have released with this architecture, ATI is going after the Home Theater PC market with their heat sinks blazing. OK, I exaggerate; the Radeon HD5450 video card actually runs pretty cool, which is the point, really. It's silent, too, with a large and lovely red heatsink sitting atop the tiny GPU, sans fan. Follow along with Benchmark Reviews as we investigate an early sample of ATI's new standard bearer for low-power HTPC applications.
With the architecture it inherits from Cypress, the ATI HD5450 has all the modern features that the larger GPU brings to the table. However, sporting only 292 million transistors and just 80 Stream Processors, the new card idles along at 6.4 watts and never pulls more than 20 watts, no matter how hard you drive it. They've even managed to do this without the energy-saving benefits of GDDR5, as the card will be equipped with GDDR3 or GDDR2, depending on the model and the preference of the AIB partner.
The flagship ATI video cards made a huge splash in September, but according to Mercury Research, cards costing over $200 only make up 7% of the market, and the 57xx series landed in the $100-$200 range, which makes up 27% of the market. That leaves a huge opening in the sub-$100 market, and ATI is filling in the gaps with all-new, DirectX 11-capable cards in this segment. The specs of the HD5450 indicate a performance level that will struggle with gaming, even at moderate resolutions, but will have no problem supporting all the latest applications in the home theater environment.
| Product Series | Stream Processors | Core Clock (MHz) | Shader Clock (MHz) | Memory Clock (MHz) | Memory Amount | Memory Interface |
|---|---|---|---|---|---|---|
| Foxconn GeForce 8400GS (8400GS-256) | 16 | 450 | 900 | 800 | 256MB GDDR2 | 64-bit |
| ATI Radeon HD5450 (Mfr. Sample) | 80 | 650 | N/A | 800 | 512MB GDDR3 | 64-bit |
| EVGA GeForce 8600GT (256-P2-N751-TR) | 32 | 540 | 1180 | 700 | 256MB GDDR3 | 128-bit |
| ATI Radeon HD5670 (Mfr. Sample) | 400 | 775 | N/A | 1000 | 512MB GDDR5 | 128-bit |
| MSI Radeon HD4830 (R4830 T2D512) | 640 | 585 | N/A | 900 | 512MB GDDR3 | 256-bit |
| XFX Radeon HD5750 (HD-575X-ZN) | 720 | 700 | N/A | 1150 | 1024MB GDDR5 | 128-bit |
- Foxconn GeForce 8400GS (8400GS-256) - Forceware v195.62 WHQL
- ATI Radeon HD5450 (Mfr. Sample) - Catalyst 8.69 RC3
- EVGA GeForce 8600GT (256-P2-N751-TR) - Forceware v195.62 WHQL
- ATI Radeon HD5670 (Mfr. Sample) - Catalyst 8.69 RC3
- MSI Radeon HD4830 (R4830 T2D512) - Catalyst 8.69 RC3
- XFX Radeon HD5750 (HD-575X-ZN) - Catalyst 8.69 RC3
Now we're ready to begin testing video game performance with these video cards, so please continue to the next page, where we start off with our 3DMark Vantage results.
3DMark Vantage Benchmark Results
3DMark Vantage is a computer benchmark by Futuremark (formerly MadOnion) that measures the DirectX 10 performance of graphics cards in 3D games. A 3DMark score is an overall measure of your system's 3D gaming capabilities, based on comprehensive real-time 3D graphics and processor tests. By comparing your score with those submitted by millions of other gamers, you can see how your gaming rig performs, making it easier to choose the most effective upgrades or find other ways to optimize your system.
There are two graphics tests in 3DMark Vantage: Jane Nash (Graphics Test 1) and New Calico (Graphics Test 2). The Jane Nash test scene represents a large indoor game scene with complex character rigs, physical GPU simulations, multiple dynamic lights, and complex surface lighting models. It uses several hierarchical rendering steps, including for water reflection and refraction, and physics simulation collision map rendering. The New Calico test scene represents a vast space scene with lots of moving but rigid objects and special content like a huge planet and a dense asteroid belt.
At Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each run, 3DMark is a reliable tool for comparing graphics cards against one another.
1680x1050 is rapidly becoming the new 1280x1024. More and more widescreen monitors are being sold with new systems or as upgrades to existing ones. Even in tough economic times, the tide cannot be turned back; screen resolution and size will continue to creep up. These two relatively low resolutions are the most appropriate for a review of mainstream hardware, and we'll also be using the following reduced settings for 3DMark Vantage: No Anti-Aliasing, 2x Anisotropic Filtering, all quality levels at Entry, and Post Processing Scale set at 1:2.
Test one, all about the exploits of our fictional espionage heroine Jane Nash, has some wonderful graphics when quality levels are cranked up. The water modeling is exceptionally accurate and detailed. This is only a synthetic benchmark, so the results we get in terms of frames-per-second are not always typical of real-world gaming performance, but still, I was hoping for somewhat better performance from the reduced-specification hardware. Just for the record, the HD5450 slots in between the GeForce 8400GS and 8600GT for raw 3D graphics performance, but I have serious doubts whether any of these cards will be able to hack it when we start up the actual gaming applications.
Test two is a little more challenging for most video cards, due to the large number of irregularly shaped asteroids that need to be rendered in New Calico. Once again, the HD5450 just barely gets out of the gate, with average FPS numbers in the low single digits. Dropping down to 1280x1024 doesn't really help. We really need to look at actual gaming performance to verify these results, so let's take a look in the next section, at how these cards stack up in the traditional standard bearer for gaming benchmarks, Crysis.
Crysis Benchmark Results
Crysis uses a new graphics engine: the CryENGINE2, the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX 10) framework, but it can also run using DirectX 9 on Windows XP, Vista, and the new Windows 7. As we'll see, there are significant frame rate reductions when running Crysis in DX10. It's not an operating system issue: DX9 works fine in Windows 7, but DX10 cuts the frame rates in half.
Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multicore processor architectures, CPU intensive subsystems of CryENGINE 2 such as physics, networking and sound, have been re-written to support multi-threading.
Crysis offers an in-game benchmark tool, similar to World in Conflict. This short test places a high amount of stress on a graphics card, since there are so many landscape features rendered. For benchmarking purposes, Crysis can mean trouble, as it places a high demand on both GPU and CPU resources. Benchmark Reviews uses the Crysis Benchmark Tool by Mad Boris to test frame rates in batches, which allows the results of many tests to be averaged.
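To illustrate the batch-averaging idea, here is a minimal Python sketch of how repeated timedemo results can be combined into one figure; the run values and the helper name are illustrative placeholders, not output from the actual Mad Boris tool.

```python
# Minimal sketch: average FPS across repeated benchmark runs to smooth
# out run-to-run variance. The numbers below are illustrative only.

def average_fps(runs: list[float]) -> float:
    """Return the mean FPS across repeated runs of the same timedemo."""
    return sum(runs) / len(runs)

# Example: four passes of the same Crysis timedemo on one card.
runs = [11.8, 12.1, 11.9, 12.2]
print(f"Average over {len(runs)} runs: {average_fps(runs):.1f} FPS")
```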
Once again, we are going to concentrate on relatively low-resolution testing. At the 1280x1024 resolution used by 17" and 19" monitors, the CPU and memory may have some influence on the results, but our test rig in this case is well above the specs that a typical HD5450 user will have, so we've eliminated that variable. At 1280x1024, and at the widescreen resolution of 1680x1050, the performance differences between the video cards under test are mostly down to the cards themselves.
The results here are pretty consistent with 3DMark Vantage, in that the HD5450 is stuck in a range just below the teens. For most users, the performance of the HD5450 in this challenging scenario is going to be frustrating, ultimately unacceptable, and bordering on painful. Comparing the HD5450 directly against its higher-priced siblings shows that performance goes up almost exponentially: the HD5670 costs twice as much, but has almost ten times the performance in this real-world application. Anyone looking to play a little Crysis on their 1080P HTPC needs to look at either the HD5670, or perhaps the soon-to-be-released HD5570, as a minimum starting point.
Once a decent amount of anti-aliasing is factored in, the situation degrades even further. Frame rates are way below acceptable until you get close to the $100 mark with the HD5670. If you want to play this game in DX10 with eye candy turned on, you are going to have to pay, both with cash as well as increased heat and power consumption.
In our next section, Benchmark Reviews tests with Devil May Cry 4 Benchmark. Read on to see how a blended high-demand GPU test with low video frame buffer demand will impact our test products.
Devil May Cry 4 Benchmark
Devil May Cry 4 was released for the PC platform in 2008 as the fourth installment in the Devil May Cry video game series. DMC4 is a direct port from the PC platform to the console versions, which operate at the native 720P game resolution with no other platform restrictions. Devil May Cry 4 uses the refined MT Framework game engine, which has been used for many popular Capcom titles over the past several years.
MT Framework is an exclusive seventh-generation game engine built for games developed for the PlayStation 3, Xbox 360, and PC. MT stands for "Multi-Thread", "Meta Tools" and "Multi-Target". Capcom originally intended to use an outside engine, but none matched their specific requirements for performance and flexibility. Games using the MT Framework are originally developed on the PC and then ported to the two console platforms.
The PC version features a special bonus called Turbo Mode, which gives the game a slightly faster speed, and implements a new difficulty called Legendary Dark Knight Mode. The PC version also has both DirectX 9 and DirectX 10 modes for the Windows XP, Vista, and Windows 7 operating systems.
It's always nice to be able to compare the results we receive here at Benchmark Reviews with the results you get on your own computer system. Usually this isn't possible, since settings and configurations make it nearly impossible to match one system to the next; plus, you have to own the game or benchmark tool we used.
Devil May Cry 4 fixes this by offering a free benchmark tool for download. Because the DMC4 MT Framework game engine is rather low-demand for today's cutting-edge video cards, Benchmark Reviews used the 1680x1050 resolution to test with 8x AA (the highest AA setting available to Radeon HD video cards) and 16x AF.
Devil May Cry 4 is not as demanding a benchmark as it used to be. Only scenes #2 and #4 are worth looking at from the standpoint of trying to separate the fastest video cards from the slower ones. Still, it represents a typical environment for many games that our readers still play on a regular basis, so it's good to see what works with it and what doesn't. Any of the tested cards will do a credible job in this application, and the performance scales in a pretty linear fashion: you get what you pay for when running this game, at least in benchmarks. This is one time where you can generally use the maximum available anti-aliasing settings, so NVIDIA users should feel free to crank it up to 16X. The DX10 "penalty" is of no consequence here.
The HD5450 once again performs better than the 8400GS, but slightly worse than the 8600GT. As usual, we get better frame rates in this test, but the three low-end cards still can't make it up to the recommended 30FPS minimum. This is one case where you can achieve full performance with a mainstream graphics card, but the HD5450 is so tightly optimized for HTPC usage that it won't be successful in this more demanding role.
Our next benchmark in the series is a very popular FPS game that rivals Crysis for world-class graphics, and is more representative of games optimized for DirectX 10. Maybe we'll get some relief there...
Far Cry 2 Benchmark Results
Ubisoft developed Far Cry 2 as a sequel to the original, but with a very different approach to game play and story line. Far Cry 2 features a vast world built on Ubisoft's new game engine called Dunia, meaning "world", "earth" or "living" in Farsi. Far Cry 2 takes place in a fictional Central African landscape, set in a modern-day timeline.
The Dunia engine was built specifically for Far Cry 2 by the Ubisoft Montreal development team. It delivers realistic semi-destructible environments, special effects such as dynamic fire propagation and storms, real-time day-night cycles of sunlight and moonlight, a dynamic music system, and non-scripted enemy A.I. actions.
The Dunia game engine takes advantage of multi-core processors as well as multiple processors, and supports DirectX 9 as well as DirectX 10. Only 2 or 3 percent of the original CryEngine code is re-used, according to Michiel Verheijdt, Senior Product Manager for Ubisoft Netherlands. Additionally, the engine is less hardware-demanding than CryEngine 2, the engine used in Crysis. However, it should be noted that Crysis delivers greater character and object texture detail, as well as more destructible elements within the environment; for example, trees breaking into many smaller pieces and buildings breaking down to their component panels. Far Cry 2 also supports the amBX technology from Philips. With the proper hardware, this adds effects like vibrations, ambient colored lights, and fans that generate wind effects.
There is a benchmark tool in the PC version of Far Cry 2, which offers an excellent array of settings for performance testing. Benchmark Reviews used slightly lower settings for this test, with the resolution set to 1280x1024 and 1680x1050. The performance settings were all set to 'Medium', Render Quality was set to 'Optimum' (which was the same as "High" in this case), 4x anti-aliasing was applied, and HDR and Bloom were enabled. Of course DX10 was used exclusively for this series of tests.
Although the Dunia engine in Far Cry 2 is slightly less demanding than CryEngine 2 engine in Crysis, the strain appears to be similar. Far Cry 2 also seems to have been optimized for, or at least written with a clear understanding of, DX10 requirements.
Using the short 'Ranch Small' time demo (which yields the lowest FPS of the three tests available), only the more robust video cards are capable of producing playable frame rates with moderate settings applied. The Radeon HD5450 is stuck in the lower range again. Although the Dunia engine seems to be optimized for NVIDIA chips, I can't lay the blame there, as the 8400GS and 8600GT don't really do much better. Once again, the HD5670 represents a reasonable jumping off point for the lowest cost choice that will still perform reliably.
Our next benchmark of the series puts our collection of video cards against some very demanding graphics in the newly released Resident Evil 5 benchmark.
Resident Evil 5 Benchmark Results
PC gamers get the ultimate Resident Evil package in this new PC version with exclusive features including NVIDIA's new GeForce 3D Vision technology (wireless 3D Vision glasses sold separately), new costumes and a new mercenaries mode with more enemies on screen. Delivering an infinite level of detail, realism and control, Resident Evil 5 is certain to bring new fans to the series. Incredible changes to game play and the world of Resident Evil make it a must-have game for gamers across the globe.
Years after surviving the events in Raccoon City, Chris Redfield has been fighting the scourge of bio-organic weapons all over the world. Now a member of the Bio-terrorism Security Assessment Alliance (BSAA), Chris is sent to Africa to investigate a biological agent that is transforming the populace into aggressive and disturbing creatures. New cooperatively-focused game play revolutionizes the way that Resident Evil is played. Chris and Sheva must work together to survive new challenges and fight dangerous hordes of enemies.
From a gaming performance perspective, Resident Evil 5 delivers the "Next Generation of Fear": ground-breaking graphics that utilize an advanced version of Capcom's proprietary game engine, MT Framework, which powered the hit titles Devil May Cry 4, Lost Planet, and Dead Rising. The game uses a wider variety of lighting to enhance the challenge, and players must fear light as much as shadow: the lighting effects provide a new level of suspense as players attempt to survive in both harsh sunlight and extreme darkness. As usual, we maxed out the graphics settings on the benchmark version of this popular game to put the hardware through its paces. Much like Devil May Cry 4, it's relatively easy to get good frame rates in this game, so take the opportunity to turn up all the knobs and maximize the visual experience.
The Resident Evil 5 benchmark tool provides a graph of continuous frame rates and averages for each of four distinct scenes. In addition, it calculates an overall average for the four scenes. The averages for scenes #3 and #4 are what we report here, as they are the most challenging.
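As a small illustration of how those reported figures relate, here is a hedged Python sketch that derives the overall average from per-scene averages and picks out the most demanding scenes. The FPS values are made-up placeholders, not actual benchmark output.

```python
# Illustrative sketch: per-scene average FPS from the benchmark tool,
# the overall average across scenes, and the two hardest scenes.
# All numbers are placeholders, not measured results.

scene_fps = {1: 28.4, 2: 24.7, 3: 14.2, 4: 12.9}  # scene -> avg FPS

overall = sum(scene_fps.values()) / len(scene_fps)
hardest = sorted(scene_fps, key=scene_fps.get)[:2]  # lowest-FPS scenes

print(f"Overall average: {overall:.1f} FPS")
for scene in hardest:
    print(f"Scene #{scene}: {scene_fps[scene]:.1f} FPS (reported)")
```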
This new game is still not quite playable on the HD5450 hardware, even at lower screen resolutions. You still need to use one of the higher-powered cards to achieve acceptable frame rates. Even though the HD5450 caught up to the 8600GT in this case, the results are still way too low to be usable.
Our next sections look at thermal performance and power consumption, both key qualities for this new product.
ATI Radeon HD5450 Temperature
It's hard to know exactly when the first video card got overclocked, and by whom. What we do know is that it's hard to imagine a computer enthusiast or gamer today who doesn't overclock their hardware. Of course, not every video card has the headroom. Some products run so hot that they can't suffer any higher temperatures than they generate straight from the factory. This is why we measure the operating temperature of the video card products we test.
To begin testing, I use GPU-Z to measure the temperature at idle as reported by the GPU. Next I use FurMark 1.7.0 to generate maximum thermal load and record GPU temperatures at high-power 3D mode. The ambient room temperature remained stable at 23C throughout testing. The ATI Radeon HD5450 video card recorded 30C in idle 2D mode, and increased to 43C after 20 minutes of stability testing in full 3D mode, at 1920x1200 resolution and the maximum MSAA setting of 8X. Obviously, there were no fan settings for this test, but the case I tested in has a large side panel fan.
43C is an impressive result for temperature stress testing, especially for a card that relies on passive cooling. This is a key performance measure for a card like this, and it delivers the goods.
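Since readings taken in different rooms aren't directly comparable, one way to look at these numbers is as a rise over ambient. Here is a minimal Python sketch of that normalization, using the figures reported in this review; the helper name is just illustrative.

```python
# Minimal sketch: express GPU temperatures as a rise over the ambient
# room temperature, so results from different test environments can be
# compared. Figures are the ones reported in this review.

AMBIENT_C = 23.0  # stable room temperature during testing

def rise_over_ambient(gpu_temp_c: float, ambient_c: float = AMBIENT_C) -> float:
    """GPU temperature rise above the room, in degrees Celsius."""
    return gpu_temp_c - ambient_c

idle_c, load_c = 30.0, 43.0  # GPU-Z idle reading, FurMark load reading
print(f"Idle rise: {rise_over_ambient(idle_c):.0f} C over ambient")
print(f"Load rise: {rise_over_ambient(load_c):.0f} C over ambient")
```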
FurMark is an OpenGL benchmark that heavily stresses and overheats the graphics card with fur rendering. The benchmark offers several options allowing the user to tweak the rendering: fullscreen/windowed mode, MSAA selection, window size, and duration. The benchmark also includes a GPU Burner mode (stability test). FurMark requires an OpenGL 2.0 compliant graphics card with a lot of GPU power! As an oZone3D.net partner, Benchmark Reviews offers a free download of FurMark to our visitors.
FurMark does two things extremely well: it drives the thermal output of any graphics processor higher than any other application or video game, and it does so with consistency, every time. While FurMark is not a true benchmark tool for comparing different video cards, it works well for comparing one product against itself using different drivers or clock speeds, or for testing the stability of a GPU. But in the end, it's a rather limited tool.
In our next section, we discuss electrical power consumption and learn how well (or poorly) each video card will impact your utility bill...
VGA Power Consumption
Life is not as affordable as it used to be, and items such as gasoline, natural gas, and electricity all top the list of resources which have exploded in price over the past few years. Add to this the limits of non-renewable resources compared to current demands, and you can see that the prices are only going to get worse. Planet Earth needs our help, and needs it badly. With forests becoming barren of vegetation and snow-capped poles quickly turning brown, the technology industry has a new attitude towards suddenly becoming "green". I'll spare you the powerful marketing hype that I get from various manufacturers every day, and get right to the point: your computer hasn't been doing much to help save energy... at least up until now.
To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4460) power meter made by P3 International. A baseline test is taken without a video card installed inside our computer system, which is allowed to boot into Windows and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen. Our final loaded power consumption reading is taken with the video card running a stress test using FurMark. Below is a chart with the isolated video card power consumption (not system total) displayed in Watts for each specified test product:
| VGA Product Description (sorted by combined total power) | Idle Power | Loaded Power |
|---|---|---|
| NVIDIA GeForce GTX 480 SLI Set | 82 W | 655 W |
| NVIDIA GeForce GTX 590 Reference Design | 53 W | 396 W |
| ATI Radeon HD 4870 X2 Reference Design | 100 W | 320 W |
| AMD Radeon HD 6990 Reference Design | 46 W | 350 W |
| NVIDIA GeForce GTX 295 Reference Design | 74 W | 302 W |
| ASUS GeForce GTX 480 Reference Design | 39 W | 315 W |
| ATI Radeon HD 5970 Reference Design | 48 W | 299 W |
| NVIDIA GeForce GTX 690 Reference Design | 25 W | 321 W |
| ATI Radeon HD 4850 CrossFireX Set | 123 W | 210 W |
| ATI Radeon HD 4890 Reference Design | 65 W | 268 W |
| AMD Radeon HD 7970 Reference Design | 21 W | 311 W |
| NVIDIA GeForce GTX 470 Reference Design | 42 W | 278 W |
| NVIDIA GeForce GTX 580 Reference Design | 31 W | 246 W |
| NVIDIA GeForce GTX 570 Reference Design | 31 W | 241 W |
| ATI Radeon HD 5870 Reference Design | 25 W | 240 W |
| ATI Radeon HD 6970 Reference Design | 24 W | 233 W |
| NVIDIA GeForce GTX 465 Reference Design | 36 W | 219 W |
| NVIDIA GeForce GTX 680 Reference Design | 14 W | 243 W |
| Sapphire Radeon HD 4850 X2 11139-00-40R | 73 W | 180 W |
| NVIDIA GeForce 9800 GX2 Reference Design | 85 W | 186 W |
| NVIDIA GeForce GTX 780 Reference Design | 10 W | 275 W |
| NVIDIA GeForce GTX 770 Reference Design | 9 W | 256 W |
| NVIDIA GeForce GTX 280 Reference Design | 35 W | 225 W |
| NVIDIA GeForce GTX 260 (216) Reference Design | 42 W | 203 W |
| ATI Radeon HD 4870 Reference Design | 58 W | 166 W |
| NVIDIA GeForce GTX 560 Ti Reference Design | 17 W | 199 W |
| NVIDIA GeForce GTX 460 Reference Design | 18 W | 167 W |
| AMD Radeon HD 6870 Reference Design | 20 W | 162 W |
| NVIDIA GeForce GTX 670 Reference Design | 14 W | 167 W |
| ATI Radeon HD 5850 Reference Design | 24 W | 157 W |
| NVIDIA GeForce GTX 650 Ti BOOST Reference Design | 8 W | 164 W |
| AMD Radeon HD 6850 Reference Design | 20 W | 139 W |
| NVIDIA GeForce 8800 GT Reference Design | 31 W | 133 W |
| ATI Radeon HD 4770 RV740 GDDR5 Reference Design | 37 W | 120 W |
| ATI Radeon HD 5770 Reference Design | 16 W | 122 W |
| NVIDIA GeForce GTS 450 Reference Design | 22 W | 115 W |
| NVIDIA GeForce GTX 650 Ti Reference Design | 12 W | 112 W |
| ATI Radeon HD 4670 Reference Design | 9 W | 70 W |
The ATI Radeon HD5450 pulled 7 (137-130) watts at idle and 36 (166-130) watts when running full out, using the test method outlined above. The idle number is very close to the factory figure of 6.4W, though the loaded number overshoots the rated 19.1W maximum. With the video card's power demands dropping so low, this type of test starts to show its limitations. Nevertheless, the numbers are still in the ballpark.
I also tested power consumption while streaming 1080P video from YouTube. I waited until the clips were fully downloaded, and ran them full screen on my 1920x1200 monitor. I configured the graphics settings according to our helpful guide here on Benchmark Reviews, so that the bulk of the work was handled by the GPU, and also tweaked the visual settings to get maximum image quality. In the application it was designed for, namely video streaming and HTPC use, the Radeon HD5450 consumed noticeably less power than it did during the FurMark stress test. Maximum power draw during 1080P video playback was 22 (152-130) watts, only about two-thirds of the power required to run FurMark.
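Here is a minimal Python sketch of the baseline-subtraction method used above; the wall-meter readings are the ones quoted in this review, and the helper name is an illustrative placeholder.

```python
# Sketch of the isolated-power calculation: subtract the baseline
# reading (system with no discrete card) from the wall reading with
# the card installed. Wattages are the ones measured in this review.

BASELINE_W = 130  # Kill-A-Watt reading: idle system, no video card

def card_power_w(system_reading_w: float, baseline_w: float = BASELINE_W) -> float:
    """Isolated video card power draw, in watts."""
    return system_reading_w - baseline_w

print(f"Idle:           {card_power_w(137):.0f} W")  # 137 - 130 = 7 W
print(f"FurMark load:   {card_power_w(166):.0f} W")  # 166 - 130 = 36 W
print(f"1080P playback: {card_power_w(152):.0f} W")  # 152 - 130 = 22 W
```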
Radeon HD5450 Final Thoughts
Most everyone who reads this site is familiar with the concept of diminishing returns. As you get closer to the highest level of performance (let's call that 100%), it costs considerably more to get the last 10% of performance than it does to go from 80% to 90%. When you look at two gaming-class video cards using the same technology, the increase in frames-per-second doesn't match the increase in price. The HD5850 and the HD5870 are a good example: does the 33% increase in price give you a 33% increase in performance? You wish... which is why lots more people are buying the HD5850.
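A worked sketch of that comparison, in Python: compute the percentage price increase against the percentage FPS gain between two cards. The prices and frame rates below are illustrative assumptions, not figures from our testing.

```python
# Illustrative diminishing-returns check: percentage price increase
# versus percentage FPS gain between two cards of the same family.
# Prices and FPS values are assumed placeholders, not test data.

def pct_increase(low: float, high: float) -> float:
    """Percentage increase from low to high."""
    return (high - low) / low * 100

price_5850, price_5870 = 300.0, 400.0   # assumed street prices, USD
fps_5850, fps_5870 = 60.0, 70.0         # assumed FPS in one title

print(f"Price increase: {pct_increase(price_5850, price_5870):.0f}%")
print(f"FPS increase:   {pct_increase(fps_5850, fps_5870):.0f}%")
# A ~33% price premium buying only ~17% more FPS is the pattern
# described above at the top of the market.
```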
Just as the law of diminishing returns works on the high end of the market, there is a corresponding force at work on the lower end of the scale. As you move closer to the lowest possible cost, you start to bump up against fixed costs that won't budge. Marketing, sales, design, testing, certification processes, transportation, packaging, and connector costs are all stubbornly rigid. Right now, the cheapest cards at Newegg based on NVIDIA and ATI chips are the 8400GS and HD4350, priced at $30 and $36, respectively. I dare say we're not going to see any new cards introduced that are cheaper than these; it's just not fiscally possible, if we assume that the vendor is going to make a profit.
My point is, the vendor can cut every possible feature, performance-enhancing hardware, included software, industrial design, packaging cost, etc., and end up with a product that barely functions, and it would still probably cost $25 on the retailer's shelf.
In my review of the HD5670, I wondered out loud: "How many more times can ATI slice the pie and still come up with a fully functional video card? Could there be one more cut, for an ultra-low power solution? But I think this is probably it, for a card that can honestly support gaming applications as well as general usage and HD video." As it turns out, the Radeon HD5450 is that fully functional low-power card, and I still think the Redwood class of ATI GPUs is the lowest you can go and still support modern games. The game changes when you look at HD video, however. This card eats it up for breakfast, and still has some headroom left over for whatever HW acceleration scheme comes along next.
As I sit here on the edge of my chair, waiting for dribs and drabs of information about the latest monster-sized GPU chip from NVIDIA (...hey, they named them this week. Wow), with a die size approaching the dimensions of the original Post-It note, I did wonder what the attraction was of a discrete graphics card with a GPU that's less than half the size of a US dime. The answer is that even the best Integrated Graphics Processor (IGP) is still less than half as powerful as the Radeon HD5450, and IGPs generally max out with 128MB of SidePort GDDR3 memory. Many of them struggle to render full HD 1080P video smoothly, and the CPUs they are bundled with usually can't help the effort much.
So, grab that old microATX board out of the closet, dust it off, add the Radeon HD5450, drop it into a shiny new, slim line HTPC box and you're off to the movies in style.
ATI Radeon HD5450 Conclusion
Looking at the performance of the ATI Radeon HD5450, you have to give up any idea that this is going to be a solution for a gaming rig. In modern FPS games, it performed well below any reasonable person's expectation for visual quality. Even at the reduced resolutions and quality levels that we introduced in our review of the HD5670 and GT240, the HD5450 just barely got into double digits for frames-per-second. This card is not really practical as a multi-purpose solution; we'll have to wait a bit for the HD55xx to see if it's possible to successfully bridge the two requirements of gaming and video playback. The strength of the HD5450 lies in Home Theater PC usage only, where it performs superbly. ATI currently leads the game in image quality for HD video, and this small, low-power, silent, and cool board supports all the latest software enhancements that make those improved visuals possible.
The appearance of the passively cooled HD5450 is visually stunning. There are some really ugly passive cards out there, along with a few decent looking ones, but nothing comes close to the design statement that this one makes. AIB partners will have pretty much total flexibility to implement their own cooling systems, and I don't expect any one of them to top this. Batmobile indeed; this one is fine art, of the industrial design variety.
The build quality of the Radeon HD5450 was quite good, for an engineering sample. The parts were all high quality, the soldering and component placement were to a high standard, and the heat sink was manufactured and assembled perfectly.
The features of the HD5450 have been carried over in full measure from the very first HD58xx series: DirectX 11, full ATI Eyefinity support, ATI Stream technology support, DirectCompute 11 and OpenCL support, and HDMI 1.3a with Dolby TrueHD and DTS-HD Master Audio. Nothing was left out on this card, despite it being produced for a clearly different role than the original barn-burner gaming cards. Even though this card will not thrive in a multi-functional role, it still provides a solid HTPC experience and is a considerable upgrade for the many systems still relying on IGP.
As of March 2010 there are several models of the Radeon HD 5450 available at different prices, depending on DRAM configuration and cooling solution. PowerColor offers the AX5450 for $40, while the Sapphire 100291L lists for $43 and the XFX HD5450 sells for $50. This is a small price premium over the lowest-priced cards available from our favorite e-tailer, but launch pricing is always a bit high, for obvious reasons. We saw in our gaming tests that it takes an extra $50-70 to get decent results with challenging titles, but the extra performance also brings higher power requirements, more noise, and more heat.
The ATI Radeon HD5450 earns a Silver Tachometer Award because there are some buyers who absolutely demand a passively cooled, completely silent video card, and they also need that card to support the latest technology and features for HD video playback. Until now, those two requirements were mutually exclusive; now there is a product, the one and only product, that completely meets their needs. The fact that it's impossible to build a dual-use card with 40nm technology that does all that and can also play FPS games convincingly is a shame. Fortunately, we'll only have to wait a year or so to see what 28nm GPUs can do.
Pros:
+ Modern feature set
+ Extremely low power consumption
+ Aggressive power modulation of GPU and RAM
+ Best video quality currently available
+ HDMI, VGA and DVI interfaces on single slot
+ Cool, silent operation
+ Truly awesome looks
+ Very low heat generation
Cons:
- High-end gaming titles are almost impossible to play
- AIB partners will probably mess with the good looks
Ratings:
- Performance: 8.50
- Appearance: 9.75
- Construction: 9.25
- Functionality: 8.75
- Value: 8.50
Final Score: 8.95 out of 10.
Quality Recognition: Benchmark Reviews Silver Tachometer Award
Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.