ATI Radeon HD5450 HTPC Video Card
Reviews - Featured Reviews: Video Cards
Written by Bruce Normann   
Thursday, 04 February 2010

ATI Radeon HD5450 Video Card Review

Just when I thought they had finished cutting things in half, ATI has taken the 40nm Cypress architecture to a new low. Low power, that is. In a brand new design, unlike anything they have released with this architecture, ATI is going after the Home Theater PC market with their heat sinks blazing. OK, I exaggerate; the Radeon HD5450 video card actually runs pretty cool, which is the point, really. It's silent, too, with a large and lovely red heatsink sitting atop the tiny GPU, sans fan. Follow along with Benchmark Reviews as we investigate an early sample of ATI's new standard bearer for low-power HTPC applications.

ATI_RADEON_HD5450_VIDEO_CARD_IO_Bottom_34_02.jpg

With the architecture it inherits from Cypress, the ATI HD5450 has all the modern features that the larger GPU brings to the table. However, sporting only 292 million transistors, including just 80 Stream Processors, the new card idles along at 6.4 watts and never pulls more than 20 watts, no matter how hard you drive it. They've even managed to do this without the energy-saving benefits of GDDR5 this time, as the card will be equipped with GDDR3 or GDDR2, depending on the model and preference of the AIB partner.

ATI_RADEON_HD5450_VIDEO_CARD_Front_Top_01.jpg

The flagship ATI video cards made a huge splash in September, but according to Mercury Research, cards costing over $200 only make up 7% of the market, and the 57xx series landed in the $100-$200 range, which makes up 27% of the market. That leaves a huge opening in the sub-$100 market, and ATI is filling in the gaps with all new, DirectX 11 capable cards in this segment. The specs of the HD5450 indicate a performance level that will struggle with gaming, even at moderate resolution, but will have no problem supporting all the latest applications in the home theater environment.

About the company: ATI

Over the course of AMD's four decades in business, silicon and software have become the steel and plastic of the worldwide digital economy. Technology companies have become global pacesetters, making technical advances at a prodigious rate - always driving the industry to deliver more and more, faster and faster.

However, "technology for technology's sake" is not the way we do business at AMD. Our history is marked by a commitment to innovation that's truly useful for customers - putting the real needs of people ahead of technical one-upmanship. AMD founder Jerry Sanders has always maintained that "customers should come first, at every stage of a company's activities." We believe our company history bears that out.

Radeon HD5450 Features

The feature set of the ATI HD5450 video card is nearly identical to the entire HD5xxx series. The important differences are all related to the fact that the HD5450 chip is half the size of the HD5670, with half the transistors and a small fraction of the processing power. For those who first saw the monster version of this graphic at the launch of the 58xx series, it's amazing how little is left in the center of the diagram: only 2 SIMD Engines with a total of 80 Stream Processors. Rest assured: the architecture diagram may have gotten smaller, but the feature list hasn't shrunk at all.

ATI_RADEON_HD5450_VIDEO_CARD_Architecture.jpg

ATI Radeon HD 5450 GPU Feature Summary

  • 292 million 40nm transistors
  • TeraScale 2 Unified Processing Architecture
    • 80 Stream Processing Units
    • 8 Texture Units
    • 16 Z/Stencil ROP Units
    • 4 Color ROP Units
  • GDDR2/3 memory interface
  • PCI Express 2.1 x16 bus interface
  • DirectX 11 support
    • Shader Model 5.0
    • DirectCompute11
    • Programmable hardware tessellation unit
    • Accelerated multi-threading
    • HDR texture compression
    • Order-independent transparency
  • OpenGL 3.2 support [1]
  • Image quality enhancement technology
    • Up to 24x multi-sample and super-sample anti-aliasing modes
    • Adaptive anti-aliasing
    • 16x angle independent anisotropic texture filtering
    • 128-bit floating point HDR rendering
  • ATI Eyefinity multi-display technology [2][3]
    • Three independent display controllers - Drive three displays simultaneously with independent resolutions, refresh rates, color controls, and video overlays
    • Display grouping - Combine multiple displays to behave like a single large display
  • ATI Stream acceleration technology
    • OpenCL 1.0 compliant
    • DirectCompute11
    • Accelerated video encoding, transcoding, and upscaling [4][5]
    • Native support for common video encoding instructions
  • ATI CrossFireX multi-GPU technology [6]
    • Dual GPU scaling
  • ATI Avivo HD Video & Display technology [7]
    • UVD 2 dedicated video playback accelerator
    • Advanced post-processing and scaling [8]
    • Dynamic contrast enhancement and color correction
    • Brighter whites processing (blue stretch)
    • Independent video gamma control
    • Dynamic video range control
    • Support for H.264, VC-1, and MPEG-2
    • Dual-stream 1080p playback support [9][10]
    • DXVA 1.0 & 2.0 support
    • Integrated dual-link DVI output with HDCP [11]
      • Max resolution: 2560x1600 [12]
    • Integrated DisplayPort output
      • Max resolution: 2560x1600 [12]
    • Integrated HDMI 1.3 output with Deep Color, xvYCC wide gamut support, and high bit-rate audio
      • Max resolution: 1920x1200 [12]
    • Integrated VGA output
      • Max resolution: 2048x1536 [12]
    • 3D stereoscopic display/glasses support [13]
    • Integrated HD audio controller
      • Output protected high bit rate 7.1 channel surround sound over HDMI with no additional cables required
      • Supports AC-3, AAC, Dolby TrueHD and DTS Master Audio formats
  • ATI PowerPlay™ power management technology [7]
    • Dynamic power management with low power idle state
    • Ultra-low power state support for multi-GPU configurations
  • Certified drivers for Windows 7, Windows Vista, and Windows XP
  1. Driver support scheduled for release in 2010
  2. Driver version 8.66 (Catalyst 9.10) or above is required to support ATI Eyefinity technology; to enable a third display, one panel with a DisplayPort connector is required
  3. ATI Eyefinity technology works with games that support non-standard aspect ratios, which is required for panning across three displays
  4. Requires application support for ATI Stream technology
  5. Digital rights management restrictions may apply
  6. ATI CrossFireX technology requires an ATI CrossFireX Ready motherboard, an ATI CrossFireX Bridge Interconnect for each additional graphics card, and may require a specialized power supply
  7. ATI PowerPlay, ATI Avivo and ATI Stream are technology platforms that include a broad set of capabilities offered by certain ATI Radeon HD GPUs. Not all products have all features, and full enablement of some capabilities may require complementary products
  8. Upscaling subject to available monitor resolution
  9. Blu-ray or HD DVD drive and HD monitor required
  10. Requires Blu-ray movie disc supporting dual 1080p streams
  11. Playing HDCP content requires additional HDCP ready components, including but not limited to an HDCP ready monitor, Blu-ray or HD DVD disc drive, multimedia application and computer operating system.
  12. Some custom resolutions require user configuration
  13. Requires 3D Stereo drivers, glasses, and display

Now that we've confirmed that the 5450 is not missing any important features, let's look at the hardware specifications in detail.

Radeon HD5450 Video Card Specifications

I mentioned in the introduction that the HD5450 had hardware specs well below anything we've seen so far in the HD5xxx series. You can see the HD5450 specs in detail a little further below, and they're nothing to brag about compared to the HD5670, until you get down to the power numbers. Then it starts to make sense, as ATI has done everything they possibly could to get those numbers down to where they are. In a typical HTPC box, everything is jammed in tighter than sardines in a can, and the low profile of the sleek, shiny chassis only allows for small fans on the back. So, power is king. If the device consumes power, it had better do something useful with it, and not just dissipate it as waste heat. While the specs give us a good clue to the performance of the HD5450, ultimately it's the real-world performance we care about, the design tradeoffs that were required to achieve it, and the price for that performance. This graphic puts the general pricing vs. performance strategy in perspective:

ATI_RADEON_HD5450_VIDEO_CARD_Family-Timeline.jpg

But before we get to our detailed teardown and testing, let's look at the actual HD5450 specs:

Radeon HD5450 Specifications

  • Engine clock speed: 650 MHz
  • Processing power (single precision): 104 GigaFLOPS
  • Texel fill rate (bilinear filtered): 5.2 Gigatexels/sec
  • Pixel fill rate: 2.6 Gigapixels/sec
  • Anti-aliased pixel fill rate: 10.4 Gigasamples/sec
  • Memory clock speed: 800 MHz
  • Memory data rate: 1.6 Gbps
  • Memory bandwidth: 12.8 GB/sec
  • Maximum board power: 19.1 Watts
  • Idle board power: 6.4 Watts
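
For anyone who wants to check the math, the headline numbers above all follow from the unit counts in the feature summary and the 650 MHz engine clock. The short Python sketch below shows the arithmetic; the only assumption beyond the published figures is that each stream processor is counted as one multiply-add (2 FLOPs) per clock, which is the usual rating convention for these parts.

    # Back-of-envelope check of the HD5450 spec sheet, using the unit counts
    # from the feature summary above. Assumes each stream processor performs
    # one multiply-add (2 FLOPs) per clock.
    engine_clock_hz = 650e6
    mem_clock_hz = 800e6          # GDDR3 runs at double data rate
    stream_procs, tex_units, color_rops, z_rops = 80, 8, 4, 16
    bus_width_bits = 64

    gflops = stream_procs * 2 * engine_clock_hz / 1e9         # 104 GigaFLOPS
    texel_rate = tex_units * engine_clock_hz / 1e9            # 5.2 Gigatexels/sec
    pixel_rate = color_rops * engine_clock_hz / 1e9           # 2.6 Gigapixels/sec
    aa_rate = z_rops * engine_clock_hz / 1e9                  # 10.4 Gigasamples/sec
    bandwidth = mem_clock_hz * 2 * bus_width_bits / 8 / 1e9   # 12.8 GB/sec

    print(gflops, texel_rate, pixel_rate, aa_rate, bandwidth)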

OK, tired of looking at numbers? Let's take a closer look at the hardware, then. ATI was kind enough to send us a sample of the 512MB reference design video card with passive cooling, so let's examine the ins and outs.

Closer Look: Radeon HD5450

The HD5450 is a radically different design from the previous HD5xxx video cards. From several perspectives, the intended application for this diminutive card has driven a different approach. Physically, everything is smaller, to accommodate usage in low-profile HTPC cases. The sample card I received has a full height I/O bracket, but the DE15HD VGA connector is wired with ribbon cable and is removable, or it can be relocated to an adjacent slot. This way, the card can be easily reconfigured to a true half-height form factor. The card itself is only about an inch longer than the PCI-e interface that it connects to.

ATI_RADEON_HD5450_VIDEO_CARD_Top_Rear_34_01.jpg

Power consumption is at an all-time low, primarily to avoid generating heat. Again, this is an optimum solution for HTPC, where lower power means less ventilation and, ultimately, less overall noise. The video card we're looking at here is passively cooled, with a lovely red anodized heat sink that is reminiscent of the ASUS ROG heat sinks used on their gaming-oriented motherboards. ATI's AIB partners will have considerable flexibility for their cooling solutions, allowing them to optimize their offerings along several product pathways. I expect some fan-cooled designs to be recycled in order to reduce costs, where possible. But for now, we can gaze upon one of the most attractive passive cooling solutions I've seen on a video card.

ATI_RADEON_HD5450_VIDEO_CARD_IO_Bottom_34_01.jpg

The connections on the I/O plate at the rear of the card are arranged in a common configuration for this class of video card. From left to right: one VGA, one HDMI and one DVI connector - one for everyone. This card did not have a DisplayPort connection, which is required for a three monitor Eyefinity setup. There will be some flexibility in the I/O port arrangement for ATI partners, so pay attention to the product specs when you buy, as it can be hard to tell the HDMI and DisplayPort connections apart with a casual glance.

ATI_RADEON_HD5450_VIDEO_CARD_IO_Plate_01.jpg

One thing that becomes obvious when looking at the end-on image above is that this particular passive cooling hardware consumes more than one expansion slot. That may be an issue for some, but remember, there will be a variety of options available from the AIB partners.

The back of the Radeon HD5450 is not quite as densely packed as some of the uber-cards we test here at Benchmark Reviews, but it's about half-filled with miniature surface-mount-technology components. The main feature to be seen here is the metal cross-brace for the GPU heatsink screws, which are spring loaded, and connect to threaded standoffs on the heat sink assembly on the front side of the card. Also, note that back side DRAM is used, even for the 512MB version of the card.

ATI_RADEON_HD5450_VIDEO_CARD_Back_Side_01.jpg

For most high-end video cards, the cooling system is an integral part of the performance envelope for the card. "Make it run cooler, and you can make it run faster" was always the byword for achieving gaming-class performance from the latest and greatest GPU. The HD5450 takes a completely different path, more appropriate for the HTPC application it will most likely be used for. By greatly reducing the number of Stream Processors and ROP units, they have built a GPU that consumes so little power, and generates so little waste heat, that cooling is not the limiting factor in performance.

ATI_RADEON_HD5450_VIDEO_CARD_Specs.jpg

That's all there really is to see on the outside, so let's peel back the covers and have a good look around on the inside.

Radeon HD5450 Detailed Features

The big news about the HD5450 is the reduced size of the GPU die. Once again, the newest member of the HD5xxx family has roughly half the number of transistors of the previous one. The chip in the HD5450 is codenamed "Cedar", and has approximately 292 million transistors on it, compared to 627 million on the Redwood chip, which was released last month with the HD5670. The small size is critical to the cost strategy that ATI is pursuing with all these new releases. After some very lean years struggling to make it in the graphics chip industry, it appears ATI has finally figured out how to make money.

The Cedar die packaging on the HD5450 GPU is a little bigger than half of the Redwood, because the number of interconnects is roughly the same, with the notable exception of the memory interface, which is 64 bits wide this time. The number of Stream Processors has been radically reduced, from 400 down to 80; only 20% of the number present on the Redwood chips. It's evident there are a lot of transistors consumed in the other functions of the GPU, besides shaders.

ATI_RADEON_HD5450_VIDEO_CARD_Cedar_Die_Dime_02.jpg

The memory specification is going to be somewhat flexible for the HD54xx products. I expect most units will be sold with 512MB of GDDR3, but GDDR2 is a possibility, perhaps for OEM variants. 1 GB versions will also be available, although it may be tough to make the case for any performance advantage to be had with that configuration, with the exception of Eyefinity usage. As mentioned above, the GPU-memory interface is 64 bits wide with a maximum bandwidth of 12.8 GB/s, using GDDR3. That's a major hit, compared to the HD5670, with 64GB/s of bandwidth, but I suspect it's well balanced by the greatly reduced number of Stream Processors.

ATI_RADEON_HD5450_VIDEO_CARD_SAMSUNG_Memory_Chip.jpg

The K4W1G1646E-HC11 GDDR3 memory chips are sourced from Samsung. They are rated for a maximum clock rate of 900 MHz, and the marketing specs for the card indicate a maximum clock rate of 800 MHz. Version 0.3.8 of GPU-Z reported that the memory on my sample unit was running at 900 MHz. That's engineering samples for you... I was suspicious that the clock rate was not being reported correctly in GPU-Z, since it has not been updated to support this new chip yet, but ATI confirmed that they had been building some "extra curricular" prototypes for testing, and some of them got into reviewers' hands. Oh well; their loss and my gain, I guess. Too bad they didn't send me some 1 GHz parts, as they are available from Samsung, according to the product specs shown below.

ATI_RADEON_HD5450_VIDEO_CARD_samsung_memory_specs.png

The power section of the HD5450 video card is simple, and optimized for both cost and low power. In this case, all of the dynamic performance scaling is built into the GPU, and the voltage regulators just ride along. I was observing the shader and memory clocks in GPU-Z while using the PC for normal office-type duties, and this card ramped the clocks up and down faster and more dramatically than any card I've used recently. That's where the power savings are going to be made with this card: getting it down very quickly to idle power, which is a miserly 6.4 watts.

Looking at the business end of the passive cooling, I found something that made my mechanical engineering heart shiver. That little square platform, located between the four threaded standoffs, is created by cutting away all the aluminum around it with a milling machine. Then the platform itself is milled to create the flat, smooth surface we all know is essential for good heat transfer from the mirror surface of the GPU. The word you should be thinking is, "Expensive". I'm pretty sure that the way the GPU chip is mounted on the board makes this expensive manufacturing step essential, and I'm also sure that somewhere there are a couple of electronics packaging engineers delivering some serious noogies to the PC board designer that made all this necessary.

ATI_RADEON_HD5450_VIDEO_CARD_Heatsink_Milled_Face.jpg

The assembly quality is quite good, for an engineering sample. The heat sink had a bit of a gouge in it, but it was put there before the part was anodized, so it doesn't stick out too much. The soldering and surface mount component placement was reasonably well done, and the overall board layout was well designed, with a rational flow.

ATI_RADEON_HD5450_VIDEO_CARD_Assembly_Q_PWR.jpg

Before we dive into the testing portion of the review, let's look at one of the most exciting new features available on every Radeon HD5xxx series product, Eyefinity.

ATI Eyefinity Multi-Monitors

Even at this low price point, ATI felt that people might want to take advantage of the new Eyefinity technology. That's especially true if you look at this product as an upgrade for someone with OEM or IGP video: they can add a second monitor at the same time and really improve their gaming and HD video experience.

ATI_RADEON_HD5450_3_Main_Features_01.jpg

ATI Eyefinity advanced multiple-display technology launches a new era of panoramic computing, helping to boost productivity and multitasking with innovative graphics display capabilities supporting massive desktop workspaces, creating ultra-immersive computing environments with superhigh resolution gaming and entertainment, and enabling easy configuration. High end editions will support up to six independent display outputs simultaneously.

In the past, multi-display systems catered to professionals in specific industries. Financial, energy, and medical are just some industries where multi-display systems are a necessity. Today, more and more graphic designers, CAD engineers and programmers are attaching more than one display to their workstation. A major benefit of a multi-display system is simple and universal - it enables increased productivity. This has been confirmed in industry studies which show that attaching more than one display device to a PC can significantly increase user productivity.

Early multi-display solutions were non-ideal. Bulky CRT monitors claimed too much desk space; thinner LCD monitors were very expensive; and external multi-display hardware was inconvenient and also very expensive. These issues are much less of a concern today. LCD monitors are very affordable and current generation GPUs can drive multiple display devices independently and simultaneously, without the need for external hardware. Despite the advancements in multi-display technology, AMD engineers still felt there was room for improvement, especially regarding the display interfaces. VGA carries analog signals and needs a dedicated DAC per display output, which consumes power and ASIC space. Dual-Link DVI is digital, but requires a dedicated clock source per display output and uses too many I/O pins from the GPU. It was clear that a superior display interface was needed.

In 2004, a group of PC companies collaborated to define and develop DisplayPort, a powerful and robust digital display interface. At that time, engineers working for the former ATI Technologies Inc. were already thinking about a more elegant solution to drive more than two display devices per GPU, and it was clear that DisplayPort was the interface of choice for this task. In contrast to other digital display interfaces, DisplayPort does not require a dedicated clock signal for each display output. In fact, the data link is fixed at 1.62Gbps or 2.7Gbps per lane, irrespective of the timing of the attached display device. The benefit of this design is that one reference clock source provides the clock signal needed to drive as many DisplayPort display devices as there are display pipelines in the GPU. In addition, with the same number of I/O pins used for Single-Link DVI, a full speed DisplayPort link can be driven which provides more bandwidth and translates to higher resolutions, refresh rates and color depths. All these benefits perfectly complement ATI Eyefinity Multi-Display Technology.
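
To put those per-lane rates in perspective, here is a rough, back-of-envelope sketch (my own arithmetic, not ATI's) comparing what a four-lane link at 2.7 Gbps per lane can carry after 8b/10b encoding overhead against what a 2560x1600 panel at 60 Hz and 24-bit color roughly needs. The 15% blanking allowance is an assumption, since the exact figure depends on display timings.

    # Rough DisplayPort bandwidth check (my arithmetic, not ATI's):
    # four lanes at 2.7 Gbps each, with 8b/10b encoding (80% efficiency).
    lanes, lane_rate_gbps, encoding_efficiency = 4, 2.7, 0.8
    usable_gbps = lanes * lane_rate_gbps * encoding_efficiency    # about 8.6 Gbps of payload

    # A 2560x1600 panel at 60 Hz and 24 bits per pixel, with roughly 15%
    # added for blanking intervals (an approximation; exact timings vary).
    pixels_per_second = 2560 * 1600 * 60 * 1.15
    needed_gbps = pixels_per_second * 24 / 1e9                    # about 6.8 Gbps

    print(f"usable: {usable_gbps:.2f} Gbps, needed: {needed_gbps:.2f} Gbps")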

ATI_RADEON_HD5450_Eyefinity_01.jpg

ATI Eyefinity Technology from AMD provides advanced multiple monitor technology delivering an immersive graphics and computing experience, supporting massive virtual workspaces and super-high resolution gaming environments. Legacy GPUs have supported up to two display outputs simultaneously and independently for more than a decade. Until now graphics solutions have supported more than two monitors by combining multiple GPUs on a single graphics card. With the introduction of AMD's next-generation graphics product series supporting DirectX 11, a single GPU now has the advanced capability of simultaneously supporting up to six independent display outputs.

ATI Eyefinity Technology is closely aligned with AMD's DisplayPort implementation, providing the flexibility and upgradability modern users demand. Up to two DVI, HDMI, or VGA display outputs can be combined with DisplayPort outputs for a total of up to six monitors, depending on the graphics card configuration. The initial AMD graphics products with ATI Eyefinity technology will support a maximum of three independent display outputs via a combination of two DVI, HDMI or VGA with one DisplayPort monitor. AMD has a future product planned to support up to six DisplayPort outputs. Wider display connectivity is possible by using display output adapters that support active translation from DisplayPort to DVI or VGA.

The DisplayPort 1.2 specification is currently being developed by the same group of companies who designed the original DisplayPort specification. Its feature set includes higher bandwidth, enhanced audio and multi-stream support. Multi-stream, commonly referred to as daisy-chaining, is the ability to address and drive multiple display devices through one connector. This technology, coupled with ATI Eyefinity Technology, will be a key enabler for multi-display technology, and AMD will be at the forefront of this transition.

Video Card Testing Methodology

This is the beginning of a new era for testing at Benchmark Reviews. With the remarkably quick adoption rate of Windows 7, and given the prolonged and extensive pre-release testing that occurred on a global scale, there are compelling reasons to switch all testing to this new, and highly anticipated, operating system. Overall performance levels of Windows 7 have been favorably compared to Windows XP, and there is solid support for the 64-bit version, something enthusiasts have been anxiously awaiting for several years.

Our site polls and statistics indicate that over 90% of our visitors use their PC for playing video games, and practically every one of you is using one of the screen resolutions mentioned in this review. Since all of the benchmarks we use for testing represent different game engine technology and graphic rendering processes, this battery of tests will provide a diverse range of results for you to gauge performance on your own computer system. All of the benchmark applications are capable of utilizing DirectX 10, and that is how they were tested. Some of these benchmarks have been used widely for DirectX 9 testing in the XP environment, and it is critically important to differentiate between results obtained with different versions. Each game behaves differently in DX9 and DX10 formats. Crysis is an extreme example, with frame rates in DirectX 10 only about half what was available in DirectX 9.

At the start of all tests, the previous display adapter driver is uninstalled and trace components are removed using Driver Cleaner Pro. We then restart the computer system to establish our display settings and define the monitor. Once the hardware is prepared, we begin our testing. According to the Steam Hardware Survey published at the time of the Windows 7 launch, the most popular gaming resolution is 1280x1024 (17-19" standard LCD monitors), closely followed by 1024x768 (15-17" standard LCD). Given the mainstream user base that this card is aimed at, we limited our testing to 1280x1024 and the commonly encountered wide-screen format of 1680x1050 (20-22" wide-screen LCD). Even at these modest resolutions, the low-end cards we dug out of storage struggled mightily with the modern game titles we normally test with.

Each benchmark test program begins after a system restart, and the very first result for every test will be ignored since it often only caches the test. This process proved extremely important in the World in Conflict benchmarks, as the first run served to cache maps allowing subsequent tests to perform much better than the first. Each test is completed five times, the high and low results are discarded, and the average of the three remaining results is displayed in our article.
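
In other words, each reported number is a simple trimmed mean of the five runs. Here is a minimal sketch of that reduction, with made-up sample values:

    # Each reported benchmark number: run five times, discard the highest
    # and lowest results, and average the remaining three.
    def reported_result(runs):
        assert len(runs) == 5
        middle_three = sorted(runs)[1:-1]   # drop the low and the high
        return sum(middle_three) / len(middle_three)

    print(reported_result([11.8, 12.4, 12.1, 14.0, 11.9]))   # 12.13... (sample values)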

ATI_RADEON_HD5450_3_Main_Features_02_684_wide.jpg

Test System

  • Motherboard: ASUS M4A79T Deluxe (2205 BIOS)
  • System Memory: 2X 2GB OCZ Reaper HPC DDR3 1600MHz (7-7-7-24)
  • Processor: AMD Phenom II 720 Black Edition (Overclocked to 3.8 GHz)
  • CPU Cooler: CoolerMaster Hyper Z600
  • Video: ATI Radeon HD5450, Engineering Sample
  • Drive 1: OCZ Summit SSD, 60GB
  • Optical Drive: Sony NEC Optiarc AD-7190A-OB 20X IDE DVD Burner
  • Enclosure: CM STORM Sniper Gaming Case
  • PSU: Corsair CMPSU-750TX ATX12V V2.2 750Watt
  • Monitor: SOYO 24" Widescreen LCD Monitor (DYLM24E6) 1920x1200
  • Operating System: Windows 7 Ultimate Version 6.1 (Build 7600)

Benchmark Applications

  • 3DMark Vantage v1.0.1 (8x Anti Aliasing & 16x Anisotropic Filtering)
  • Crysis v1.21 Benchmark (High Settings, 0x and 4x Anti-Aliasing)
  • Devil May Cry 4 Benchmark Demo (Ultra Quality, 8x MSAA)
  • Far Cry 2 v1.02 (Very High Performance, Ultra-High Quality, 8x Anti-Aliasing)
  • Resident Evil 5 (8x Anti-Aliasing, Motion Blur ON, Quality Levels-High)

Video Card Test Products

Product Series                        | Stream Processors | Core Clock (MHz) | Shader Clock (MHz) | Memory Clock (MHz) | Memory Amount | Memory Interface
Foxconn GeForce 8400GS (8400GS-256)   | 16                | 450              | 900                | 800                | 256MB GDDR2   | 64-bit
ATI Radeon HD5450 (Mfr. Sample)       | 80                | 650              | N/A                | 800                | 512MB GDDR3   | 64-bit
EVGA GeForce 8600GT (256-P2-N751-TR)  | 32                | 540              | 1180               | 700                | 256MB GDDR3   | 128-bit
ATI Radeon HD5670 (Mfr. Sample)       | 400               | 775              | N/A                | 1000               | 512MB GDDR5   | 128-bit
MSI Radeon HD4830 (R4830 T2D512)      | 640               | 585              | N/A                | 900                | 512MB GDDR5   | 256-bit
XFX Radeon HD5750 (HD-575X-ZN)        | 720               | 700              | N/A                | 1150               | 1024MB GDDR5  | 128-bit

  • Foxconn GeForce 8400GS (8400GS-256 Forceware v195.62 WHQL)
  • ATI Radeon HD5450 (Mfr. Sample Catalyst 8.69 RC3)
  • EVGA GeForce 8600GT (256-P2-N751-TR Forceware v195.62 WHQL)
  • ATI Radeon HD5670 (Mfr. Sample Catalyst 8.69 RC3)
  • MSI Radeon HD4830 (R4830 T2D512 Catalyst 8.69_RC3)
  • XFX Radeon HD5750 (HD-575X-ZN Catalyst 8.69_RC3)

Now we're ready to begin testing video game performance of these video cards, so please continue to the next page where we start off with our 3DMark Vantage results.

3DMark Vantage Benchmark Results

3DMark Vantage is a computer benchmark by Futuremark (formerly named Mad Onion) that measures the DirectX 10 3D game performance of graphics cards. A 3DMark score is an overall measure of your system's 3D gaming capabilities, based on comprehensive real-time 3D graphics and processor tests. By comparing your score with those submitted by millions of other gamers you can see how your gaming rig performs, making it easier to choose the most effective upgrades or find other ways to optimize your system.

There are two graphics tests in 3DMark Vantage: Jane Nash (Graphics Test 1) and New Calico (Graphics Test 2). The Jane Nash test scene represents a large indoor game scene with complex character rigs, physical GPU simulations, multiple dynamic lights, and complex surface lighting models. It uses several hierarchical rendering steps, including for water reflection and refraction, and physics simulation collision map rendering. The New Calico test scene represents a vast space scene with lots of moving but rigid objects and special content like a huge planet and a dense asteroid belt.

At Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each test run, 3DMark is a reliable tool for comparing graphics cards against one another.

1680x1050 is rapidly becoming the new 1280x1024. More and more widescreen monitors are being sold with new systems or as upgrades to existing ones. Even in tough economic times, the tide cannot be turned back; screen resolution and size will continue to creep up. These two relatively low resolutions are the most appropriate for a review of mainstream hardware, and we'll also be using the following reduced settings for 3DMark Vantage: No Anti-Aliasing, 2x Anisotropic Filtering, all quality levels at Entry, and Post Processing Scale set at 1:2.

ATI_RADEON_HD5450_3DMark_Vantage_Jane_Nash.jpg

Test one, all about the exploits of our fictional espionage heroine Jane Nash, has some wonderful graphics when quality levels are cranked up. The water modeling is exceptionally accurate and detailed. This is only a synthetic benchmark, so the results we get, in terms of frames per second, are not always typical of real-world gaming performance, but still... I was hoping for somewhat better performance from the reduced-specification hardware. Just for the record, the HD5450 slots in between the GeForce 8400GS and 8600GT for raw 3D graphics performance, but I have serious doubts that any of these cards will be able to hack it when we start up the actual gaming applications.

ATI_RADEON_HD5450_3DMark_Vantage_New_Calico.jpg

Test two is a little more challenging for most video cards, due to the large number of irregularly shaped asteroids that need to be rendered in New Calico. Once again, the HD5450 just barely gets out of the gate, with average FPS numbers in the low single digits. Dropping down to 1280x1024 doesn't really help. We really need to look at actual gaming performance to verify these results, so let's take a look in the next section, at how these cards stack up in the traditional standard bearer for gaming benchmarks, Crysis.

Crysis Benchmark Results

Crysis uses a new graphics engine: the CryENGINE2, which is the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX 10) framework, but can also run using DirectX 9 on Vista, Windows XP and the new Windows 7. As we'll see, there are significant frame rate reductions when running Crysis in DX10. It's not an operating system issue; DX9 works fine in Win7, but DX10 knocks the frame rates in half.

Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multicore processor architectures, CPU intensive subsystems of CryENGINE 2 such as physics, networking and sound, have been re-written to support multi-threading.

Crysis offers an in-game benchmark tool, which is similar to World in Conflict. This short test does place some high amounts of stress on a graphics card, since there are so many landscape features rendered. For benchmarking purposes, Crysis can mean trouble as it places a high demand on both GPU and CPU resources. Benchmark Reviews uses the Crysis Benchmark Tool by Mad Boris to test frame rates in batches, which allows the results of many tests to be averaged.

Once again, we are going to concentrate on relatively low-resolution testing performance. At the 1280x1024 resolution used by 17" and 19" monitors, the CPU and memory may have some influence on the results, but our test rig in this case is well above the specs that a typical HD5450 user will have, so we've eliminated that variable. At 1280x1024, and the widescreen resolution of 1680x1050, the performance differences between the video cards under test are mostly down to the cards.

ATI_RADEON_HD5450_Crysis_NoAA.jpg

The results here are pretty consistent with 3DMark Vantage, in that the HD5450 is stuck in a range close to, but below, the teens. For most users, the performance of the HD5450 in this challenging scenario is going to be frustrating, ultimately unacceptable, and bordering on painful. Comparing the HD5450 directly against its higher-priced siblings shows that performance rises much faster than price. The HD5670 costs twice as much, but has almost ten times the performance in this real-world application. Anyone looking to play a little Crysis on their 1080P HTPC needs to look at either the HD5670 or perhaps the soon-to-be-released HD5570 as a minimum starting point.

ATI_RADEON_HD5450_Crysis_4xAA.jpg

Once a decent amount of anti-aliasing is factored in, the situation degrades even further. Frame rates are way below acceptable until you get close to the $100 mark with the HD5670. If you want to play this game in DX10 with eye candy turned on, you are going to have to pay, both with cash as well as increased heat and power consumption.

In our next section, Benchmark Reviews tests with Devil May Cry 4 Benchmark. Read on to see how a blended high-demand GPU test with low video frame buffer demand will impact our test products.

Devil May Cry 4 Benchmark

Devil May Cry 4 was released for the PC platform in early 2007 as the fourth installment to the Devil May Cry video game series. DMC4 is a direct port from the PC platform to console versions, which operate at the native 720P game resolution with no other platform restrictions. Devil May Cry 4 uses the refined MT Framework game engine, which has been used for many popular Capcom game titles over the past several years.

MT Framework is an exclusive seventh-generation game engine built to be used with games developed for the PlayStation 3 and Xbox 360, and PC ports. MT stands for "Multi-Thread", "Meta Tools" and "Multi-Target". Capcom originally intended to use an outside engine, but none matched their specific requirements in performance and flexibility. Games using the MT Framework are originally developed on the PC and then ported to the other two console platforms.

On the PC version a special bonus called Turbo Mode is featured, giving the game a slightly faster speed, and a new difficulty called Legendary Dark Knight Mode is implemented. The PC version also has both DirectX 9 and DirectX 10 modes for the Windows XP, Vista, and Windows 7 operating systems.

It's always nice to be able to compare the results we receive here at Benchmark Reviews with the results you test for on your own computer system. Usually this isn't possible, since settings and configurations make it nearly impossible to match one system to the next; plus you have to own the game or benchmark tool we used.

Devil May Cry 4 fixes this, and offers a free benchmark tool available for download. Because the DMC4 MT Framework game engine is rather low-demand for today's cutting edge video cards, Benchmark Reviews used the 1680x1050 resolution to test with 8x AA (highest AA setting available to Radeon HD video cards) and 16x AF.

ATI_RADEON_HD5450_DMC4_DX10.jpg

Devil May Cry 4 is not as demanding a benchmark as it used to be. Only scenes #2 and #4 are worth looking at from the standpoint of trying to separate the fastest video cards from the slower ones. Still, it represents a typical environment for many games that our readers still play on a regular basis, so it's good to see what works with it and what doesn't. Any of the tested cards will do a credible job in this application, and the performance scales in a pretty linear fashion. You get what you pay for when running this game, at least for benchmarks. This is one time where you can generally use the maximum available anti-aliasing settings, so NVIDIA users should feel free to crank it up to 16X. The DX10 "penalty" is of no consequence here.

The HD5450 once again performs better than the 8400GS, but slightly worse than the 8600GT. As usual, we get better frame rates in this test, but the three low-end cards still can't make it up to the recommended 30FPS minimum. This is one case where you can achieve full performance with a mainstream graphics card, but the HD5450 is so tightly optimized for HTPC usage that it won't be successful in this more demanding role.

Our next benchmark of the series is for a very popular FPS game that rivals Crysis for world-class graphics, and is more representative of games optimized for DirectX10. Maybe we'll get some relief there...

Far Cry 2 Benchmark Results

Ubisoft has developed Far Cry 2 as a sequel to the original, but with a very different approach to game play and story line. Far Cry 2 features a vast world built on Ubisoft's new game engine called Dunia, meaning "world", "earth" or "living" in Farsi. The setting in Far Cry 2 takes place on a fictional Central African landscape, set to a modern day timeline.

The Dunia engine was built specifically for Far Cry 2 by the Ubisoft Montreal development team. It delivers realistic semi-destructible environments, special effects such as dynamic fire propagation and storms, real-time night-and-day sun light and moon light cycles, a dynamic music system, and non-scripted enemy A.I. actions.

The Dunia game engine takes advantage of multi-core processors as well as multiple processors and supports DirectX 9 as well as DirectX 10. Only 2 or 3 percent of the original CryEngine code is re-used, according to Michiel Verheijdt, Senior Product Manager for Ubisoft Netherlands. Additionally, the engine is less hardware-demanding than CryEngine 2, the engine used in Crysis. However, it should be noted that Crysis delivers greater character and object texture detail, as well as more destructible elements within the environment. For example, trees breaking into many smaller pieces and buildings breaking down to their component panels. Far Cry 2 also supports the amBX technology from Philips. With the proper hardware, this adds effects like vibrations, ambient colored lights, and fans that generate wind effects.

There is a benchmark tool in the PC version of Far Cry 2, which offers an excellent array of settings for performance testing. Benchmark Reviews used slightly lower settings for this test, with the resolution set to 1280x1024 and 1680x1050. The performance settings were all set to 'Medium', Render Quality was set to 'Optimum' (which was the same as "High" in this case), 4x anti-aliasing was applied, and HDR and Bloom were enabled. Of course DX10 was used exclusively for this series of tests.

ATI_RADEON_HD5450_Far_Cry_2_DX10.jpg

Although the Dunia engine in Far Cry 2 is slightly less demanding than CryEngine 2 engine in Crysis, the strain appears to be similar. Far Cry 2 also seems to have been optimized for, or at least written with a clear understanding of, DX10 requirements.

Using the short 'Ranch Small' time demo (which yields the lowest FPS of the three tests available), only the more robust video cards are capable of producing playable frame rates with moderate settings applied. The Radeon HD5450 is stuck in the lower range again. Although the Dunia engine seems to be optimized for NVIDIA chips, I can't lay the blame there, as the 8400GS and 8600GT don't really do much better. Once again, the HD5670 represents a reasonable jumping off point for the lowest cost choice that will still perform reliably.

Our next benchmark of the series puts our collection of video cards against some very demanding graphics in the newly released Resident Evil 5 benchmark.

Resident Evil 5 Benchmark Results

PC gamers get the ultimate Resident Evil package in this new PC version with exclusive features including NVIDIA's new GeForce 3D Vision technology (wireless 3D Vision glasses sold separately), new costumes and a new mercenaries mode with more enemies on screen. Delivering an infinite level of detail, realism and control, Resident Evil 5 is certain to bring new fans to the series. Incredible changes to game play and the world of Resident Evil make it a must-have game for gamers across the globe.

Years after surviving the events in Raccoon City, Chris Redfield has been fighting the scourge of bio-organic weapons all over the world. Now a member of the Bioterrorism Security Assessment Alliance (BSAA), Chris is sent to Africa to investigate a biological agent that is transforming the populace into aggressive and disturbing creatures. New cooperatively-focused game play revolutionizes the way that Resident Evil is played. Chris and Sheva must work together to survive new challenges and fight dangerous hordes of enemies.

From a gaming performance perspective, Resident Evil 5 uses Next Generation of Fear - Ground breaking graphics that utilize an advanced version of Capcom's proprietary game engine, MT Framework, which powered the hit titles Devil May Cry 4, Lost Planet and Dead Rising. The game uses a wider variety of lighting to enhance the challenge. Fear Light as much as Shadow - Lighting effects provide a new level of suspense as players attempt to survive in both harsh sunlight and extreme darkness. As usual, we maxed out the graphics settings on the benchmark version of this popular game, to put the hardware through its paces. Much like Devil May Cry 4, it's relatively easy to get good frame rates in this game, so take the opportunity to turn up all the knobs and maximize the visual experience.

ATI_RADEON_HD5450_Resident_Evil_5_DX10.jpg

The Resident Evil 5 benchmark tool provides a graph of continuous frame rates and averages for each of four distinct scenes. In addition, it calculates an overall average for the four scenes. The averages for scenes #3 and #4 are what we report here, as they are the most challenging.

This new game is still not quite playable with the HD5450 hardware, even at lower screen resolutions. You still need to use one of the higher-powered cards to achieve acceptable frame rates. Even though it caught up to the 8600GT in this case, the results are still way too low to be usable.

Our next sections look at thermal performance and power consumption, both key qualities for this new product.

ATI Radeon HD5450 Temperature

It's hard to know exactly when the first video card got overclocked, and by whom. What we do know is that it's hard to imagine a computer enthusiast or gamer today that doesn't overclock their hardware. Of course, not every video card has the head room. Some products run so hot that they can't suffer any higher temperatures than they generate straight from the factory. This is why we measure the operating temperature of the video card products we test.

To begin testing, I use GPU-Z to measure the temperature at idle as reported by the GPU. Next I use FurMark 1.7.0 to generate maximum thermal load and record GPU temperatures at high-power 3D mode. The ambient room temperature remained stable at 23C throughout testing. The ATI Radeon HD5450 video card recorded 30C in idle 2D mode, and increased to 43C after 20 minutes of stability testing in full 3D mode, at 1920x1200 resolution and the maximum MSAA setting of 8X. Obviously, there were no fan settings for this test, but the case I tested in has a large side panel fan.

43C is an impressive result for temperature stress testing, especially for a card that relies on passive cooling. This is a key performance measure for a card like this, and it delivers the goods.
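
Since ambient temperature varies from one test bench to another, the figure worth comparing between reviews is the rise over ambient rather than the absolute reading. A trivial sketch using the numbers above:

    # Temperature rise over ambient for the passively cooled HD5450,
    # using the readings above: 23C room, 30C idle, 43C under FurMark load.
    ambient, idle, load = 23, 30, 43
    print(f"idle rise: {idle - ambient} C, load rise: {load - ambient} C")
    # idle rise: 7 C, load rise: 20 C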

FurMark is an OpenGL benchmark that heavily stresses and overheats the graphics card with fur rendering. The benchmark offers several options allowing the user to tweak the rendering: fullscreen / windowed mode, MSAA selection, window size, duration. The benchmark also includes a GPU Burner mode (stability test). FurMark requires an OpenGL 2.0 compliant graphics card with a lot of GPU power! As an oZone3D.net partner, Benchmark Reviews offers a free download of FurMark to our visitors.

ATI_RADEON_HD5450_VIDEO_CARD_furmark_temp.jpg

FurMark does two things extremely well: it drives the thermal output of any graphics processor higher than any other application or video game, and it does so with consistency every time. While FurMark is not a true benchmark tool for comparing different video cards, it still works well to compare one product against itself using different drivers or clock speeds, or to test the stability of a GPU, as it raises the temperatures higher than any other program. But in the end, it's a rather limited tool.

In our next section, we discuss electrical power consumption and learn how well (or poorly) each video card will impact your utility bill...

VGA Power Consumption

Life is not as affordable as it used to be, and items such as gasoline, natural gas, and electricity all top the list of resources which have exploded in price over the past few years. Add to this the limit of non-renewable resources compared to current demands, and you can see that the prices are only going to get worse. Planet Earth needs our help, and needs it badly. With forests becoming barren of vegetation and snow-capped poles quickly turning brown, the technology industry has a new attitude towards suddenly becoming "green". I'll spare you the powerful marketing hype that I get from various manufacturers every day, and get right to the point: your computer hasn't been doing much to help save energy... at least up until now.

ATI_RADEON_HD5450_VIDEO_CARD_GPU-Z.png

To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4460) power meter made by P3 International. A baseline test is taken without a video card installed inside our computer system, which is allowed to boot into Windows and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen. Our final loaded power consumption reading is taken with the video card running a stress test using FurMark. Below is a chart with the isolated video card power consumption (not system total) displayed in Watts for each specified test product:

Video Card Power Consumption by Benchmark Reviews

VGA Product Description (sorted by combined total power)  | Idle Power | Loaded Power
NVIDIA GeForce GTX 480 SLI Set                            | 82 W       | 655 W
NVIDIA GeForce GTX 590 Reference Design                   | 53 W       | 396 W
ATI Radeon HD 4870 X2 Reference Design                    | 100 W      | 320 W
AMD Radeon HD 6990 Reference Design                       | 46 W       | 350 W
NVIDIA GeForce GTX 295 Reference Design                   | 74 W       | 302 W
ASUS GeForce GTX 480 Reference Design                     | 39 W       | 315 W
ATI Radeon HD 5970 Reference Design                       | 48 W       | 299 W
NVIDIA GeForce GTX 690 Reference Design                   | 25 W       | 321 W
ATI Radeon HD 4850 CrossFireX Set                         | 123 W      | 210 W
ATI Radeon HD 4890 Reference Design                       | 65 W       | 268 W
AMD Radeon HD 7970 Reference Design                       | 21 W       | 311 W
NVIDIA GeForce GTX 470 Reference Design                   | 42 W       | 278 W
NVIDIA GeForce GTX 580 Reference Design                   | 31 W       | 246 W
NVIDIA GeForce GTX 570 Reference Design                   | 31 W       | 241 W
ATI Radeon HD 5870 Reference Design                       | 25 W       | 240 W
ATI Radeon HD 6970 Reference Design                       | 24 W       | 233 W
NVIDIA GeForce GTX 465 Reference Design                   | 36 W       | 219 W
NVIDIA GeForce GTX 680 Reference Design                   | 14 W       | 243 W
Sapphire Radeon HD 4850 X2 11139-00-40R                   | 73 W       | 180 W
NVIDIA GeForce 9800 GX2 Reference Design                  | 85 W       | 186 W
NVIDIA GeForce GTX 780 Reference Design                   | 10 W       | 275 W
NVIDIA GeForce GTX 770 Reference Design                   | 9 W        | 256 W
NVIDIA GeForce GTX 280 Reference Design                   | 35 W       | 225 W
NVIDIA GeForce GTX 260 (216) Reference Design             | 42 W       | 203 W
ATI Radeon HD 4870 Reference Design                       | 58 W       | 166 W
NVIDIA GeForce GTX 560 Ti Reference Design                | 17 W       | 199 W
NVIDIA GeForce GTX 460 Reference Design                   | 18 W       | 167 W
AMD Radeon HD 6870 Reference Design                       | 20 W       | 162 W
NVIDIA GeForce GTX 670 Reference Design                   | 14 W       | 167 W
ATI Radeon HD 5850 Reference Design                       | 24 W       | 157 W
NVIDIA GeForce GTX 650 Ti BOOST Reference Design          | 8 W        | 164 W
AMD Radeon HD 6850 Reference Design                       | 20 W       | 139 W
NVIDIA GeForce 8800 GT Reference Design                   | 31 W       | 133 W
ATI Radeon HD 4770 RV740 GDDR5 Reference Design           | 37 W       | 120 W
ATI Radeon HD 5770 Reference Design                       | 16 W       | 122 W
NVIDIA GeForce GTS 450 Reference Design                   | 22 W       | 115 W
NVIDIA GeForce GTX 650 Ti Reference Design                | 12 W       | 112 W
ATI Radeon HD 4670 Reference Design                       | 9 W        | 70 W
* Results are accurate to within +/- 5W.

The ATI Radeon HD5450 pulled 7 watts (137 W minus the 130 W baseline) at idle and 36 watts (166 minus 130) when running full out, using the test method outlined above. The idle number is very close to the factory figure of 6.4 W, while the loaded number comes in well above the rated 19.1 W maximum board power. With the video card's power demands dropping so low, this type of test starts to show its limitations. Nevertheless, the numbers are still in the ballpark.

I also tested power consumption while streaming 1080P video from YouTube. I waited until the clips were fully downloaded, and ran them full screen on my 1920x1200 monitor. I configured the graphics settings according to our helpful guide here on Benchmark Reviews, so that the bulk of the work was handled by the GPU, and also tweaked the visual settings to get maximum image quality. In the application it was designed for, mainly video streaming and HTPC, the Radeon HD5450 consumed noticeably less power than it did during the FurMark stress test. Maximum power draw during 1080P video playback was 22 watts (152 minus 130), only about two thirds of the power required to run FurMark.
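
For clarity, the isolated-card figures quoted here are just the wall reading minus the no-card baseline taken with the Kill-A-Watt meter. A minimal sketch of that bookkeeping, using the readings listed above:

    # Isolated video card power = wall reading with the card installed
    # minus the no-card baseline. Readings (in watts) are the ones quoted above.
    baseline_w = 130
    readings_w = {"idle": 137, "FurMark load": 166, "1080P playback": 152}

    for state, watts in readings_w.items():
        print(f"{state}: {watts - baseline_w} W")
    # idle: 7 W, FurMark load: 36 W, 1080P playback: 22 W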

Radeon HD5450 Final Thoughts

Most everyone who reads this site is familiar with the concept of diminishing returns. As you get closer to the highest level of performance (let's call that 100%), it costs considerably more to get the last 10% of performance than it does to go from 80% to 90%. When you look at two gaming-class video cards using the same technology, the increase in frames per second doesn't match the increase in price. The HD5850 and the HD5870 are a good example; does the 33% increase in price give you a 33% increase in performance? You wish... which is why lots more people are buying the HD5850.
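
One way to make that concrete is to look at cost per frame. The numbers below are hypothetical placeholders, not measured results, but they show why a 33% price increase needs a matching frame-rate increase just to hold its value:

    # Hypothetical cost-per-frame comparison (placeholder prices and frame
    # rates, not measured results) illustrating diminishing returns.
    cards = {
        "mid-range card": {"price_usd": 300, "fps": 60},
        "high-end card":  {"price_usd": 400, "fps": 70},   # +33% price, +17% fps
    }
    for name, card in cards.items():
        print(f"{name}: {card['price_usd'] / card['fps']:.2f} dollars per frame")
    # The pricier card delivers each frame at a higher cost.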

Just as the law of diminishing returns works on the high end of the market, there is a corresponding force at work on the lower end of the scale. As you move closer to the lowest possible cost, you start to bounce up against fixed costs that won't budge. Marketing, sales, design, testing, certification processes, transportation, packaging, and connector costs are all stubbornly rigid. Right now, the cheapest cards at Newegg, based on NVIDIA and ATI chips are the 8400GS and HD4350, priced at $30 and $36, respectively. I dare say, we're not going to see any new cards introduced that will be any cheaper than these are; it's just not fiscally possible, if we assume that the vendor is going to make a profit.

My point is, the vendor can try and cut every possible feature, performance enhancing hardware, included software, industrial design, packaging costs, etc. and end up with a product that barely functions, and it would still probably cost $25 on the retailer's shelf.

In my review of the HD5670, I wondered out loud, "How many more times ATI can slice the pie and still come up with a fully functional video card? Could there be one more cut, for an ultra-low power solution? But I think this is probably it, for a card that can honestly support gaming applications as well as general usage and HD video." As it turns out, the Radeon HD5450 is that fully functional low-power card, and I still think the Redwood class of ATI GPUs is the lowest you can go and still support modern games. The game changes when you look at HD video, however. This card eats it up for breakfast, and still has some headroom left over for whatever HW acceleration scheme comes along next.

ATI_RADEON_HD5450_VIDEO_CARD_Big_Bunny_02.jpg

As I sit here on the edge of my chair, waiting for dribs and drabs of information about the latest monster-sized GPU chip from NVIDIA (...hey, they named them this week. Wow), with a die size approaching the dimensions of the original Post-It note, I did wonder what the attraction was to a discrete graphics card with a GPU that's less than half the size of a US dime. The answer is that even the best Integrated Graphics Processor (IGP) is still less than half as powerful as the Radeon HD5450, and they generally max out with 128MB of SidePort GDDR3 memory. Many of them struggle to render full HD 1080P video smoothly, and the CPUs that they are bundled with usually can't help the effort much.

So, grab that old microATX board out of the closet, dust it off, add the Radeon HD5450, drop it into a shiny new, slim line HTPC box and you're off to the movies in style.

ATI Radeon HD5450 Conclusion

Looking at the performance of the ATI Radeon HD5450, you have to give up the idea that this is going to be any kind of solution for a gaming rig. In modern FPS games, it was well below any reasonable person's expectation for visual quality. Even at the reduced resolutions and quality levels that we introduced in our review of the HD5670 and GT240, the HD5450 just barely got into double digits for frames-per-second. This card is not really practical as a multi-purpose solution. We'll have to wait a bit for the HD55xx to see if it's possible to successfully bridge the two requirements of gaming and video playback. The strength of the HD5450 lies in Home Theater PC usage only, where it performs superbly. ATI is currently leading the game in image quality for HD video, and this small, low power, silent and cool board supports all the latest software enhancements that make those improved visuals possible.

The passively cooled HD5450 is visually stunning. There are some really ugly passive cards out there, along with a few decent-looking ones, but nothing comes close to the design statement this one makes. AIB partners will have nearly total flexibility to implement their own cooling systems, and I don't expect any of them to top this. Batmobile indeed; this one is fine art, of the industrial design variety.

ATI_RADEON_HD5450_VIDEO_CARD_blk_lg.jpg

The build quality of the Radeon HD5450 was quite good, for an engineering sample. The parts were all high quality, the soldering and component placement were to a high standard, and the heat sink was manufactured and assembled perfectly.

The features of the HD5450 have been carried over in full measure from the very first HD58xx series: DirectX 11, Full ATI Eyefinity Support, ATI Stream Technology Support, DirectCompute 11 and OpenCL Support, and HDMI 1.3a with Dolby TrueHD and DTS-HD Master Audio. Nothing was left out on this card, despite it being produced for a clearly different role than the original barn-burner gaming cards. Even though this card will not thrive in a multi-functional role, it still provides a solid HTPC experience and is a considerable upgrade for many systems still relying on IGP.

As of March 2010 there are several models of the Radeon HD 5450 available at different prices, depending on DRAM configuration and cooling solution. PowerColor offers the AX5450 for $40, while the Sapphire 100291L lists for $43 and the XFX HD5450 sells for $50. That is a small price premium over the lowest-priced cards available from our favorite e-tailer, but launch pricing is always a bit high, for obvious reasons. We saw in our gaming tests that it takes an extra $50-70 to get decent results with challenging titles, but that extra performance also brings higher power requirements, more noise, and more heat.

The ATI Radeon HD5450 earns a Silver Tachometer Award because there are some buyers who absolutely demand a passively cooled, completely silent video card, and who also need that card to support the latest technology and features for HD video playback. Until now, those two requirements were mutually exclusive; now there is a product, the one and only product, that completely meets their needs. It's a shame that it isn't possible to build a dual-use card on 40nm technology that does all that and can also play FPS games convincingly. Fortunately, we'll only have to wait a year or so to see what 28nm GPUs can do.

silvertachaward.png

Pros:

+ Modern feature set
+ Extremely low power consumption
+ Aggressive power modulation of GPU and RAM
+ Best video quality currently available
+ HDMI, VGA and DVI interfaces on single slot
+ Cool, silent operation
+ Truly awesome looks
+ Very low heat generation

Cons:

- High-end gaming titles are almost impossible to play
- AIB partners will probably mess with the good looks

Ratings:

  • Performance: 8.50
  • Appearance: 9.75
  • Construction: 9.25
  • Functionality: 8.75
  • Value: 8.50

Final Score: 8.95 out of 10.
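For the curious, the final score works out to the simple unweighted average of the five category ratings above. The quick sketch below (assuming an unweighted mean, which matches the published number exactly in this case) shows the arithmetic.

    # Sketch: final score as the unweighted mean of the five category ratings.
    ratings = {
        "Performance": 8.50,
        "Appearance": 9.75,
        "Construction": 9.25,
        "Functionality": 8.75,
        "Value": 8.50,
    }

    final_score = sum(ratings.values()) / len(ratings)
    print(f"Final Score: {final_score:.2f} out of 10")  # -> 8.95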

Quality Recognition: Benchmark Reviews Silver Tachometer Award

Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.


Comments 

 
# Compatible - Bhavesh Patel 2011-10-20 20:09
Is the ATI Radeon HD5450 compatible with an Intel dual core 2.8 GHz processor?
Please reply as early as possible.
 
 
# RE: Compatible - Olin Coles 2011-10-20 21:17
Yes. All video cards are compatible with all processors. Just make sure the motherboard has a PCI-Express graphics slot.
 
 
# RE: ATI Radeon HD5450 HTPC Video Card - mahmod 2012-03-08 07:50
How do I make sure the motherboard has a PCI-Express slot?
 
 
# Open up the box and look inside. - Bruce 2012-03-08 08:29
Wikipedia has some good pictures on the PCI Express page that show what the slots look like.
 
 
# RE: ATI Radeon HD5450 HTPC Video Card - Hanzyy 2012-04-24 07:43
Is it good for games? Will it play them fast? I don't know so much about hardware and stuff...
 
 
# Not so much - Bruce 2012-04-24 07:50
This card is on the low end of discrete graphics.
It's really better suited to a Home Theater PC.
 
 
# Fan connector - Roger 2012-06-01 17:01
I see there is what looks like a fan connector on the card - is this correct, and if so, what would be the power specs for a fan?
 
 
# Yes - Bruce 2012-06-02 11:12
If you look at some of the other HD 5450 cards out on the market (Newegg has 34 models for sale, 2+ years after launch...!), you will see that many of them have active cooling, with a fan plugged into this very same header. It's only two pins, so don't expect PWM or anything, just straight DC with no speed monitoring.
 
 
# RE: Fan connector - Taimur 2013-02-05 10:52
The PCI slot has to have the little gap thing the opposite way around from the one on the graphics card, so for instance -- ---- and ---- --, and most of all the ports on the graphics card should be pointing toward the outside, etc.
 
 
# Re: Fan Connector - Roger 2012-06-02 18:10
Thanks for that, Bruce.

Just to clarify, such a connected fan would be of similar size and wattage to a built-in one?

For example, I have a spare fan rated DC 12V 0.11A, which is probably OK, and another which is DC 12V 0.70A, which is probably too powerful?
 
 
# Yes, again.... - Bruce 2012-06-03 06:58
That's the idea. It's not going to be an exact science, but most fans that are small enough and thin enough to fit on that card's heatsink will be fine. I've been assuming that you want to cool the GPU on the HD 5450 with this fan... right?
 
 
# That's right, Bruce... - Roger 2012-06-03 22:01
I figure one cannot have enough fans (especially where I live in the tropics), so I might as well use a spare fan to blow on the heatsink. The card will be on the bottom of the mobo with the heatsink facing down, so I will mount the fan on the case bottom facing up toward the heatsink.

Thanks again.
 
 
# RE: ATI Radeon HD5450 HTPC Video Card - jex2013 2013-03-05 18:43
I badly need a graphics card for my HTPC. I am really having a hard time choosing because I don't have any idea about these kinds of things. But anyway, your post is quite helpful. Thanks!
 
 
# REPLY PLZZZ - AMINLRB 2013-03-08 13:33
IS THIS CARD GOOD FOR HIGH RESOLUTION GAMES?
REPLY SOOOOON PLEASE
 
 
# No, It's Not - Bruce 2013-03-08 14:02
It is not good for high-resolution games.
It was designed for HTPC use, which is much less demanding.
Plus, it was released three years ago, and that's a LONG time in video card history. Why are you interested in it now? Can you even buy one in your location?
 

Comments have been disabled by the administrator.
