PowerColor PCS+ HD6950 Vortex II
Reviews - Featured Reviews: Video Cards
Written by David Ramsey   
Monday, 23 May 2011

PowerColor PCS+ Radeon HD6950 Vortex II

Manufacturer: PowerColor (TUL Corporation)
Product Name: PCS+ HD6950 2GB GDDR5
Model Number: AX6950 2GBD5-PPV
Price As Tested: $315.00 (Call of Duty Edition) at NewEgg

Full Disclosure: The product sample used in this article has been provided by PowerColor.

It's a predictable progression: NVIDIA or AMD releases a new GPU, along with a "reference design" video card built around it. The marketing partners then introduce video cards that are simply the reference design with a vendor label or graphic affixed. While some vendors leave it at that, others aim for the enthusiast market by designing their own video cards around the new GPUs, adding their own features and capabilities. PowerColor is one of the latter, and they have several variants of the AMD Radeon HD6950 video card ranging from plain reference designs to, well, this one: the PowerColor PCS+ Radeon HD6950 Vortex II Edition. It's a mouthful of a product name, to be sure, and Benchmark Reviews puts it to the test in this review.

PowerColor offers seven variations on the Radeon HD6950 theme, ranging from a reference design card with the standard 2GB of memory to a "value" card with only 1GB of memory to the fancier "PCS+" and "PCS++" versions. Aside from the reference card, all versions have two fans, although the heat sink and printed circuit board under the fans vary.

What can a vendor add to a reference design? The enhancement most commonly seen is a better cooler, although these days the reference coolers on higher-end cards like the 6950 are pretty good. Next would be higher-quality power supply designs to improve stability and overclocking. Last on the list would be major feature changes like extra video outputs or wireless transmission. With the Vortex II Edition, PowerColor's taken the first two options: an enhanced cooler combined with a completely new PCB design and a beefed-up power supply. Along with the hardware enhancements, the card is delivered with a mild overclock: the GPU runs at 850MHz (as opposed to 800MHz for the reference 6950) and the memory at 1300MHz instead of the standard 1250MHz.
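In percentage terms the factory overclock is modest; a quick sketch of the arithmetic:

```python
# Factory clocks: PCS+ HD6950 Vortex II vs. the AMD reference HD6950.
REFERENCE = {"gpu_mhz": 800, "mem_mhz": 1250}
VORTEX_II = {"gpu_mhz": 850, "mem_mhz": 1300}

def uplift_pct(new_clock, old_clock):
    """Percentage increase of new_clock over old_clock."""
    return (new_clock - old_clock) / old_clock * 100

gpu = uplift_pct(VORTEX_II["gpu_mhz"], REFERENCE["gpu_mhz"])  # 6.25%
mem = uplift_pct(VORTEX_II["mem_mhz"], REFERENCE["mem_mhz"])  # 4.0%
print(f"GPU overclock: +{gpu:.2f}%, memory overclock: +{mem:.1f}%")
```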

powercolor_radeon_6950_box.jpg

With prices starting at under $250, the Radeon 6950 is a strong player in the mid-range video card market. Its competitors on the other side of the aisle are NVIDIA's GTX560Ti and GTX570, and Benchmark Reviews will be testing this card against those and others to see how it competes.

Closer Look: PowerColor PCS+ HD6950

The PCS+ HD6950 Vortex II edition comes with a quick installation guide, a CD with drivers and copies of the manual in several languages, a CrossFireX bridge, a DVI-VGA adapter, and a mini-DisplayPort to DisplayPort cable. The reference design Radeon HD6950 is a fairly boring-looking card: a large black rectangle whose only concession to visual interest is a red axial fan. The PowerColor PCS+ Radeon HD6950 Vortex II Edition, on the other hand, looks like a prop from a science fiction movie.

powercolor_radeon_6950_box_contents.jpg

With a bright red PCB, a black and red cooler, and four shiny heat pipes erupting from the top of the card, it certainly stands out visually. It's really too bad that much of this won't be visible even in a windowed case! The cooler uses four 8mm heat pipes to distribute the GPU's thermal load.

powercolor_radeon_6950_rear.jpg

The back of the card offers a clearer view of the four heat pipes. Until recently, cards with 2GB of memory would have memory chips mounted to the back of the board, but with modern high density memory, this isn't necessary.

powercolor_radeon_6950_heatpipes.jpg

The heat pipes branch out into an array of aluminum fins that cover the length of the card. Note the copper heat plate that contacts the GPU die. There is no direct contact with the memory chips, although the ones near the back of the card are directly under a fan. The memory chips at the top of the card are above the copper contact plate and won't receive much air.

powercolor_radeon_6950_heatsink_back2.jpg

From the front of the cooling mechanism we can see the two 92mm fans that provide the cooling air. This is a common design for enhanced video card coolers, and while the dual fans provide a lot of airflow, they also blow a lot of hot air inside your case. You might be wondering what those extended triangular red fins on the fan shrouds are for...we'll get to that later.

powercolor_radeon_6950_heatsink_front.jpg

The video outputs are the standard Radeon HD6950 fare: two DVI ports, a standard HDMI 1.4 connector, and two mini-DisplayPort connectors. The top DVI connector in this image is a single-link connector, something to be aware of should you be plugging this card into a 30" monitor. By using both mini-DisplayPort connectors, the HDMI connector, and one of the DVI connectors, you can run four monitors directly from this card.

powercolor_radeon_6950_ports.jpg

Now let's take a look at some of the components on this card.

PCS+ HD6950 Detailed Features

With the cooler removed, the bright red PCB is exposed. This is definitely not a reference design! In this image you can also see the light brown plastic port covers PowerColor provides for the rear ports as well as the CrossFireX connectors at the top of the card. Two 6-pin PCI-E connectors provide the extra power this card needs.

powercolor_radeon_6950_front_no_cooler.jpg

The Cayman-class GPU is used in both the Radeon 6950 and Radeon 6970 video cards. Like many OEMs, PowerColor applies far too much thermal interface material, which slopped over the edges of the GPU die when the heat sink was attached. Although the extra TIM doesn't hurt anything, it can result in less than optimum thermal performance. While the temperatures I report later in this review were measured before I removed the cooler for these pictures, maximum temperatures under load dropped 2-3 degrees after I reinstalled the cooler with a careful application of much less TIM. Benchmark Reviews has an excellent guide on the proper use and application of thermal interface material here.

powercolor_radeon_6950_GPU.jpg

The Radeon PCS+ 6950 Vortex II Edition comes equipped with 2GB of Hynix H5GQ2H24MFR-T2C GDDR5 video memory. This is the same memory AMD uses on their reference design board and is rated for 1250MHz. However, as delivered, the card runs this memory at 1300MHz.
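In bandwidth terms, that 50MHz bump is worth a few GB/s. A minimal sketch of the GDDR5 arithmetic, assuming the usual quad-pumped signaling on this card's 256-bit bus:

```python
# GDDR5 transfers 4 bits per pin per command-clock cycle, so per-pin data
# rate and total bus bandwidth follow directly from the memory clock.
def gddr5_bandwidth_gbs(clock_mhz, bus_width_bits=256):
    gbps_per_pin = clock_mhz * 4 / 1000       # 1300 MHz -> 5.2 Gbps per pin
    return gbps_per_pin * bus_width_bits / 8  # GB/s across the whole bus

print(gddr5_bandwidth_gbs(1250))  # 160.0 GB/s at the rated 1250MHz clock
print(gddr5_bandwidth_gbs(1300))  # 166.4 GB/s at the shipped 1300MHz clock
```

The 160 GB/s figure in PowerColor's spec list below corresponds to the rated 1250MHz clock; the factory overclock takes the theoretical peak a little higher.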

powercolor_radeon_6950_memory.jpg

The power supply section of the card is where it differs most dramatically from AMD's reference design. There are six power phases for the GPU and one for the memory, and as you can see there's space for an unused phase. Adding a second phase for the memory probably wouldn't help anything, but the extra phases for the GPU should help overclocking.

powercolor_radeon_6950_power_supply.jpg

All that power is managed by this CHIL 8228G, which provides the 8-phase power (7 used) for the card as well as standard voltage adjustment and monitoring via the I2C communications protocol. This standard protocol is what enables utilities like MSI Afterburner to tweak and report on the voltage you set on your card.

powercolor_radeon_6950_chil.jpg

PowerColor calls this card the "Vortex II Edition", a term that refers to the adjustable fans used on the cooler. The fans protrude more than you'd expect. In fact, this is really a 3-slot card, as you can see from the image below: the fans intrude into the second expansion slot over from the card. If your motherboard's first two PCI-E x16 slots are separated by only one other slot, you're not going to be able to run a CrossFireX setup with this card.

powercolor_radeon_6950_fans_normal.jpg

The fans can be shifted out in their mounts by grasping the triangular protrusions and pulling out; you can even tilt the fans slightly so the airflow is aimed more at either end of the card. PowerColor claims the extra space between the fans and the cooling fins, combined with the more directional airflow, will improve cooling; but I wasn't able to measure any improvement with the fans in either position. A 3-slot cooler might be worth it if the extra space required by the pop-out fan feature resulted in better cooling, but it doesn't. As far as I can tell, the adjustability of these fans is mentioned only on the rear of the box, not in the installation guide or manual.

powercolor_radeon_6950_fans_extended.jpg

Let's review the detailed technical specifications of this card in the next section.

Radeon PCS+ 6950 Vortex II Features

AMD's current generation of GPUs launched with the Radeon 5000 series cards in September 2009, the first to feature DirectX 11 support. AMD ruled the video card performance roost until NVIDIA's introduction of the GTX480, and the two companies have continued trading marketing and technological blows since then. It's always interesting to look at the performance of the very top-end video cards, but the truth is that unless you're running a multi-monitor system at insane resolutions, you can get more than enough performance from most mid-range cards...and AMD and NVIDIA probably sell dozens of cards in this class for every HD6990 and GTX580. Here's the complete feature and specification list from PowerColor:

  • Up to 850MHz Engine Clock
  • 2GB GDDR5 Memory
  • 1300MHz Memory Clock (5.2 Gbps GDDR5)
  • 160 GB/s memory bandwidth (maximum)
  • 2.7/ 2.25 TFLOPs Single Precision compute power
  • 562.5 GFLOPs Double Precision compute power
  • Double slot form factor
  • TeraScale 3 Unified Processing Architecture
    • 1408 Stream Processors
    • 88 Texture Units
    • 128 Z/Stencil ROP Units
    • 32 Color ROP Units
  • High Speed 256-bit GDDR5 memory interface
  • PCI Express 2.1 x16 bus interface
  • "Eye-Definition" graphics
    • New and advanced architecture
      • Full DirectX 11 support
      • Scalable geometry processing technology
      • Shader Model 5.0
      • DirectCompute 11
      • Dual advanced programmable hardware tessellation units
      • Accelerated multi-threading
      • HDR texture compression
      • Order-independent transparency
    • OpenGL 4.1 support
    • Image quality enhancement technology
      • Up to 24x multi-sample and super-sample anti-aliasing modes
      • Adaptive anti-aliasing
      • Enhanced Quality Anti-Aliasing (EQAA)
      • Morphological Anti-Aliasing (MLAA)
      • 16x angle independent anisotropic texture filtering
      • 128-bit floating point HDR rendering
  • AMD Eyefinity multi-display technology
    • Native support for up to 4 simultaneous displays
      • Up to 6 displays supported with DisplayPort 1.2 Multi-Stream Transport
    • Independent resolutions, refresh rates, color controls, and video overlays
    • Display grouping
      • Combine multiple displays to behave like a single large display
  • AMD EyeSpeed visual acceleration
    • AMD Accelerated Parallel Processing (APP) technology
      • OpenCL 1.1 Support
      • DirectCompute 11
      • Double Precision Floating Point
      • Accelerated video encoding, transcoding, and upscaling
    • UVD 3 dedicated video playback accelerator
      • MPEG-4 AVC/H.264
      • VC-1
      • MPEG-2 (SD & HD)
      • Multi-View Codec (MVC)
      • MPEG-4 part 2 (DivX, Xvid)
      • Adobe Flash
    • Enhanced Video Quality features
      • Advanced post-processing and scaling
      • Dynamic contrast enhancement and color correction
      • Brighter whites processing (Blue Stretch)
      • Independent video gamma control
      • Dynamic video range control
    • Dual-stream HD (1080p) playback support
    • DXVA 1.0 & 2.0 support
  • AMD HD3D technology
    • Stereoscopic 3D display/glasses support
    • Blu-ray 3D support
    • Stereoscopic 3D gaming
    • 3rd party Stereoscopic 3D middleware software support
  • AMD CrossFireX multi-GPU technology
    • Dual, triple or quad-GPU scaling
  • Cutting-edge integrated display support
    • DisplayPort 1.2
      • Max resolution: 2560x1600 per display
      • Multi-Stream Transport
      • 21.6 Gbps bandwidth
      • High bit-rate audio
    • HDMI 1.4a with Stereoscopic 3D Frame Packing Format, Deep Color, xvYCC wide gamut support, and high bit-rate audio
      • Max resolution: 1920x1200
    • Dual-link DVI with HDCP
      • Max resolution: 2560x1600
    • VGA
      • Max resolution: 2048x1536
  • Integrated HD audio controller
    • Output protected high bit rate 7.1 channel surround sound over HDMI with no additional cables required
    • Supports AC-3, AAC, Dolby TrueHD and DTS Master Audio formats
  • AMD PowerPlay power management technology
    • AMD PowerTune technology
      • Intelligent power management hardware
    • Dynamic power management with low power idle state
      • Ultra-low power state support for multi-GPU configurations
  • AMD Catalyst graphics and HD video configuration software
    • Software support for Windows 7, Windows Vista, and Windows XP
    • AMD Catalyst Control Center - AMD Catalyst software application and user interface for setup, configuration, and accessing features of AMD Radeon products.
    • Unified Graphics display driver - AMD Catalyst software enabling other PC programs and devices to use advanced graphics, video, and features of AMD Radeon products.
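The compute figures in that list can be sanity-checked from the shader count and clock: each Cayman stream processor can retire a fused multiply-add (two FLOPs) per cycle, and Cayman runs double precision at one quarter of the single-precision rate. A minimal sketch:

```python
# Peak single-precision throughput: stream processors x 2 FLOPs x clock.
def sp_tflops(stream_processors, clock_mhz):
    # clock_mhz * 1e6 cycles/s, result scaled to TFLOPS (1e12 FLOPs/s)
    return stream_processors * 2 * clock_mhz / 1e6

print(sp_tflops(1408, 800))  # ~2.25 TFLOPS at the reference 800MHz clock
print(sp_tflops(1408, 850))  # ~2.39 TFLOPS at this card's 850MHz clock
# Double precision runs at 1/4 the SP rate: ~563 GFLOPS (the spec sheet's
# 562.5 figure comes from the rounded 2.25 TFLOPS number).
print(sp_tflops(1408, 800) / 4 * 1000)
```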

HD6950 to HD6970: Unlocking Extra Shaders

When the Radeon HD6950 cards were introduced, it didn't take enthusiasts long to figure out that they used the exact same GPU as the HD6970, only with some of the shaders disabled. While an HD6970 has 1536 shaders, the HD6950 has only 1408. By re-flashing the BIOS on a 6950 with the BIOS from a 6970, or using a tool such as TechPowerUp's "Radeon BIOS Editor", enthusiasts could enable the unused shaders and gain some performance. Most 6950 cards made it really easy with this little switch:

powercolor_radeon_6950_switch.jpg

In the position shown, the card uses its standard, non-modifiable BIOS; moving it to the other position selects a second, modifiable BIOS. This removes worries about "bricking" a card with a failed BIOS update, since you can always simply flip the switch back to the original position and reboot.

However, I wasn't able to unlock the extra shaders on this card. I tried flashing with several 6970 BIOSes as well as modifying the original BIOS with TechPowerUp's "Radeon BIOS Editor" utility. I could change parameters in the BIOS such as the default GPU and memory clocks, and confirm the changes with GPU-Z, but shader unlocking never worked. I suspect PowerColor (or perhaps AMD) has locked this ability out. PowerColor does offer a 6950-based card, the AX6950 PCS++, which comes with a 1536-shader enabled BIOS pre-installed on the card and accessible via the switch, but that card comes with 8-pin and 6-pin power connectors to handle the extra load, whereas the Vortex II Edition uses two 6-pin connectors. Benchmark Reviews examined the AX6950 PCS++ card here.

Let's see how this card performs just as a 6950.

Video Card Testing Methodology

There are still a disturbing number of people out there using Windows XP, but here at Benchmark Reviews we switched our tests to Windows 7 a while back. It's as fast as XP in most tests and offers substantially improved security and features, but its best feature is its robust 64-bit architecture (there is a 32-bit version of Windows 7, of course, but you'd be well advised to use the 64-bit version). It's taken Microsoft a few years to get 64-bit support working well, but it's finally here. If you're running Windows XP on a computer with 4GB of memory, you've probably already noticed that you don't have all 4GB available. Plug in a PowerColor Radeon HD6950 Vortex II Edition video card and you'll see another 2GB of working memory vanish, since the 2GB of memory on the video card has to slot into the 4GB address space of your old 32-bit operating system.

Needless to say, you couldn't run a CrossFireX setup with this card on Windows XP at all, since your entire address space would be taken up by the video card memory.
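The address-space squeeze is simple arithmetic. A back-of-the-envelope sketch, where the 512MB allowance for other devices is an assumed round figure and real systems vary:

```python
# A 32-bit OS has a 4GB physical address space; the video card's memory
# aperture and other device MMIO regions must all fit inside it, and
# whatever they occupy is no longer addressable as system RAM.
GB = 1024**3
ADDRESS_SPACE = 4 * GB
VRAM_APERTURE = 2 * GB        # one 2GB PCS+ HD6950
OTHER_MMIO = 512 * 1024**2    # assumed allowance for chipset/PCI devices

usable_ram = ADDRESS_SPACE - VRAM_APERTURE - OTHER_MMIO
print(f"RAM left visible to a 32-bit OS: ~{usable_ram / GB:.1f} GB")  # ~1.5 GB
```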

According to the most recent Steam gaming survey, the two most popular resolutions are 1920x1080 and 1680x1050. My personal monitor's native resolution is 1920x1200, which was a very popular resolution until manufacturers decided for some reason that a computer monitor should have the same resolution as 1080p HD televisions. Consequently I run my benchmarks at 1920x1200, which will produce very slightly lower frame rates than 1920x1080.
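The difference is modest; a quick sketch of the pixel counts involved:

```python
# 1920x1200 renders about 11% more pixels per frame than 1920x1080,
# which is why frame rates at the test resolution run slightly lower.
def pixels(width, height):
    return width * height

extra = pixels(1920, 1200) / pixels(1920, 1080) - 1
print(f"1920x1200 pushes {extra:.1%} more pixels than 1920x1080")  # 11.1%
```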

I choose a combination of synthetic and game benchmarks for this test. The test platform was an ASUS P8Z68-V Pro motherboard and an Intel Core i7-2600K processor at stock clock speeds; the iGPU in the Sandy Bridge processor was completely disabled. As always, remember that these test results are specific to the system and software used in this review, and that different hardware, drivers, and benchmark versions will affect the results.

Intel Z68 Test System

  • Motherboard: ASUS P8Z68-V Pro (BIOS 8801)
  • System Memory: 2x2GB G.SKILL DDR3-1333 9-9-9-24
  • Processor: Intel Core i7 2600K
  • Hard Drive: Western Digital Raptor 300GB WD3000HLFS-01G6U1
  • Operating System: Windows 7 Home Premium x64 with SP1

Benchmark Software

  • AMD Catalyst 11.5
  • NVIDIA ForceWare 270.61
  • DX10: 3DMark Vantage
  • DX10: Crysis Warhead
  • DX11: Aliens vs. Predator
  • DX11: 3DMark11
  • DX11: Unigine Heaven 2.5
  • DX11: Battlefield: Bad Company 2

Video Card Test Products

| Graphics Card | Radeon HD5770 | Radeon HD6850 | Radeon HD6950 | Radeon HD6950 OC | Radeon HD5870 | NVIDIA GTX560Ti | NVIDIA GTX570 |
|---|---|---|---|---|---|---|---|
| GPU Cores | 800 | 960 | 1408 | 1408 | 1600 | 384 | 480 |
| Core Clock (MHz) | 850 | 775 | 850 | 930 | 875 | 822 | 732 |
| Shader Clock (MHz) | N/A | N/A | N/A | N/A | N/A | 1645 | 1464 |
| Memory Clock (MHz) | 1200 | 1000 | 1300 | 1370 | 1250 | 1050 | 1900 |
| Memory Amount | 1024MB GDDR5 | 1024MB GDDR5 | 2048MB GDDR5 | 2048MB GDDR5 | 1024MB GDDR5 | 1024MB GDDR5 | 1280MB GDDR5 |
| Memory Interface | 128-bit | 256-bit | 256-bit | 256-bit | 256-bit | 256-bit | 320-bit |

DX10: 3DMark Vantage

Every few years, FutureMark updates their video card benchmark suites, but the older versions remain relevant for years after they've been superseded. 3DMark Vantage is a good example: it's an excellent test for DX10 support, and still can strain the burliest graphics cards with its two GPU-specific tests. Benchmark Reviews runs the Jane Nash and New Calico tests to stress our test cards. For this review I set the 3DMark Vantage settings to high quality, with 8x anti-aliasing and 16x anisotropic filtering.

Of the two GPU tests 3DMark Vantage offers, the Jane Nash performance benchmark is slightly less demanding. In a short video scene the special agent escapes a secret lair by water, nearly losing her shirt in the process. Let's look at the results:

3dmark_vantage_nash.png

The Radeon 5770 was a good card in its day (and still can be in a CrossFireX setup), but it's outclassed here. The stock 6950 results eclipse those of every other card here except the NVIDIA GTX570, which is 0.5 to 1.4 fps faster; the overclocked 6950, though, is the fastest card in both tests.

The New Calico test is more intensive, with a giant carrier spaceship sending a fleet of smaller bombers through a tumbling asteroid field with hundreds of spinning space rocks. The camera swoops among the asteroids as it follows the bombers through the field for a clear shot of the doomed planet below. Changing light sources and lens flares, as well as a fly-through of the carrier, add to the complexity of the test.

3dmark_vantage_calico.png

The GTX570 takes the win in the New Calico test, followed closely by the overclocked 6950, the GTX560, and the stock-clocked 6950. The 5770 and 6850 trail far behind.


DX10: Crysis Warhead

Crysis Warhead is an expansion pack based on the original Crysis video game. It's set in the future, where an ancient alien spacecraft has been discovered beneath the Earth on an island east of the Philippines. Crysis Warhead uses a refined version of the CryENGINE2 graphics engine and, like Crysis, renders through the Microsoft Direct3D 10 (DirectX-10) API. In this test I use the 1.1 Warhead patch and its accompanying 64-bit game engine.

Benchmark Reviews uses the HOC Crysis Warhead benchmark tool to test and measure graphic performance using the Airfield 1 demo scene. This short test places a high amount of stress on a graphics card because of detailed terrain and textures, but also for the test settings used. Using the DirectX-10 test with High Quality settings, the Airfield 1 demo scene receives 4x anti-aliasing and 16x anisotropic filtering to create a heavy graphic load and separate the products according to their performance.

Only the most powerful graphics cards are expected to perform well in our Crysis Warhead benchmark tests. DirectX-11 extensions are not supported in Crysis: Warhead, and SSAO is not an available option.

crysis_warhead.png

It's impressive to see that we now have sub-$200 graphics cards that can return a playable frame rate in Crysis Warhead at 1920x1200. Crysis has always favored NVIDIA cards, and here the GTX560 equals or exceeds even the overclocked Radeon 6950.


DX11: 3DMark 11

FutureMark 3DMark11 is the latest addition to the 3DMark benchmark series built by FutureMark Corporation. 3DMark11 is a PC benchmark suite designed to test DirectX-11 graphics card performance without vendor preference. Although 3DMark11 includes the unbiased Bullet Open Source Physics Library instead of NVIDIA PhysX for the CPU/Physics tests, Benchmark Reviews concentrates on the four graphics-only tests in 3DMark11 and runs them with the medium-level 'Performance' presets.

The 'Performance' level setting applies 1x multi-sample anti-aliasing and trilinear texture filtering to a 1280x720p resolution. The tessellation detail, when called upon by a test, is preset to level 5, with a maximum tessellation factor of 10. The shadow map size is limited to 5 and the shadow cascade count is set to 4, while the surface shadow sample count is at the maximum value of 16. Ambient occlusion is enabled, and preset to a quality level of 5.

3DMark 11's four graphics tests take the user through two underwater and two jungle scenarios. The underwater scenes (GT1 and GT2) do not use tessellation, but the jungle scenes (GT3 and GT4) do.

3dmark11.png

The AMD Radeons jump back into the lead on this test, and the overall ranking of these cards is similar to what we saw in the 3DMark Vantage tests. The Radeon HD6850 comes very close to the scores of the GTX 560, while the HD6950 dominates even when not overclocked.

3dmark11_2.png

The card rankings remain the same as in the first two tests, with the PowerColor Radeon PCS+ HD6950 continuing to dominate.


DX11: Aliens vs. Predator

Aliens vs. Predator is a science fiction first-person shooter video game, developed by Rebellion, and published by Sega for Microsoft Windows, Sony PlayStation 3, and Microsoft Xbox 360. Aliens vs. Predator utilizes Rebellion's proprietary Asura game engine, which had previously found its way into Call of Duty: World at War and Rogue Warrior. The self-contained benchmark tool is used for our DirectX-11 tests, which push the Asura game engine to its limit.

In our benchmark tests, Aliens vs. Predator was configured to use high quality settings with 4x AA and 16x AF, along with tessellation, advanced shadows, and Screen Space Ambient Occlusion (SSAO).

alienspredator.png

This is a fairly tough test, tougher even than Crysis Warhead, judging from the frame rates. Of course Warhead, being a DX10 game, doesn't have tessellation or SSAO. The HD6950 continues to dominate here, with even the non-overclocked configuration beating the scores of the GTX570.


DX11: Unigine Heaven 2.5

The Unigine Heaven 2.5 benchmark is a free publicly available tool that beautifully demonstrates the DirectX 11 graphics capabilities in Windows 7 or updated Vista Operating Systems. Set among an enchanting village built on floating islands connected by swaying rope bridges, Heaven uses advanced lighting, tessellation, smoke and particle effects, distance blur, and a host of other graphical techniques to render a photo-realistic experience. An interactive mode allows for manual exploration of the village. Of course, all of this requires considerable graphics muscle, and I've found Heaven to be an excellent tool for testing the stability of graphics card overclocks.

The distinguishing feature of the Unigine Heaven benchmark is its hardware tessellation: a scalable technique that automatically subdivides polygons into smaller and finer pieces, letting developers add geometric detail to their games at very little performance cost. Thanks to this procedure, the rendered image approaches photo-realism: the virtual reality looks real. The "Heaven" benchmark provides the following key features:

  • Native support of OpenGL, DirectX 9, DirectX-10 and DirectX-11
  • Comprehensive use of tessellation technology
  • Advanced SSAO (screen-space ambient occlusion)
  • Volumetric cumulonimbus clouds generated by a physically accurate algorithm
  • Dynamic simulation of changing environment with high physical fidelity
  • Interactive experience with fly/walk-through modes
  • ATI Eyefinity support

heaven.png

NVIDIA pulls back into the lead here, with the GTX560 just matching the stock-clocked HD6950, and the GTX570 posting scores about 12-13% better than the overclocked HD6950.


DX11: Battlefield Bad Company 2

The Battlefield franchise has been known to demand a lot from PC graphics hardware. DICE (Digital Illusions CE) has incorporated their Frostbite-1.5 game engine with the Destruction-2.0 feature set in Battlefield: Bad Company 2. The game features destructible environments using Frostbite Destruction-2.0, and adds gravitational bullet drop effects for projectiles shot from weapons at long distance. The Frostbite-1.5 game engine consists of DirectX 10 primary graphics, with improved performance and softened dynamic shadows added for DirectX 11 users.

At the time Battlefield: Bad Company 2 was published, DICE was also working on the Frostbite-2.0 game engine. This upcoming engine will include native support for DirectX 10.1 and DirectX 11, as well as parallelized processing support for 2-8 parallel threads. This will improve performance for users with an Intel Core-i7 processor. Unfortunately, the Extreme Edition Intel Core i7-980X six-core CPU with twelve threads will not see full utilization.

In our benchmark tests of Battlefield: Bad Company 2, the first three minutes of action in the single-player raft night scene are captured with FRAPS. Relative to the online multiplayer action, these frame rate results are nearly identical to daytime maps with the same video settings. The Frostbite-1.5 game engine in Battlefield: Bad Company 2 appears to equalize our test set of video cards, and despite AMD's sponsorship of the game it still plays well using any brand of graphics card.

bad_company.png

At 1920x1200, the 6950, 5870, and GTX560Ti all come within a few percentage points of each other. The GTX570 dominates, with scores 10fps better than the overclocked 6950 at 1920x1200, and 15fps better at 1680x1050.


PCS+ HD6950 Temperatures

Adding an enhanced cooling solution is the first change any vendor will make to a reference video card. By their nature, reference coolers are designed to keep the GPU at a temperature that NVIDIA and AMD deem safe at reasonable ambient temperatures. With noise and cost as major considerations for the stock cooler, these "safe" temperatures often leave little headroom for overclocking.

FurMark is the application to use if you want to drive video card temperatures as high as possible. Still, this is complicated by the extra controls AMD's built into its latest generation GPUs and drivers, which will automatically throttle themselves down if power draw or temperature exceed certain limits. In this respect they're similar to Intel's Sandy Bridge processors, which will also aggressively throttle themselves to prevent damage.

AMD Overdrive, an overclocking feature built into AMD's Catalyst Control Center, has a "Power Control Settings" slider you can adjust to allow up to 20% extra power draw by the GPU, and I adjusted this control to the maximum 20% for temperature, power, and overclocking. I let FurMark run until the reported temperature was stable for 5 minutes, and recorded the load temperatures with the fans on "Auto" as well as 100% speeds.

| Video Card | Ambient | Idle | Load | Load (100% fans) |
|---|---|---|---|---|
| PowerColor Radeon PCS+ HD6950 | 20°C | 34°C | 67°C | 54°C |

When I removed the cooler from the card for photography (after this test), two things concerned me: the mounting system didn't seem to apply much pressure to clamp the heat sink to the GPU, and the amount of thermal paste was excessive. But obviously neither factor mattered much, because this cooler keeps the card really, really cool. These temperatures are 1°C warmer under load (auto fans) and 3°C cooler under load at 100% fans than those of the similar cooler on the PowerColor PCS++ HD6950 card Benchmark Reviews tested previously.

The automatic fan control kept the fans at about 28-30% speed under load, at which speed they were almost inaudible. Manually ramping the fans up to full speed decreases temperatures dramatically, but at a significant sonic cost. I don't see any reason to take the fans off auto control when running at stock speeds, since the card simply doesn't get that hot, even under heavy loads.

The overclocked temperatures I recorded with FurMark were identical to the stock temperatures, as were the frame rates reported by the utility. This is because AMD's drivers detect when certain stress-testing applications like FurMark are running, and throttle the card accordingly. I did see load temperatures up to 74°C (auto fans) during some gaming benchmarks, so FurMark is definitely being throttled here. With older versions of AMD's drivers, simply renaming the FurMark executable was enough to defeat this detection, but that no longer works with current drivers from either AMD or NVIDIA, which will limit FurMark's continued usefulness as a stress-testing tool. Both companies claim that utilities like FurMark are unrealistic, since no real-world use will ever apply the continuous loads these test utilities do, and that detecting them prevents possible damage to their video cards.

VGA Power Consumption

Your video card under load can easily consume more power than the rest of your system put together (depending, of course, on your CPU and how hard it's working). While even the mighty Intel Core i7-980X pulls a maximum of 130 watts, a mid-range card like the PowerColor PCS+ Radeon HD6950 Vortex II Edition can easily use more than that, which is why AMD recommends a minimum 500 watt power supply when using just one of these cards. The days when you could build an enthusiast system with a 350 watt power supply are long gone.

But your computer probably spends most of its time at less than maximum load, and manufacturers are getting better at creating CPUs and GPUs that dramatically reduce their power consumption when idle or lightly loaded. Granted, giant dual-GPU cards like the Radeon HD6990 can slurp up hundreds of watts, but they make up about as large a percentage of the video card market as Hummer H1s do of the automotive market.

To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4460) power meter made by P3 International.

A baseline test is taken without a video card installed inside our test computer system, which is allowed to boot into Windows 7 and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen. Our final loaded power consumption reading is taken with the video card running a stress test using FurMark. Below is a chart with the isolated video card power consumption (not system total) displayed in Watts for each specified test product.

Video Card Power Consumption by Benchmark Reviews

VGA Product Description (sorted by combined total power) - Idle Power - Loaded Power

NVIDIA GeForce GTX 480 SLI Set
82 W
655 W
NVIDIA GeForce GTX 590 Reference Design
53 W
396 W
ATI Radeon HD 4870 X2 Reference Design
100 W
320 W
AMD Radeon HD 6990 Reference Design
46 W
350 W
NVIDIA GeForce GTX 295 Reference Design
74 W
302 W
ASUS GeForce GTX 480 Reference Design
39 W
315 W
ATI Radeon HD 5970 Reference Design
48 W
299 W
NVIDIA GeForce GTX 690 Reference Design
25 W
321 W
ATI Radeon HD 4850 CrossFireX Set
123 W
210 W
ATI Radeon HD 4890 Reference Design
65 W
268 W
AMD Radeon HD 7970 Reference Design
21 W
311 W
NVIDIA GeForce GTX 470 Reference Design
42 W
278 W
NVIDIA GeForce GTX 580 Reference Design
31 W
246 W
NVIDIA GeForce GTX 570 Reference Design
31 W
241 W
ATI Radeon HD 5870 Reference Design
25 W
240 W
ATI Radeon HD 6970 Reference Design
24 W
233 W
NVIDIA GeForce GTX 465 Reference Design
36 W
219 W
NVIDIA GeForce GTX 680 Reference Design
14 W
243 W
Sapphire Radeon HD 4850 X2 11139-00-40R
73 W
180 W
NVIDIA GeForce 9800 GX2 Reference Design
85 W
186 W
NVIDIA GeForce GTX 780 Reference Design
10 W
275 W
NVIDIA GeForce GTX 770 Reference Design
9 W
256 W
NVIDIA GeForce GTX 280 Reference Design
35 W
225 W
NVIDIA GeForce GTX 260 (216) Reference Design
42 W
203 W
ATI Radeon HD 4870 Reference Design
58 W
166 W
NVIDIA GeForce GTX 560 Ti Reference Design
17 W
199 W
NVIDIA GeForce GTX 460 Reference Design
18 W
167 W
AMD Radeon HD 6870 Reference Design
20 W
162 W
NVIDIA GeForce GTX 670 Reference Design
14 W
167 W
ATI Radeon HD 5850 Reference Design
24 W
157 W
NVIDIA GeForce GTX 650 Ti BOOST Reference Design
8 W
164 W
AMD Radeon HD 6850 Reference Design
20 W
139 W
NVIDIA GeForce 8800 GT Reference Design
31 W
133 W
ATI Radeon HD 4770 RV740 GDDR5 Reference Design
37 W
120 W
ATI Radeon HD 5770 Reference Design
16 W
122 W
NVIDIA GeForce GTS 450 Reference Design
22 W
115 W
NVIDIA GeForce GTX 650 Ti Reference Design
12 W
112 W
ATI Radeon HD 4670 Reference Design
9 W
70 W
* Results are accurate to within +/- 5W.

The Z68 test system idled at 51 watts with no video card installed. The PowerColor PCS+ Radeon HD6950 Vortex II Edition card pulled 26 watts at idle (77-51) and 184 watts under FurMark load (235-51) for a combined total of 210 watts. This slots in just under the NVIDIA GTX560Ti reference design, which posts lower idle power consumption but higher load power consumption. These are very good results for performance at this level.
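The subtraction behind those figures can be sketched in a few lines of Python. The 51 W baseline and the 77 W / 235 W wall readings are the numbers reported above; the helper function name is just for illustration:

```python
# Illustrative sketch of Benchmark Reviews' isolated-power arithmetic:
# subtract the system's no-card baseline from the wall readings taken
# with the card installed.

BASELINE_W = 51      # Z68 test system, no video card installed

def isolated_power(reading_w: int) -> int:
    """Wall reading minus baseline = power drawn by the card alone."""
    return reading_w - BASELINE_W

idle = isolated_power(77)     # 26 W at the Windows login screen
load = isolated_power(235)    # 184 W under FurMark
combined = idle + load        # 210 W, the chart's sort key

print(idle, load, combined)   # 26 184 210
```

The combined 210 W total is what places the card just under the GTX 560 Ti's 216 W (17 W idle + 199 W loaded) in the sorted chart above.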

Vortex II Edition Overclocking

With its enhanced power circuitry and excellent cooling, one would expect PowerColor's Radeon HD6950 video card to overclock very well...and so it does. But remember that nothing's ever guaranteed with overclocking; as the saying goes, your mileage may vary. Also remember that external factors, such as your system's power supply, can affect overclocking results as well.

With these caveats out of the way, let's look at our results:

powercolor_radeon_6950_overclocked.jpg

With 930MHz core clocks and 1370MHz memory clocks, we're a substantial 16% faster on the GPU and 10% faster on the memory clocks than a reference HD6950 (but remember this card comes pre-overclocked from the factory). These might not seem like much compared to the 40+% overclocks routinely seen on Sandy Bridge "K"-series CPUs, but they're excellent in graphics card terms, especially with only air cooling. I did have to keep the fans pegged at 100% speed to successfully complete benchmarking at these speeds, though.

Here's a breakdown of the performance boost provided by this overclock at 1920x1200. Results are rounded to the nearest 0.1 FPS.

Benchmark Stock FPS OC FPS % improvement
Vantage Jane Nash 33.5 36.4 8.7
Vantage New Calico 24.9 27.6 10.8
Crysis Warhead 44 45 2.3
3DMark11 GT1 20.4 22.3 9.3
3DMark11 GT2 24.6 26.7 8.5
3DMark11 GT3 29.4 31.8 8.2
3DMark11 GT4 14.6 16.0 9.6
Aliens vs. Predator 38.3 41.4 8.0
Unigine Heaven 30.1 32.6 9.0
Bad Company 2 69.9 74.8 7.0
Average Improvement 8.1

The 8.1% average performance improvement closely tracks the 9.4% increase in GPU clock speed over the card's factory 850MHz clock.
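As a sanity check on that scaling claim, here is a small Python sketch that recomputes the per-benchmark gains and their average from the clocks and frame rates quoted above (no new data, just the review's own numbers):

```python
# Recompute the overclocking math from the tables above.

stock_clock, oc_clock = 850, 930                   # MHz: factory clock vs. our OC
clock_gain = (oc_clock / stock_clock - 1) * 100    # ~9.4%

results = {                                        # benchmark: (stock FPS, OC FPS)
    "Vantage Jane Nash":   (33.5, 36.4),
    "Vantage New Calico":  (24.9, 27.6),
    "Crysis Warhead":      (44.0, 45.0),
    "3DMark11 GT1":        (20.4, 22.3),
    "3DMark11 GT2":        (24.6, 26.7),
    "3DMark11 GT3":        (29.4, 31.8),
    "3DMark11 GT4":        (14.6, 16.0),
    "Aliens vs. Predator": (38.3, 41.4),
    "Unigine Heaven":      (30.1, 32.6),
    "Bad Company 2":       (69.9, 74.8),
}

gains = [(oc / stock - 1) * 100 for stock, oc in results.values()]
average = sum(gains) / len(gains)                  # ~8.1%

print(f"clock gain {clock_gain:.1f}%, average FPS gain {average:.1f}%")
```

That the average FPS gain lands just below the clock gain is expected: a GPU-limited workload scales at most linearly with core clock, and memory bandwidth grew by a smaller 10%.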

AMD Radeon HD6950 Final Thoughts

AMD's card naming scheme has confused the marketplace: the Barts-GPU 68xx cards actually provided slightly less performance than the older 58xx series cards, although at a similar or better price-performance ratio. The Cayman GPUs used in the 69xx series, though, are AMD's best-performing GPUs, powering various iterations of the Radeon HD6950 and HD6970 cards as well as the monster dual-GPU HD6990.

That said, the performance increment of the new GPUs is still a little disappointing. The two-year-old Radeon HD5870 provides pretty much the same performance as the HD6950, which wins only in a few benchmarks where its 2GB frame buffer can be put to use. However, where it's still available, the HD5870 costs anywhere from $50 to $150 more than the HD6950, so the win here is on price-performance rather than sheer graphics muscle.

powercolor_radeon_6950_34.jpg

The HD6950's obvious competitors are NVIDIA's GTX560Ti and GTX570. The HD6950 equals or exceeds the performance of the GTX560Ti in all of the benchmarks, but beats the GTX570 in only 5 out of 10. Newegg's prices on the GTX560Ti vary from $250 to $280, and their prices on the GTX570 range from $350 to $380. Radeon HD6950 2GB prices start at $260.00, and while I wasn't able to find the "bare" PowerColor PCS+ Radeon HD6950 Vortex II Edition card, the "Call of Duty" version (which includes the game with the card) is available for $315.00. Even this price competes very well with the NVIDIA cards, though.

AMD and NVIDIA's main differences these days come down to how they handle multi-GPU setups, and NVIDIA's PhysX feature, which is used in an increasing number of games. AMD's CrossFireX offers the user more options than NVIDIA's SLI: while you must have identical NVIDIA cards to use SLI, you can mix and match AMD cards from the same family (for example, an HD6950 and an HD6970). Also, SLI has not been an option on recent AMD systems, since NVIDIA had not licensed the technology for AMD chipsets (the same issue affects some P67-based motherboards; if you plan to run SLI, be sure to check your motherboard's compatibility first). However, NVIDIA has now licensed SLI for AMD's upcoming 900 series chipsets, so AMD users will soon have that option as well.

I like graphics cards in this price/performance range because they offer enough performance for most applications, and you can easily add another card in CrossFireX or NVIDIA SLI if you need more performance. Right now the Radeon HD6950 offers a price/performance ratio that NVIDIA doesn't have a good match for.

My one real complaint is that PowerColor has too many versions of this card. Admittedly, this is a common problem among vendors, who like to slice the market into $10-$15 segments. With seven different video card SKUs based on the Radeon HD6950, picking the right one isn't always obvious. However, based on my experience with this card, I can safely say it would be hard to go wrong with it.

AX6950 2GBD5-PPV Conclusion

IMPORTANT: Although the rating and final score mentioned in this conclusion are made to be as objective as possible, please be advised that every author perceives these factors differently at various points in time. While we each do our best to ensure that all aspects of the product are considered, there are often times unforeseen market conditions and manufacturer changes which occur after publication that could render our rating obsolete. Please do not base any purchase solely on our conclusion, as it represents our product rating specifically for the product tested, which may differ from future versions. Benchmark Reviews begins our conclusion with a short summary for each of the areas that we rate.

The performance of this card was excellent, especially with the high overclock I was able to achieve. The combination of excellent stock and overclocked performance combined with a quiet yet very effective cooling solution is a real winner. The one disappointment was that I wasn't able to unlock the additional GPU shaders to turn the card into a virtual 6970.

Red and black seem to be the "performance colors" in favor today, and PowerColor's hardly the only company to use them; look at ASUS' "Republic of Gamers" motherboards, for example. The very flashy fan shroud with its protruding, gleaming heat pipes is a design that would be appreciated by every 13-year-old boy who ever sketched rocket cars in study hall. Even when installed in a windowed case, though, the fan shroud will be pointing down, leaving the heat pipes as the main visual element.

Construction quality on a video card is hard to assess (at least for me). I look for things like excess solder splashes, poor masking, and uniformity of component placement, especially on custom PCBs like this one. Everything looks good, even under high magnification, and little touches like the plastic port plugs that keep dust off unused ports and connectors add to the impression of quality. One could argue that the cooler looks a little flashy and cheap, but a check under the plastic shroud reveals a solid and well-designed cooling solution. I do worry about the temperatures on the memory chips directly above the GPU, since they seem to lie in a "dead zone" as far as airflow is concerned. PowerColor uses thermal tape to couple these chips to the heat sink on the PCS++ version of this card, and I'm surprised they didn't bother to do so here...how much does 2" of thermal tape cost?

The Radeon HD6950 brings a lot of functionality and performance to the table. The ability to support four monitors, UVD video playback acceleration, and HD3D, among other things, all add to this card's appeal (even if AMD's 3D implementation isn't quite as seamless as NVIDIA's). The lack of a PhysX implementation would seem to be mainly a political issue rather than a functional one, since NVIDIA claims that PhysX is an "open standard" that anyone can implement, and in fact offered to work with ATI (now AMD) on a Radeon implementation back in 2009. Over the past couple of years, PhysX has grown from a "meh" feature used to generate more elaborate explosive debris into a major feature in many games, and is now probably the biggest argument for going with NVIDIA over AMD.

The one functional problem this card has is related to its cooler: although it provides excellent cooling performance, its size makes this a 3-slot card, even with the fans retracted. That's a real disadvantage, especially if you're considering a CrossFireX setup now or in the future. Extending the fans into the "Vortex" configuration doesn't offer improved cooling performance.

Value is a little harder to assess since I wasn't able to find this exact card for sale, but the version packaged with the Call of Duty game is $315.00 at Newegg. This price, while presumably higher than that of the card without the game, still compares very well with the roughly-equivalent NVIDIA GTX570. This card represents a very good value.

The PowerColor PCS+ Radeon HD6950 Vortex II Edition is an excellent card that would make any single-monitor gamer happy. Just watch out for that slot spacing.

Pros:

+ Excellent cooling performance with low noise
+ Good overclockability
+ Ability to run up to four monitors
+ Excellent game performance
+ Low idle power

Cons:

- Triple-slot cooler
- "Vortex cooling" doesn't offer any cooling advantage
- No PhysX
- Couldn't unlock extra shaders

Ratings:

  • Performance: 9.5
  • Appearance: 8.5
  • Construction: 9.25
  • Functionality: 8.0
  • Value: 9.0

Final Score: 8.85 out of 10.

Quality Recognition: Benchmark Reviews Silver Tachometer Award.

Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.


Comments 

 
# unlocking - RealNeil 2011-05-23 08:32
You mentioned in the CONS section that you Couldn't unlock the extra shaders on this card.
Are there still new Radeon HD6950 cards out there that will unlock? I recall reading somewhere that a newer upcoming design would prevent that.
 
 
# RE: PowerColor PCS+ HD6950 Vortex II - David Ramsey 2011-05-23 09:07
I'm sure there are, Neil, but I couldn't tell you which specific ones are known to unlock. Even if a specific model from a specific vendor has unlocked in the past, there's no guarantee current stock will do this.
 
 
# RE: PowerColor PCS+ HD6950 Vortex II - Rory 2011-05-24 04:54
Great review thanks, and a small suggestion if I may...
We see a lot of reviews of these aftermarket OC editions on BMR.com (like the powercolor Vortex range and MSi with their Hawk and Lightning variants etc) But they are always pitted against the stock standard cards - or at best older and inferior cards in a crossfire/sli config.
This is useful of course, but the enthusiast in me would love to see a battle royale pitting the best OC variants from Powercolor, MSi, Gigabyte and co. 'Head to Head' to see who makes the best variant of the HD6950 &/or the GTX560ti
Just a thought :)
 
 
# RE: RE: PowerColor PCS+ HD6950 Vortex II - Olin Coles 2011-05-24 08:29
Those stock-clocked products serve as a baseline, since there are numerous overclock models that add 1-10% more speed. It would be very difficult to compare all of the models, which is why we include our own overclock results to show how much of an impact overclocking has on FPS performance.
 
 
# Voltage setting for this OC. - Rock_n_Rolla 2011-05-25 02:15
Just want to ask, since this Powercolor 6950 Vortex II has a very nice cooling system capable of cooling the card set on high settings like you did, can we know how many millivolts you added based from its stock voltage? And, How long you tested this clock setting based on the applications you used in order for you to get the percentage???
Also, as you perform your tests, have you experience any hangs or crash?

A friend of mine is serious about buying this PC6850 2gig Vortex 2 card
although he's not into researching much about it on the internet for its real potential, perhaps this might be a good time to ask you so
i could tell him about the settings you used so he could used it on his
when he bought the card. And, if possible if you have tested the card
set to a much higher settings than the one you posted (930mhz and 1370mhz on memory) could you post it here including the voltage settings
and the performance it gained in percentage based from the stock. Thanks and God bless!
 
 
# RE: Voltage setting for this OC. - David Ramsey 2011-05-25 09:09
Actually, I didn't have to bump the voltage at all, although (as noted in the review) I did use AMD Overdrive to increase the maximum permitted current draw on the card.

I did not have any hangs or crashes with the overclocked settings I used as long as I kept the fans running full blast. If I left the fans on "Auto", the card would not complete most of the benchmarks when overclocked.

The overclock I achieved was exceptionally high for a 6950. I would not expect most examples to be able to overclock this well. Luck of the draw and all that.

The performance improvement using the overclocked settings is not only noted in each benchmark test, but summarized in a table at the end of the benchmarking section.
 
 
# Screw - laxo 2011-12-13 00:00
hi, do you happen to know what kind/type of screw (the 4 screws that connects the heat sink fan/cooler to the pcb) that this card has? my other one is loose, and may have loose its thread for some reason. the card itself is fine, dont want to RMA just because of one screw acting up, any inputs from anyone will be highly appreciated. thanks in advance.
 
 
# RE: Screw - David Ramsey 2011-12-13 07:57
No, sorry. I'd suggest removing one screw and taking it to a hardware or hobby store to see if they have something similar.
 
 
# RE: RE: Screw - laxo 2011-12-13 16:29
Hi David, thanks for the prompt response, will do just that but I bet I'll have a hard time looking for those types. anyways I tried removing one of the screws and guess what, the 2nd one chipped on me, the screw head broke in to two. I know how gentle you need to be on this tiny screws but the build quality is just annoying. you might want to put that in your review, to be careful with the screws, Thanks again.
 
 
# RE: RE: RE: Screw - David Ramsey 2011-12-13 16:45
That's annoying as hell...but I was able to remove and replace the cooler on my card without any problems...
 
 
# RE: RE: RE: RE: Screw - laxo 2011-12-13 17:05
As much as I would like to think that these screws are a bad batch, its just insane. anyways I tried to squeeze the chipped screw head with my thumb and my pointer finger and guess what it broke again, now I have a quarter of the screw head, for some reason, I don't know why this happened, the only problem that I had with this card is I had my temp idle on 60C and load at 98C on BF3 ultra settings. so I tried to investigate and got to the loose screw, and no replies from power color if I can have a set of screws, my temps are normal right now. I just need the screws.
 
 
# RE: RE: RE: RE: RE: Screw - David Ramsey 2011-12-13 17:13
I guarantee you those are not custom screws made for that specific card. Again, try a hobby shop or hardware store. Worse case you could get an entire new third party cooler...
 
 
# RE: RE: RE: RE: RE: RE: Screw - laxo 2011-12-13 17:28
will do just that, thanks again!
 

