ZOTAC GeForce 9800 GTX+ Zone Edition Video Card
Written by Olin Coles   
Monday, 01 September 2008

ZOTAC GeForce 9800 GTX+

It used to be that if you wanted to overclock your hardware, you ran the risk of voiding a warranty. Supposing you dared to accept that risk, your options were still limited by the cooling equipment available to you. So it stands to reason that ZOTAC is making huge strides with their ZONE series, which pairs a silent liquid-cooling system with a 55nm GeForce video card. The AMP! series is already a fan favorite because of the extreme speeds at which those cards are factory overclocked, and the ZONE series adds a dramatically improved water-cooling solution for the best performance with no worries about heat or noise. Benchmark Reviews tests the ZOTAC GeForce 9800 GTX+ ZONE Edition G92 video card against a comprehensive collection of competitors in this article.

Powered by the newly revamped 55nm NVIDIA G92 graphics processor, originally introduced with the GeForce 8800 GT series, the ZOTAC GeForce 9800 GTX+ ZONE Edition video card takes the GeForce family one step higher. The PCI Express 2.0 interface signals at 5.0 GT/s per lane, giving the x16 link up to 8 GB/s of bandwidth in each direction to feed the 512 MB video frame buffer for smoother performance and realistic textures in PC games. The 1100 MHz GDDR3 video memory (2200 MHz DDR) on the GTX+ communicates with the 740 MHz G92 graphics processor through a 256-bit memory interface. For an extra performance boost during intense gaming situations, NVIDIA has designed the GTX+ to offer 128 stream processors operating at 1836 MHz.
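As a sanity check on those memory numbers, the card's theoretical peak memory bandwidth follows directly from the clock, the double-data-rate signaling, and the bus width. The short Python sketch below works through the arithmetic; the formula is the standard textbook one, not a figure quoted from ZOTAC's spec sheet.

```python
# Peak memory bandwidth = memory clock x 2 (DDR signaling) x bus width in bytes
memory_clock_hz = 1100e6      # 1100 MHz GDDR3
ddr_multiplier = 2            # data on both clock edges -> the "2200 MHz DDR" rating
bus_width_bytes = 256 // 8    # 256-bit memory interface

bandwidth_gb_s = memory_clock_hz * ddr_multiplier * bus_width_bytes / 1e9
print(f"Theoretical peak bandwidth: {bandwidth_gb_s:.1f} GB/s")  # 70.4 GB/s
```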

Benchmark Reviews has already tested the Foxconn GeForce 9800 GTX OC Edition and ZOTAC GeForce 9800 GTX, so now the big question remains: what does a 55nm fabrication process really change? That's what you and I both want to know, and in a few more pages, we'll have our answer.

Zotac_9800-GTX+_Zone_Kit.jpg

Get in the gaming zone with the ZOTAC GeForce 9800 GTX+ ZONE Edition's stealthy water-cooling system. The ZOTAC GeForce 9800 GTX+ ZONE Edition operates quietly, leaving your ears to focus on the gaming environment, immersing you into a virtual world with outstanding visual realism and blazing-fast frame rates in the latest DirectX 10 and OpenGL 2.1 3D games and applications.

Second-generation PureVideo HD technology empowers the ZOTAC GeForce 9800 GTX+ ZONE Edition with high-definition Blu-ray playback capabilities, freeing up your system processor for less mundane tasks. PureVideo HD technology paired with the ZOTAC GeForce 9800 GTX+ ZONE Edition's stealthy operation provides you with a visually stunning video experience that is uninterrupted by system noise, letting your ears immerse in the surround sound effects in high-definition Blu-ray movies instead.

About the Company: ZOTAC International (MCO) Limited

ZOTAC International (MCO) Limited was established in 2006 with a mission to deliver superb-quality NVIDIA graphics solutions to the industry, with strong backing from its parent group, PC Partner Ltd. The company is headquartered in Hong Kong, with a factory in mainland China and regional sales offices in Europe, Asia Pacific, and North America. The support network ZOTAC provides is currently the largest of its kind in the world.

With 40 SMT lines, 6,000 workers, and 100,000 square meters of facilities, ZOTAC features a full array of state-of-the-art equipment and machinery. In addition, ZOTAC has over 130 R&D professionals in Hong Kong and China, plus warranty and service centers in strategic countries, enabling effective and efficient worldwide as well as localized sales and marketing support.

ZOTAC with NVIDIA not only means superb quality; it also means high performance, absolute reliability, and great value. In the past year, ZOTAC was compared and tested by several influential members of the media, and its products have proven to be high-quality graphics cards worth buying. With overclocked performance, excellent cooling properties, and unique packaging, ZOTAC products definitely exceed users' expectations.

ZOTAC_Logo_600px.jpg

ZOTAC's commitment to its users is to bring the latest products to market quickly and at the best value. Needless to say, ZOTAC is the right choice for those who require high-quality graphics solutions. For additional information please visit the ZOTAC website.

What's New In GTX+?

I'm sure this is the most anticipated question of this article, so I won't make you wait for the answer: die shrink. When NVIDIA originally launched the G92 graphics processor in the GeForce 8800 GT series, TSMC (their chip manufacturer) had just finished converting to the 65nm fabrication process. This was a notable improvement over the 80nm process of late 8th-generation graphics cards. Later on, it was anticipated that the NVIDIA GeForce GTX 280 video card would launch with a further-improved 55nm fabrication process. But this wasn't the case, and 65nm made a repeat showing.

So when TSMC finally polished off their production line for the new 55nm die size, there weren't any new products planned as part of a launch event. Thus the GTX+ was born. But that's not the whole story (okay, maybe it really is). Although PhysX existed before NVIDIA launched their 9th generation of GeForce products (beginning with the GeForce 9600 GT), not many people had much use for Ageia's product. Then NVIDIA acquired Ageia, and began their vested interest in PhysX.

What is PhysX?

The GPU is no longer just about games, and games are no longer just about graphics. Physics is the new frontier for gaming, and GPU computing is the new frontier for visual computing. Only NVIDIA GPUs offer the trinity of graphics, physics, and compute.

As the PC market evolves, so does PC gaming. Certain steps bring the experience to a whole new level: the first big thing for PC gaming was 3D hardware acceleration, followed by programmable shaders. The next big step is massive physics computation. NVIDIA PhysX is the ideal platform for game developers to enhance their games with dynamic, real-time physics, bringing the gaming experience to a new level. PhysX supports all the major gaming platforms on the market today and is the only engine for the PC that can make use of the massive parallel computation power of the GPU.

Physics is the next big thing in gaming. Without physics the world is static, indestructible, and lifeless. With physics, the world comes to life: walls can be torn down, trees bend in the wind, and water flows with body and force. It's such a fundamentally important subject that Ageia dedicated their whole company to high-fidelity physics, creating PhysX technology, the most popular physics API in the world, with over 140 shipping titles, more than 10,000 registered users, and support on the Sony PlayStation 3, Microsoft Xbox 360, Nintendo Wii, and the PC. Since acquiring Ageia, NVIDIA has been working hard to bring PhysX to the graphics processing unit (GPU). With the GTX+ launch, PhysX is getting a lot more attention.

Delivering physics in games is no easy task. PhysX technology is an extremely compute-intensive environment based on a unique set of physics algorithms that require tremendous amounts of simultaneous mathematical and logical calculations. This is where NVIDIA GeForce processors come in. With the NVIDIA GeForce 177.79 driver (WHQL candidate at the time of this writing), GeForce 8, 9, and GTX200 Series graphics processors become physics processors too. This new capability extends physics simulation beyond the limited capabilities of the CPU, enabling incredible performance scalability by leveraging the power of the many-core graphics processor.

PhysX is designed specifically for hardware acceleration by powerful processors with hundreds of processing cores. Powered by the tremendous parallel processing capability of the GPU, PhysX provides a dramatic increase in physics processing power, and takes gaming to a new level, delivering rich, immersive physical gaming environments with features such as:

physx.jpg

  • Explosions that create dust and collateral debris
  • Characters with complex, jointed geometries, for more life-like motion and interaction
  • Spectacular new weapons with incredible effects
  • Cloth that drapes and tears naturally
  • Dense smoke & fog that billow around objects in motion

More information on PhysX is available from the NVIDIA PhysX FAQ web page.

9800 GTX+ Features

Backed by NVIDIA's Lumenex Engine, the GeForce 9800 GTX+ delivers true 128-bit floating point high dynamic range (HDR) lighting capabilities with up to 16x full-screen anti-aliasing. Second-generation NVIDIA PureVideo HD technology with HDCP compliance delivers the ultimate high-definition video viewing experience on the NVIDIA GeForce 9800 GTX+ video card.

With hardware decoding for Blu-ray and HD DVD formats, PureVideo HD technology lowers CPU utilization when watching high-definition video formats by decoding the entire video stream in the graphics processor, freeing up the processor for other tasks. In addition to low CPU utilization, PureVideo HD enhances standard-definition video content with de-interlacing and other post-processing algorithms to ensure standard DVD movies look their best on PC screens and high-definition television sets. High-bandwidth digital content protection, or HDCP, technology ensures a secure connection between the graphics card and an HDCP-capable monitor for viewing protected content such as high-definition Blu-ray or HD DVD movies.

Coupled with PureVideo HD technology, the GeForce 9800 GTX+ delivers the ultimate multimedia experience. HDMI technology allows users to connect PCs to high-definition television sets with a single cable, delivering high-definition surround sound audio and video with resolutions up to 1080p. PureVideo HD technology scales video in the highest quality up to resolutions of 2560x1600 - from standard and high-definition file formats - while preserving the details of the original content. PureVideo HD technology also accelerates high-definition video decode, freeing up CPU cycles while watching high-definition Blu-ray and HD DVD movies or other VC-1 and H.264 encoded file formats.

NVIDIA Unified Architecture

purevideo.jpg

  • Unified shader architecture
  • GigaThread technology
  • Full support for Microsoft DirectX 10
  • Geometry shaders
  • Geometry instancing
  • Streamed output
  • Shader Model 4.0
  • Full 128-bit floating point precision through the entire rendering pipeline

NVIDIA Lumenex Engine

purevideo_hd_logos.jpg

  • 16x full screen anti-aliasing
  • Transparent multisampling and transparent super-sampling
  • 16x angle independent anisotropic filtering
  • 128-bit floating point high dynamic-range (HDR) lighting with anti-aliasing
  • 32-bit per component floating point texture filtering and blending
  • Advanced lossless compression algorithms for color, texture, and z-data
  • Support for normal map compression
  • Z-cull and Early-Z

NVIDIA Quantum Effects Technology

  • Advanced shader processors architecture for physics computation
  • Simulate and render physics effects on the graphics processor

NVIDIA Triple-SLI Technology

  • Patented hardware and software technology allows three GeForce-based graphics cards to run in parallel to scale performance and enhance image quality on today's top titles.

NVIDIA PureVideo HD Technology

with_purevideo.jpg

Along with world-class video acceleration, PureVideo HD has been at the forefront of advanced video post-processing. With the R174 series driver, NVIDIA introduced new PureVideo HD features for GeForce GTX 200-based products. These new features, Dynamic Contrast Enhancement and Dynamic Blue, Green, and Skin Tone Enhancements, are extremely computationally intensive and not found on even the most high-end Blu-ray or HD DVD players. By tapping into the enormous pool of computational power offered by the GPU's processor cores, NVIDIA can now enable post-processing techniques that have yet to be realized in fixed-function video processors.

  • Dedicated on-chip video processor
  • High-definition H.264, VC-1, MPEG2 and WMV9 decode acceleration
  • Advanced spatial-temporal de-interlacing
  • HDCP capable
  • Noise Reduction
  • Edge Enhancement
  • Bad Edit Correction
  • Inverse telecine (2:2 and 3:2 pull-down correction)
  • High-quality scaling
  • Video color correction
  • Microsoft Video Mixing Renderer (VMR) support

Advanced Display Functionality

  • Two dual-link DVI outputs for digital flat panel display resolutions up to 2560x1600
  • Dual integrated 400MHz RAMDACs for analog display resolutions up to and including 2048x1536 at 85Hz
  • Integrated HDTV encoder provides analog TV-output (Component/Composite/S-Video) up to 1080i resolution
  • NVIDIA nView multi-display technology capability
  • 10-bit display processing

Dynamic Color Enhancement

By analyzing the color components of each frame, the GPU can also isolate and improve the appearance of blue, green, and skin tones, to which the human eye is particularly sensitive. Unlike televisions, which have built-in image processors, PC monitors typically display the input picture without any processing, which can result in comparatively dull images. Dynamic blue, green, and skin tone enhancement alleviates this problem by applying correction curves to these sensitive colors. The result is improved tonal balance and clarity, without over-saturation.
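NVIDIA has not published the actual correction curves, so the following Python sketch is only a toy illustration of the general idea: a gentle curve that lifts the mid-tones of a single color channel while pinning black and white in place, which is what prevents over-saturation. The curve shape and strength value here are invented for illustration, not NVIDIA's algorithm.

```python
def lift_midtones(v, strength=0.15):
    """Toy correction curve for a normalized (0.0-1.0) color channel.

    Mid-tones are lifted the most, while 0.0 (black) and 1.0 (white)
    map to themselves, so the channel is enhanced without clipping.
    """
    return v + strength * v * (1.0 - v)

# Boost only the channels the eye is sensitive to, e.g. green:
r, g, b = 0.40, 0.55, 0.35
print(f"green: {g} -> {lift_midtones(g):.3f}")  # 0.55 -> 0.587
```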

Built for Microsoft Windows Vista

purevideo_ecosystem.jpg

  • Full DirectX 10 support
  • Dedicated graphics processor powers the new Windows Vista Aero 3D user interface
  • VMR-based video architecture

High Speed Interfaces

  • Designed for PCI Express x16
  • Designed for high-speed GDDR3 and DDR3 memory

Operating Systems

  • Built for Microsoft Windows Vista
  • Windows XP/Windows XP 64
  • Linux

API Support

  • Complete DirectX support, including Microsoft DirectX 10 Shader Model 4.0
  • Full OpenGL support, including OpenGL 2.0

NVIDIA Hybrid SLI

Benchmark Reviews learned of Hybrid SLI during our time with NVIDIA at the 2008 International CES. I thought that seeing the Stereoscopic 3D Gaming demonstration would be the highlight of NVIDIA's offerings, but the following morning they proved to have at least one more trick up their sleeve. At CES we were privileged to see Hybrid SLI make its formal debut.

NVIDIA announced the industry's first hybrid technology for PC platforms, Hybrid SLI™, which addresses two critical issues: increasing graphics performance and reducing power consumption. NVIDIA Hybrid SLI technology will be incorporated into a wide variety of desktop and notebook graphics and motherboard products that the company is rolling out for both AMD and Intel computing platforms throughout 2008.

"From the introduction of programmable GPU's to the rapid adoption of our multi-GPU SLI technology, NVIDIA has repeatedly pioneered and innovated to solve difficult problems for the industry. We believe Hybrid SLI technology is one of the most important innovations we've come up with to date," said Jen-Hsun Huang, CEO of NVIDIA. "Hybrid SLI delivers new multi-GPU technology to a large segment of the PC market, delivering consumers a level of PC graphics performance and power efficiency never before seen."

NVIDIA Announces Hybrid SLI Multi-GPU Technology

First disclosed in June 2007, NVIDIA Hybrid SLI technology is based on the Company's market-leading GeForce graphics processor units (GPUs) and SLI multi-GPU technology. Hybrid SLI enables NVIDIA motherboard GPUs (mGPUs) to work cooperatively with discrete NVIDIA GPUs (dGPUs) when paired in the same PC platform. Hybrid SLI provides two new technologies - GeForce Boost and HybridPower - that allow the PC to deliver graphics performance for today's applications and games when 3D graphics horsepower is required, or transition to a lower-powered operating state when not.

NVIDIA HybridPower

For lower energy consumption and quieter PC operation, HybridPower allows the PC to switch processing from a single GPU, or multiple GPUs in an SLI configuration, to the onboard motherboard GPU. HybridPower is most useful in situations where graphics horsepower is not required, such as high-definition movie playback on a notebook platform or simple e-mail and Internet browsing on a desktop. It is also beneficial for those users who want a quiet operating state with reduced thermals and noise. For notebooks, HybridPower can also dramatically extend battery life, by up to 3 hours. When a game or application that requires the additional 3D horsepower is started, the PC can automatically transition back to the discrete graphics card(s) and power up the 3D capabilities, all transparently to the end user.

In applications where 3D performance is required, GeForce Boost turbo-charges 3D operation by combining the processing power of the traditional NVIDIA GeForce-based graphics card with that of the second GPU integrated into the motherboard core logic. In media-rich applications, both GPUs work in tandem to render the combined images with the end user benefiting from the increase in performance and frame rate. For typical games and 3D applications, GeForce Boost can kick in automatically resulting in a greatly enhanced consumer experience.

Innovative Multi-GPU Technology Raises Performance, Reduces Power Consumption for PC Graphics

When coupled with a HybridPower-enabled motherboard, the ZOTAC GeForce 9800 GTX+ ZONE Edition video card can be powered down completely. For everyday computing and watching HD movies, the motherboard GPU is used and the 9800 GTX+ can be turned off, consuming no power at all. When an intensive 3D application is engaged, users can turn on the GeForce 9800 GTX+ for maximum performance. HybridPower works by sending the output of the discrete GPU through the output connector on the motherboard. This allows the system to use both GPUs as it sees fit without physically changing the connector.

Hybrid Multi-GPU Technology Raises Performance, Reduces Power Consumption for PC Graphics

NVIDIA is the recognized market leader in GPU desktop and notebook solutions for both Intel and AMD platforms, and has a full lineup of Hybrid SLI-capable graphics and motherboard products planned for 2008. New Hybrid SLI-capable products include the upcoming NVIDIA nForce 780a SLI, nForce 750a SLI, and nForce 730a media and communication processors (MCPs) for AMD CPUs, which will be released next month, as well as the new GeForce 8200, the industry's first micro-ATX motherboard solution with an onboard Microsoft DirectX 10-compliant motherboard GPU. NVIDIA Hybrid SLI notebooks as well as desktop products designed for Intel CPUs will be available next quarter. Look for Hybrid SLI to make its way into everything NVIDIA produces from this point forward.

9800 GTX+ Specifications

Coupled with PureVideo HD technology, the ZOTAC GeForce 9800 GTX+ ZONE Edition graphics card delivers an astounding multimedia experience. The GeForce 9800 GTX+ features two dual-link, HDCP-enabled DVI-I outputs for connection to analog and digital PC monitors and HDTVs, plus a 7-pin analog video-out port that supports S-Video directly, with composite and component (YPrPb) output via an included dongle.

  • ZOTAC ZT-98PES2P-WSP GeForce 9800 GTX+ 512MB Video Card
  • S/PDIF Digital audio cable included
  • HDMI resolutions: 480p/720p/1080i/1080p
  • PCI Express 2.0 interface
  • 128 Total stream processors
  • Dual card-slot active cooling solution
  • 512 MB Total GDDR3 memory
  • 256-bit memory interface
  • PureVideo HD technology with hardware decoding of high-definition video formats
  • Dual dual-link DVI - up to 2560x1600
  • DVI HDTV output: 480p/720p/1080i

Bus Support

GeForce_GTX-200_GPU_Silicon_Die.jpg

  • PCI Express 2.0
  • PCI Express x16 Backwards Compatible

3D Acceleration

  • Microsoft DirectX10 support
  • Unified Shader Model 4.0
  • OpenGL 2.1

Others

  • HDTV Ready
  • Vista Ready
  • SLI Ready
  • HDCP Ready
  • DVI Audio
  • Dual Link Dual DVI
  • RoHS Compliant

Dual-Stream Decode

Recently, studios have begun taking advantage of the additional space that high-definition media such as Blu-ray and HD DVD discs provide by adding dual-stream picture-in-picture functionality to movies. Often the PiP content is coupled with advanced BD-J (Java) or HDi (XML) features, so taking the processing burden off of the CPU is even more important for titles with these advanced features. The latest PureVideo HD engine supports dual-stream hardware acceleration, which takes the workload off of the CPU and gives it to the more powerful GPU.

G92 Graphics Processing Unit

  • NVIDIA GeForce 9800 GTX+ 740 MHz Graphics Engine
  • 128 Stream Processors
  • 1836 MHz Shader clock
  • 400 MHz RAMDAC
  • Maximum resolution: 2560x1600
  • True 128-bit floating point high dynamic-range (HDR) lighting with 16x full-screen anti-aliasing

Memory

  • 512MB GDDR3 vRAM
  • 1100 MHz memory clock (2200 MHz realized)
  • 256-bit memory bus
  • Mem Type: 16Mx32-1.0 GDDR3
  • Memory pieces: 8
  • Memory package: uBGA

HDCP over dual-link DVI allows video enthusiasts to enjoy high-definition movies on extreme high-resolution panels, such as the 30" Dell 3007WFP at 2560x1600, with no black borders. The GeForce 9800 GTX+ also provides native support for HDMI output, using a certified DVI-to-HDMI adapter in conjunction with the built-in S/PDIF audio connector.

Aero with HD DVD and Blu-ray Playback

Until now, users have been unable to take advantage of the Aero user interface in Windows Vista while playing HD video. When this was attempted, Vista would revert to a basic theme and Aero would be disabled.
PureVideo HD now supports HD movie playback in Aero mode. This creates a more seamless user experience by eliminating the pop-up message notifying the user that Vista has switched to basic mode, and Aero windows remain enabled in conjunction with HD movie playback.

With HDMI support, the GeForce 9800 GTX+-based graphics solution is among the fastest graphics cards available, and when paired with a 7-series NVIDIA nForce motherboard (such as the ASUS Striker II NSE nForce 790i SLI motherboard), it creates the latest in a line of powerful NVIDIA gaming platforms. Be blown away by scorching frame rates, true-to-life extreme HD gaming, and picture-perfect Blu-ray and HD DVD movies.

ZONE Edition First Look

It isn't very often that you'll hear me say something like "there isn't anything else like it in the world". In fact, I've been at this for over two years, and never once have I said that about a video card. For many years now the graphics card industry has changed only the skin but never the core. As a result, video cards have remained more or less the same. Not anymore: ZOTAC changes the landscape with their ZONE Edition liquid-cooling solutions.

The ZONE Edition is an idea made reality with a few critical changes to the reference design. First, ZOTAC takes the PCB of a GeForce 9800 GTX+ and removes the stock heatsink-and-fan cooler. Then they add their own cooling craftsmanship in the form of a full-length heatsink, water block, and pump. An aluminum radiator with an attached fan has inlet and outlet hoses tracing back to the GPU, forming a fully self-contained liquid-cooling system for the video card. In this section and the next, we'll cover each of the components in full detail.

Zotac_9800-GTX+_Zone_Package.jpg

The ZOTAC GeForce 9800 GTX+ ZT-98PES2P-WSP uses a dual-slot design that appears very similar in shape to NVIDIA's reference design, but adds a small radiator core and hoses to the unit. ZOTAC's 9800 GTX+ comes in one color: black. This little gem was tough to photograph, so don't be too upset with my images.

In the image below, the keen observer will notice two SLI connections. NVIDIA has designed the GeForce 9800 GTX+ to operate in a 3-way SLI configuration, which they have tested to be faster than a pair of GeForce 9800 GX2 cards in Quad SLI at certain applications and resolutions. The big question gamers and hardware enthusiasts will need to answer for themselves is whether their configuration will support this functionality in terms of power supply, case, and cooling.

Zotac_9800-GTX+_Zone_Kit_Top.jpg

The ZOTAC 9800 GTX+ ZONE Edition graphics card is a performance-optimized high-end card on every level. Power is taken from the PCI Express host bus as well as from two 6-pin PCI Express power connectors. Without any auxiliary power provided to the GeForce 9800 GTX+ graphics card, an LED on the bracket (lower-right corner in the image below) will shine red and the graphics card will not allow the system to boot (on most motherboards). In addition, any 6-pin connection port that is not adequately powered will also turn red. Together, this functionality offers immediate feedback for enthusiasts concerned about providing adequate power to the GPU. In the past, low/no auxiliary power situations sounded a piezo buzzer so loud that you could often mislocate the origin of the alarm.

GeForce_9800_GTX+_I-O_Panel.jpg

The HDMI functionality is a new direction for NVIDIA graphics cards, which helps extend their presence into HTPCs as well as desktops. ZOTAC has included everything you might need to get the GeForce 9800 GTX+ up and running, as well as connecting it to D-Sub or DVI monitors, component video for CRT monitors and older TVs, and HDMI HDTVs via the DVI-to-HDMI adapter. Although two power connection adapters are included, it's not a good idea to overload older power supplies that lack native PCI Express connection plugs.

ZT-98PES2P-WSP_Accessories.jpg

Because the HDMI audio functionality is controlled at the hardware level, there is no need for additional drivers or software. Much like the S/PDIF connection on the back of a motherboard, the video card's audio-out function is plug-and-play. The S/PDIF cable included with the kit connects between a small two-pin port on the power-connection side of the unit (near the green GeForce chevrons) and the HT Omega Claro Plus+ AD8620BR Op Amp sound card we used for testing. Your setup may be different, so the cable may connect between the 9800 GTX+ and the digital audio header on either your motherboard or sound card. Not all motherboards and sound cards support this option, so make sure it's available before you make your purchase. The 9800 GTX+, unlike previous-generation NVIDIA cards, is equipped with the PureVideo 2 engine for GPU-assisted decoding of the H.264 and VC-1 codecs.

ZT-98PES2P-WSP_Media_Bundle.jpg

In past ZOTAC AMP! Edition product reviews I have discovered several interesting game titles enclosed with the graphics card. ZOTAC is doing well to continue this new tradition, and includes the PC video game XIII Century with the ZONE 9800 GTX+. I'm not a huge fan of medieval RPGs, but it's always nice to have a free introduction to a genre I might enjoy.

Please continue on to the next section where Benchmark Reviews takes a detailed look at the ZOTAC ZT-98PES2P-WSP GeForce 9800 GTX+ ZONE Edition 512MB video card.

ZONE Detailed Features

In our last section, we skimmed over the outer skin of ZOTAC's new ZONE Edition GeForce 9800 GTX+. With a basic understanding of what you'll get on the outside, we're ready to get inside the product and dissect the technology. This information will be very helpful for those hardware enthusiasts and overclockers willing to void their warranty and potentially ruin this expensive product in order to tweak its electronics. This information is for entertainment purposes only, and is not a recommendation to disassemble your product or perform modifications.

ZOTAC uses the GeForce 9800 GTX+ as the reference for their ZONE Edition of the product. Although it may appear very similar in overall appearance, the ZONE Edition borrows little more than the PCB from NVIDIA's graphics card. Looking closely at the image below, I began to wonder if ZOTAC could have turned the GeForce 9800 GTX+ into a single-slot product. Perhaps the pump would be the biggest obstacle, because the remainder of the graphics card looks like it could easily shed some empty space.

After removing four flat-head screws that fastened a thin aluminum plate over the top of the unit, the 'inner' components were revealed.

Zotac_9800-GTX+_Zone_Side.jpg

From the two images above and below, you can see that the ZOTAC ZONE Edition is a well-engineered specimen and not just a novel concept. I especially like the way ZOTAC used a large heatsink across the entire PCB to cool memory components and electronics. Nevertheless, I find myself with a few small criticisms in regard to the design. To start, the short inter-connect hoses stand out from the GeForce 9800 GTX+ unit, which could have been avoided if the GPU water block and pump unit were positioned to face each other. It's also unfortunate that this heatsink is relegated to passive cooling, and doesn't benefit from the liquid-cooling solution.

Zotac_9800-GTX+_Zone_Top.jpg

Identical to the previous GeForce 9800 GTX, the GTX+ offers no cooling enhancements on the backside (topside when installed) of the PCB... and for good reason. Since the GTX+ moves to a 55nm fabrication process, the thermal envelope is much easier to control. I will admit that liquid cooling may not have been necessary on the GTX+, but cooler is better in the world of electronics.

GeForce_9800_GTX+_PCB_Bottom.jpg

Not to rehash an already old topic, but the GeForce 9800 GTX (G92) GPU was manufactured by TSMC using 65nm technology, employing a total of 754 million transistors to command 128 processor cores operating at 1688 MHz. The sole innovation attributed to the 9800 GTX+ series is the refined 55nm fabrication process. There are still 128 cores available, with each processor capable of being dynamically allocated to vertex, pixel, and geometry operations for the utmost efficiency in GPU resource allocation, and maximum flexibility in load-balancing shader programs. The newly minted GTX+ comes with a higher reference shader speed of 1836 MHz.

ZOTAC_GeForce_9800_GTX_Bare.jpg

Working alongside the processor cores are 64 texturing processors (eight texture processors per shader block), each capable of one addressing and filtering operation per clock. With a peak bilinear fill rate of 43.2 gigatexels per second, the G92 offers tremendous texturing performance for a GPU. The G92 chip features sixteen render back-end (ROP) units with full support for 128-bit high-dynamic-range rendering and NVIDIA's exclusive 16x Coverage Sampling Anti-Aliasing (CSAA) algorithm. The ROP compression system has also been enhanced to improve performance at extreme resolutions such as 2560x1600. The enhanced compression helps keep memory usage in check and improves performance in high-resolution, anti-aliased scenarios.
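That 43.2-gigatexel figure is simply the texture unit count multiplied by the core clock, assuming one bilinear-filtered texel per unit per clock as described above. Note that it corresponds to the original 9800 GTX's 675 MHz core; at this card's 740 MHz the same arithmetic yields roughly 47.4 gigatexels per second. A minimal sketch of the calculation:

```python
def bilinear_fill_rate_gt_s(texture_units, core_clock_mhz):
    # One bilinear-filtered texel per texture unit per core clock
    return texture_units * core_clock_mhz / 1000.0

print(bilinear_fill_rate_gt_s(64, 675))  # 43.2 GT/s (reference 9800 GTX clock)
print(bilinear_fill_rate_gt_s(64, 740))  # 47.36 GT/s (at the GTX+'s 740 MHz)
```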

754 million transistors are etched through a 55nm process, all living inside the small square G92 GPU. A 256-bit memory bus allows the GeForce 9800 GTX+ to offer 512 MB of GDDR3 to the G92 GPU, and gamers understand the importance of a fast video frame buffer. A total of eight Samsung GDDR3 modules line the outer perimeter of the printed circuit board, bearing the Samsung 807 K4J52324QE-BJ08 part number. Hardware enthusiasts should note that these same vRAM modules were also used on late-edition GeForce 8800 Ultras. Perhaps NVIDIA missed their opportunity to introduce GDDR5 with the GTX+ launch?

ZOTAC_GeForce_9800_GTX_GDDR3.jpg

This concludes our in-depth look into the ZOTAC ZT-98PES2P-WSP, which has revealed several interesting discoveries about the hardware and the assembly process. The 9800 GTX+ ZONE Edition is a good-looking graphics card, but from here on out this product will have to put up some impressive results or be put down. In our next section, Benchmark Reviews begins testing on the GeForce 9800 GTX+ video card, but only after we spend some time explaining how it's all done here in our lab.

Testing Methodology

At the start of all our testing, the previous display adapter driver is uninstalled and remnant files are removed using Driver Cleaner Pro. We then restart the computer system to establish our display settings and define the monitor. Once the hardware is prepared, we begin our testing. The synthetic benchmark tests in 3DMark06 utilize shader models 2.0 and 3.0. For higher-end VGA products we conduct tests at the following resolutions: 1280x1024 (standard 17-19" LCD), 1680x1050 (22-24" widescreen LCD), and 1920x1200 (24-28" widescreen LCD). In some tests we utilize widescreen monitor resolutions, since more users are adopting these products for their own computing.

ZOTAC_9800_GTX+_GPU-Z.png

Each benchmark test program begins after a system restart, and the very first result for every test is ignored, since it often serves only to cache the test data. This process proved extremely important in the World in Conflict and Supreme Commander benchmarks, as the first run served to cache maps, allowing subsequent tests to perform much better than the first. Each test is completed five times, with the average results displayed in our article.
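In script form, the bookkeeping described above amounts to discarding the cache-warming pass and averaging the remaining runs. The sketch below assumes one warm-up pass followed by the five scored runs; the sample numbers are hypothetical, not measured results.

```python
def average_fps(runs):
    """Drop the first (cache-warming) result and average the rest."""
    scored = runs[1:]                 # first run only primes caches and maps
    return sum(scored) / len(scored)

# One warm-up pass plus five scored runs (hypothetical values)
runs = [48.2, 61.0, 60.7, 61.3, 60.9, 61.1]
print(round(average_fps(runs)))       # -> 61, as charted
```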

Our site polls and statistics indicate that over 90% of our visitors use their PC for playing video games, and practically every one of you is using one of the screen resolutions mentioned above. Since all of the benchmarks we use for testing represent different game engine technologies and graphics rendering processes, I feel that this battery of tests provides a diverse range of results for you to gauge performance on your own computer system. Since most gamers and enthusiasts are still using Windows XP, we decided that DirectX 9 would be used for all tests until demand and software support improve for Windows Vista.

Test System

Benchmark Applications

  • 3DMark06 v1.1.0 (8x Anti Aliasing & 16x Anisotropic Filtering)
  • Call of Duty 4: Modern Warfare v1.7.568 (4x AA/16x Trilinear AF using FRAPS v2.9.4 Build 7037)
  • Crysis v1.21 Benchmark (High Settings, 0x and 4x Anti-Aliasing)
  • Unreal Tournament 3 v1.3 (High Quality, 16x Anisotropic Filtering using HOC Benchmark Tool v1.3)
  • World in Conflict v1.0.0.9 Performance Test (Very High Setting: 4x AA/4x AF)

Video Card Test Products

Product Series                               Stream Processors   Core Clock   Shader Clock   Memory Clock   Memory Amount   Memory Interface
ZOTAC GeForce 8800 GT AMP! ZT-88TES3P-FCP    112                 700 MHz      1700 MHz       1000 MHz       512 MB GDDR3    256-bit
ZOTAC GeForce 9800 GTX+ ZONE ZT-98PES2P-WSP  128                 740 MHz      1836 MHz       1100 MHz       512 MB GDDR3    256-bit
Sapphire Radeon HD 4850 102-B50102-00-AT     800                 625 MHz      N/A            900 MHz        512 MB GDDR3    256-bit
Sapphire Radeon HD 4870 102-B50701-10-AT     800                 750 MHz      N/A            900 MHz        512 MB GDDR5    256-bit
XFX GeForce GTX 260 GX-260N-ADDU             192                 640 MHz      1242 MHz       1150 MHz       896 MB GDDR3    448-bit

Now we're ready to begin testing video game performance on the GeForce 9800 GTX+, so please continue to the next page as we start with the 3DMark06 results.

3DMark06 Test Results

3DMark is a computer benchmark created by Futuremark (formerly named MadOnion) to determine the DirectX 9 gaming performance of graphics cards. 3DMark06 uses advanced real-time 3D game workloads to measure PC performance using a suite of DirectX 9 3D graphics tests, CPU tests, and 3D feature tests.

3DMark06 tests include all-new HDR/SM3.0 graphics tests, SM2.0 graphics tests, AI- and physics-driven single-core and multi-core CPU tests, and a collection of comprehensive feature tests to reliably measure next-generation gaming performance today. Some enthusiasts may note that Benchmark Reviews does not include the CPU-bound tests in our benchmark battery; only graphics-bound tests are included.

Here at Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each run, I believe 3DMark is a very reliable tool for comparing graphics cards against one another.

More visitors to Benchmark Reviews operate at 1280x1024 resolution than any other, as it represents the native resolution of 19" LCD monitors. Using this resolution as a starting point, the maximum settings were applied to 3DMark06, which for these tests include 8x anti-aliasing and 16x anisotropic filtering. Low-resolution testing allows the graphics processor to reach its maximum output performance, which shifts demand onto the system components to keep up. At lower resolutions, 3DMark reflects the GPU's top-end speed in the composite score, indicating full-throttle performance under light load. This makes for a less GPU-dependent test environment, and is helpful in measuring maximum output performance in the test results.

3DMark06_1280x1024.jpg

Right away our test results indicate that the Shader Model 2.0 benchmarks in 3DMark06 really prefer the GeForce products, while the HDR/Shader Model 3.0 tests favor the Radeon HD 4850 and 4870 video cards. Whenever I see this phenomenon, I immediately recall my analogy to automobiles. Think of the GeForce series as a car, capable of reaching a very high top speed so long as the load is light and the road is level. The Radeon series is more like a truck: ATI's cards can haul a load uphill at a much better pace than the competition. Part of me wants to believe this is a straightforward statement that will always hold true, but that part of me succumbs to knowing that games and synthetic benchmarks are not built the same.

In the SM 2.0 tests the ZOTAC GeForce 9800 GTX+ stays just ahead of the older AMP!ed version of the 8800 GT, and slightly further ahead of the Radeon HD 4850. However, moving into the more advanced HDR tests, the same Radeon HD 4850 that was trailing behind now takes a substantial lead.

3DMark06_1680x1050.jpg

At the widescreen resolution of 1680x1050, the relative standings are practically identical to all of our previous tests. Once again the Shader Model 2.0 tests put the GeForce 8800 GT and 9800 GTX+ ahead of the Sapphire Radeon HD 4850, with the GTX 260 well ahead of the Radeon HD 4870. This holds until the Shader Model 3.0 tests, where everything changes. The Sapphire Radeon HD 4870 suddenly runs circles around the GeForce competition in every 3DMark06 benchmark we run. The Sapphire Radeon HD 4850 surpasses the ZONE Edition ZOTAC GeForce 9800 GTX+, while the 4870 reverses the margin between itself and the GTX 260.

3DMark06_1920x1200.jpg

Finishing up the series of synthetic benchmark tests under heavy load, the results keep with the previous trend. In the Shader Model 2.0 tests the Radeon HD 4850 matches the AMP!ed GeForce 8800 GT, while the GeForce 9800 GTX+ enjoys a very minor lead. The Radeon HD 4870 keeps a respectable distance ahead of the others, creating an obvious separation step in the chart, but not enough to keep pace with the GTX 260. In the more modern and demanding Shader Model 3.0 tests everything changes... again. The Radeon HD 4870 pushes a healthy lead over the entire group, while the GTX 260 drops to a performance level barely ahead of the Radeon HD 4850. In its own regard, the Radeon HD 4850 manages to pull out a very good margin over the GeForce 9800 GTX+, which is barely ahead of the highly overclocked 8800 GT.

Two things were made very clear by the 3DMark06 benchmark tests: NVIDIA performs well in older technology tests, while the ATI Radeon series redefines the high end of graphics performance in newer technology tests.


Take the 3DMark06 tests at face value, because in our next section we begin real-world testing on a cadre of popular video games known for taxing the graphics processor, and the performance curve is expected to change. First up is Call of Duty 4, so please continue on...

Call of Duty 4 Benchmarks

Call of Duty 4: Modern Warfare runs on a proprietary game engine that Infinity Ward based on the tried-and-true Quake 3 structure. This engine offers features such as true world-dynamic lighting, HDR lighting effects, dynamic shadows, and depth of field. "Bullet penetration" is calculated by the Infinity Ward COD4 game engine, taking into account factors such as surface type and entity thickness. Certain objects, such as cars and some buildings, are destructible. This makes distinguishing cover from concealment important, as the meager protection provided by objects such as wooden fences and thin walls does not fully shield players from harm, as it does in many other games released in the same time period. Bullet speed and stopping power are decreased after penetrating an object, and this decrease is calculated realistically depending on the thickness and surface of the object penetrated.

This version of the game also makes use of a dynamic physics engine, a feature which was not implemented in previous Call of Duty titles for Windows PCs. The new in-game death animations are a combination of pre-set static animations and ragdoll physics. Infinity Ward's use of the well-debugged Quake 3 engine along with the new dynamic physics implementation allows Call of Duty 4 to be playable on a wide range of computer hardware. Performance scales from low-end graphics cards up to 4x anti-aliasing and 16x trilinear anisotropic texture filtering.

Before I discuss the results, I would like to take a moment to mention my general opinion on Fraps software when it comes to game performance benchmarking. If you're not familiar with the software, Fraps (derived from "frames per second") is a benchmarking, screen capture, and real-time video capture utility for DirectX and OpenGL applications. Some reviewers use this software to measure video game performance on their Windows system, as well as to record gaming footage. My opinion is that it offers a valid, unbiased third-party alternative to in-game benchmarking tools, but with one caveat: it's not perfect. Because the user must manually begin the test, the starting point may vary from run to run and therefore skew the results.

In my testing with Fraps v2.9.4 build 7039, I used the cut-scene intro to the coup d'état scene, when Al Asad takes over control. First I allowed the level to load and let the scene begin for a few moments; then I used the escape key to bring up the menu, chose the restart level option, and immediately pressed F11 to begin recording benchmark data. This scene is nearly four minutes long, but I configured Fraps to record the first 180 seconds of it to remain consistent. Once the scene ended, I repeated the restart process for a total of five tests. So within a 2-millisecond starting-point margin, all benchmark results are comparable, which is probably as good as it can possibly get with this tool.

COD4_FRAPS_Benchmark.jpg

In our frame rate results, all five of the test scores collected were within 0.5 FPS of one another. The average of the results was rounded to the nearest whole number for the chart you see above. Call of Duty 4 showed some small degree of difference in graphics performance at the lower resolution of 1280x1024, but it tapered off thereafter for both the 1680x1050 and 1920x1200 resolutions.

When the frame rate results are reviewed under close inspection, one of the most popular games of 2008 indicates that an overclocked GeForce 8800 GT is really not much different from the newest 9800 GTX+ that costs twice as much. Even when you compare the results of the Radeon HD 4850, which costs roughly the same, there isn't a compelling argument for the GTX+ series. Possibly making matters appear even more bleak for NVIDIA, the Radeon HD 4870 matches the performance of the slightly more expensive XFX GeForce GTX 260.

Call of Duty 4 put a reasonable amount of strain on the ZOTAC GeForce 9800 GTX+ ZONE Edition graphics card, and just because this high-end card appears to position itself at the low end of our chart doesn't mean it didn't perform well during our tests. Since the maximum anti-aliasing available in COD4 is 4x, the GeForce 9800 GTX+ won't have any problem playing this popular video game.


In our next section, we shall see if the performance-demanding video game Crysis will help NVIDIA prove the 9800 GTX+ is worth your consideration.

Crysis Benchmark Results

Crysis uses a new graphics engine: the CryENGINE2, which is the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX10) framework of Windows Vista, but can also run using DirectX9, both on Vista and Windows XP.

Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multicore processor architectures, CPU intensive subsystems of CryENGINE 2 such as physics, networking and sound, have been re-written to support multi-threading.

Crysis offers an in-game benchmark tool, which is similar to World in Conflict. This short test does place some high amounts of stress on a graphics card, since there are so many landscape features rendered. For benchmarking purposes, Crysis can mean trouble as it places a high demand on both GPU and CPU resources. Benchmark Reviews uses the Crysis Benchmark Tool by Mad Boris to test frame rates in batches, which allows the results of many tests to be averaged.

The very first thing we discovered in the low-resolution tests was how seemingly poor both of our multi-GPU products performed. The Gigabyte GeForce 9800 GX2 was the lowest of the group at 1280x1024, matched in performance by a single Sapphire Radeon HD 4850 or GeForce 9800 GTX. The CrossFireX set of 4850s suffered the same rapid-response overhead bottleneck and performed almost the same as a single Radeon 4870 or GTX 260. To be fair, none of these video cards will probably ever realistically see game-play at a resolution this low, so this performance illustrates how high-end GPU power can be cut short if the monitor (resolution) doesn't match it.

Low-resolution testing allows the graphics processor to reach its maximum output performance, which shifts demand onto the system components. At lower resolutions, Crysis reflects the GPU's top-end speed in the composite score, indicating full-throttle performance under light load. This makes for a less GPU-dependent test environment, and is helpful in creating a baseline for measuring maximum output performance in the next few test results. At the 1280x1024 resolution used by 19" monitors, our results show that performance is beginning to really drop despite the small difference in pixels drawn. In terms of general performance, all of these products maintain the same performance ratio as before, except for the 9800 GX2, which holds its ground.

Crysis_HQ_Benchmark_No-AA.jpg

Up to this point in our review, the GeForce 9800 GTX+ has trailed slightly behind the closest competition: ATI's Radeon HD 4850. Tested in Crysis with no post-processing effects added, these two video cards are neck and neck. Although it's good to know that one matches the other, it's not so impressive when a nearly two-year-old GeForce 8800 GT that costs half as much can practically match the performance of both newer and more expensive cards. I've said this before several times, and I will say it again: ZOTAC's GeForce 8800 GT AMP! Edition video card is a budget beast!

Reading the test results at 1920x1200 resolution using SOYO's DYLM26E6 monitor, Crysis forced 2.3 million pixels to be processed by our graphical test products. For our widescreen users, these benchmarks indicate that the NVIDIA GeForce 9800 GTX+ is practically the same as an ATI Radeon HD 4850; likewise the Radeon HD 4870 matches the performance of NVIDIA's GeForce GTX 260 video card, although the 4870 stops delivering post-processing effects at 8x AA while the GTX 260 can reach 32x AA (if the application supports it). Before we leave Crysis, though, I decided to include a look at post-processing performance with 4x AA enabled at the 1680x1050 and 1920x1200 widescreen resolutions. The chart below shows the average frame rate performance with 4x anti-aliasing enabled.

Crysis_HQ_Benchmark_4x-AA.jpg

At 1680x1050, the Radeon HD 4870 is no match for the GeForce GTX 260 it had outpaced thus far. Additionally, the AMP!ed GeForce 8800 GT crawls right up to the 9800 GTX+, which trails directly behind the slightly overclocked Sapphire Radeon HD 4850. But once the Honeywell 22-inch LCD was swapped out for something with a higher resolution, I began testing at 1920x1200 and the differences were made very clear.

At 1920x1200 the G92 graphics processor falls flat on its face. At a lowly 13 FPS, the ZOTAC GeForce 9800 GTX+ ZONE Edition video card delivers performance far from playable. The Radeon HD 4850, while managing a considerably better 20 FPS, still isn't anything to get excited about. Finally, at the high end of our chart the Radeon HD 4870 delivers a decent 25 FPS while the GTX 260 delivers 31 FPS.

With only a small dose of anti-aliasing added to Crysis, only the GTX 260 came close to playable frame rates. Our Island timedemo mixes in some beach and water views, so it's going to be on the high side of frame rates when compared to actual game play, but as you can see the Radeon products do very well when post-processing effects are added. Sapphire's hot-potato HD 4870 defended ATI's name as best it could, and didn't fail nearly as badly as the G92 GPU-based video cards. NVIDIA's G92-based GeForce 8800 GT and new 9800 GTX+ really stand out like sore thumbs against the newer graphics processors, which yield far better frame rates in our Crysis testing.


In our next section, Benchmark Reviews tests with Unreal Tournament 3. Read on to see how a blended high-demand GPU test with low video frame buffer demand will impact our test products.

Unreal Tournament 3

Unreal Tournament 3 (UT3) is a first-person shooter and online multiplayer video game by Epic Games and is the next installment of the Unreal series after Unreal Tournament 2004. It is published by Midway Games and was released in North America for Windows on November 19, 2007.

Unreal Tournament 3 is actually the fourth game in the Unreal Tournament series and the eighth Unreal game, but it has been numbered in terms of the engine it runs on. UT3 is subsequently part of the third generation, because it runs on the Unreal Engine 3, and does not reuse any content from previous versions.

Since Unreal Tournament 3 was designed as a DirectX 9 video game with no current support expected for DirectX 10, we use Windows XP Pro (Service Pack 3) for our benchmark testing. After completing tests on a wide range of products with settings at their highest, it appeared that Unreal Tournament 3 really didn't stress the video cards nearly as much as I would have liked.

Remember my analogy from the 3DMark06 results that compared NVIDIA and ATI to automobiles? It's coming back around to prove itself a worthy theory. Beginning at the low resolution of 1280x1024, the benchmark scores are so close (and so high) for some products that it might be time to eliminate this game from our testing process. Nevertheless, it looks like the Unreal Engine 3 doesn't care too much for the ATI Radeon HD 4850 or 4870 video cards. With all high-quality settings and tweaks enabled, along with 16x anisotropic filtering, Unreal Tournament 3 doesn't strain any of the graphics cards tested the way Crysis did.

As the resolution was raised, the once-level performance between the Radeon HD 4850 and 9800 GTX+ split apart. The ZOTAC GeForce 8800 GT AMP! Edition still trails directly behind the GTX+, which should be expected since little more than 10nm of fabrication process and a few MHz separate these two products. For now it appears that just about any graphics card can play Unreal Tournament 3 without issue, but quite frankly I don't know anyone who actually plays this game.

Unreal_Tournament_III_Benchmark.jpg

When I tested the Honeywell HWLM2216 recently, I noticed how little additional strain the 1680x1050 widescreen resolution of this 22" LCD monitor added over a 19" standard LCD monitor. Comparatively, 1680x1050 produces 1.76 MP while 1280x1024 produces 1.31 MP, so only a very small difference in performance is expected. The biggest difference is in the user experience, because the widescreen monitor comes in very handy for watching multimedia video or playing large world-scape video games.

At 1680x1050 resolution the differences were beginning to show, but only 1920x1200 is useful for illustrating how each product performs. Producing 2.3 MP on our 26" SOYO DYLM26E6 test monitor, each product is now separated far enough apart to sort out the winners and losers... relatively speaking. The Sapphire Radeon HD 4870 still trails well behind the GeForce GTX 260, but the Radeon HD 4850 no longer runs ahead of the GeForce 9800 GTX+. Because UT3 utilizes older Shader Model 2.0 technology, the high-speed, low-torque output familiar to GeForce video cards posts prevailing results over the high-torque output of the Radeon 4000 series.
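The pixel counts behind this reasoning are easy to verify; a quick sketch:

```python
def megapixels(width, height):
    return width * height / 1e6

for w, h in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h}: {megapixels(w, h):.2f} MP")
# 1280x1024: 1.31 MP | 1680x1050: 1.76 MP | 1920x1200: 2.30 MP
```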

Similar to our low-resolution tests, Unreal Tournament 3 appears to place a very minimal load on the high-end video cards we're testing. Thankfully, several new games are arriving to market in late 2008, so with some luck this benchmark will only be used for low-end graphics comparisons in the future. Perhaps Devil May Cry 4 will be a suitable replacement.


Our last benchmark of the series is coming next, which puts our collection of video cards against some very demanding graphics with World in Conflict.

World in Conflict Results

The latest version of Massive's proprietary Masstech engine utilizes DX10 technology, features advanced lighting and physics effects, and allows for a full 360-degree range of camera control. The Masstech engine scales down to accommodate a wide range of PC specifications; if you've played a modern PC game within the last two years, you'll be able to play World in Conflict.

World in Conflict's FPS-like control scheme and 360-degree camera make its action-strategy gameplay accessible to strategy fans and fans of other genres alike... if you love strategy, you'll love World in Conflict. If you've never played strategy, World in Conflict is the strategy game to try.

World in Conflict offers an in-game benchmark, which records the minimum, average, and maximum frame rates during the test. Very recently another hardware review website made the assertion that these tests are worthless, but we couldn't disagree more. When used to compare video cards that are dependent on the same driver and use the same GPU architecture, the in-game benchmark works very well and comparisons are apples-to-apples.
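For readers curious where those three numbers come from, a benchmark of this sort records the render time of every frame and reduces the list to statistics afterward. The Python sketch below illustrates the idea with hypothetical frame times; note that the average is computed as total frames over total time, not as a simple mean of per-frame rates:

    # Derive min/avg/max FPS from per-frame render times.
    # The frame times below are hypothetical illustration values.
    frame_times_ms = [18.2, 16.9, 22.4, 41.0, 17.5, 19.8]

    fps_per_frame = [1000.0 / t for t in frame_times_ms]
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

    print(f"minimum: {min(fps_per_frame):.1f} FPS")
    print(f"average: {avg_fps:.1f} FPS")
    print(f"maximum: {max(fps_per_frame):.1f} FPS")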

Based on the test results charted below, it's clear that WiC doesn't cap the maximum frame rate (a cap would conserve otherwise wasted power), which is good for full-spectrum benchmarks like ours but bad for electricity bills. The average frame rate is shown for each resolution in the chart below. At the lower 1280x1024 resolution most of our playing field sits around 50-60 FPS. The GeForce 8800 GT, 9800 GTX+, and Radeon HD 4850 all share similar performance, while the Radeon HD 4870 and GTX 260 also perform closely to one another.

World_in_Conflict_Benchmark.jpg

At 1680x1050 the Sapphire Radeon HD 4870 performs at 57 FPS while the GTX 260 holds at 59 FPS, and the others all float around the 45 FPS mark. The GeForce 9800 GTX+ finishes 1 FPS ahead of the Radeon HD 4850, leaving me with the feeling that, overall, each will prevail by a small margin in its best tests.

With a balanced demand for CPU and GPU power, World in Conflict only begins to place real demands on the graphics processor at the 1920x1200 resolution. I was expecting more results along the same lines I've seen so far, and that is pretty much exactly what I got. The performance decay had very little impact on the high-level video cards, the Radeon HD 4870 and GeForce GTX 260, which for all intents and purposes performed extremely well up to this point in our WiC testing.


In the next section we examine the ZONE Edition's operating temperatures, and after that we measure electrical power consumption to learn how well (or how poorly) each video card treats your utility bill...

GTX+ ZONE Temperatures

Unlike many of the video cards we've tested here at Benchmark Reviews, the ZOTAC GeForce 9800 GTX+ ZONE Edition is a very different animal, since the liquid cooling kept it well below the temperatures we're used to seeing. Normally I would go into great detail and illustrate where a video card heats up the integrated components with the use of a non-contact IR thermometer. But as you'll soon discover, there really aren't any "hot" spots on this graphics card.

The ambient room temperature held steady at exactly 20.0°C, and the inner-case temperature hovered around 33°C. To begin my testing, I used ATITool v0.26 to record GPU temperatures at idle, and again in high-power 3D mode.

At idle, the ZOTAC ZT-98PES2P-WSP recorded an extremely cool 31°C. This temperature in and of itself was enough to impress me, but don't think I wasn't still a little skeptical. Under a full 3D load for roughly twenty minutes, the ZONE Edition GeForce 9800 GTX+ eventually peaked at a maximum temperature of 57°C. In all honesty though, I see ways of pushing this temperature even lower.
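If you want to reproduce this kind of idle/load logging yourself, ATITool has no scripting interface, but the idea is simple enough to automate with any utility that can report GPU temperature. The sketch below is a minimal example that assumes a system with NVIDIA's nvidia-smi tool available on the PATH:

    # Log the peak GPU temperature over a twenty-minute load test.
    # Assumes nvidia-smi is installed and on the PATH.
    import subprocess
    import time

    def gpu_temp_c() -> int:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"])
        return int(out.decode().strip().splitlines()[0])

    peak = 0
    for _ in range(1200):  # one sample per second for twenty minutes
        peak = max(peak, gpu_temp_c())
        time.sleep(1)
    print(f"peak GPU temperature: {peak}°C")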

Zotac_9800-GTX+_Zone_Kit_Angle.jpg

The cooling unit on the ZT-98PES2P-WSP is filled with a blue-tinted liquid coolant, which is good since we don't want to discover tap water running through the unit. The fan, however, is a standard low-output 120mm unit with a decent airflow-to-noise ratio. If you're interested in adding a few extra CFM to the radiator while keeping noise to a minimum, I suggest looking at the SilenX IXTREMA Pro Series 120mm fan (SKU: IXP-74-14), which pushes 72 CFM at an ultra-low 14 dBA. I really like these fans, more so than Noctua's NF-P12 fan, because the center motor portion is much smaller in diameter, thus allowing higher airflow. When I used this fan on the radiator, the temperature dropped another 3-4°C on average under load.

Of course, you could also use a high-output (and very noisy) fan such as the Yate Loon D12SH-12. If you disregard the noise level, this is one of the best 120mm cooling fans available in terms of raw performance. The D12SH-12 forces an impressive 88 CFM of air at a moderately noisy 40 dBA. Personally, I can't suffer anything that produces higher sound levels than this, since gaming would then require headphones and casual computing becomes almost impossible. It rather negates the purpose behind silent liquid cooling, but it certainly offers another option for those overclockers looking for every last bit of performance.

In the next section, power consumption is measured and compared against many of the most recent graphics cards we've tested.

Power Consumption

It's becoming difficult to dodge the "doom and gloom" talk I hear a lot these days. Most would concede that planet Earth needs our help, and I would probably say it needs it badly. With forests becoming barren of vegetation and snow-capped poles quickly turning from white to brown, the technology industry has a new attitude towards suddenly becoming "green". I'll spare you the powerful marketing hype that I get from various manufacturers nearly every day, and get right to the point: your CPU has been doing a lot more to save the planet than your GPU has... until now.

To be honest, I have always wondered where people got the idea that a smaller fabrication process automatically means more efficient power use and lower heat output. Sure, the thermal envelope is contained in a smaller area, but if the clock speed is raised (as is the case with the GTX+), then much of that advantage is spent. Additionally, power efficiency doesn't just miraculously improve because the chip manufacturer shrinks the die. Good engineering will do that, not smaller engineering. The quick sketch below puts rough numbers on that intuition, and the chart that follows it displays the isolated video card power consumption (not system total), in watts, for each specified test product.
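The classic first-order relationship for CMOS dynamic power is P ≈ C × V² × f: switched capacitance times voltage squared times clock frequency. A die shrink lowers C (and sometimes permits a lower V), but raising f claws the savings back. The Python sketch below runs the comparison with illustrative, assumed figures; the capacitance and voltage values are not measured G92 numbers:

    # First-order CMOS dynamic power: P ~ C * V^2 * f.
    # Capacitance and voltage figures are assumptions for illustration only.
    def dynamic_power(c_rel: float, volts: float, f_mhz: float) -> float:
        return c_rel * volts**2 * f_mhz

    p_65nm = dynamic_power(1.00, 1.20, 675)  # 65nm 9800 GTX at 675 MHz core
    p_55nm = dynamic_power(0.85, 1.15, 740)  # 55nm GTX+, assume ~15% lower C

    print(f"55nm power relative to 65nm: {p_55nm / p_65nm:.2f}")

Under those assumptions the shrink only buys back about 15 percent, and a less favorable voltage would erase even that, which is exactly the point: the clock bump eats most of what the process node gives.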

Video Card Power Consumption by Benchmark Reviews

VGA Product Description (sorted by combined total power)    Idle Power   Loaded Power
NVIDIA GeForce GTX 480 SLI Set                               82 W         655 W
NVIDIA GeForce GTX 590 Reference Design                      53 W         396 W
ATI Radeon HD 4870 X2 Reference Design                      100 W         320 W
AMD Radeon HD 6990 Reference Design                          46 W         350 W
NVIDIA GeForce GTX 295 Reference Design                      74 W         302 W
ASUS GeForce GTX 480 Reference Design                        39 W         315 W
ATI Radeon HD 5970 Reference Design                          48 W         299 W
NVIDIA GeForce GTX 690 Reference Design                      25 W         321 W
ATI Radeon HD 4850 CrossFireX Set                           123 W         210 W
ATI Radeon HD 4890 Reference Design                          65 W         268 W
AMD Radeon HD 7970 Reference Design                          21 W         311 W
NVIDIA GeForce GTX 470 Reference Design                      42 W         278 W
NVIDIA GeForce GTX 580 Reference Design                      31 W         246 W
NVIDIA GeForce GTX 570 Reference Design                      31 W         241 W
ATI Radeon HD 5870 Reference Design                          25 W         240 W
ATI Radeon HD 6970 Reference Design                          24 W         233 W
NVIDIA GeForce GTX 465 Reference Design                      36 W         219 W
NVIDIA GeForce GTX 680 Reference Design                      14 W         243 W
Sapphire Radeon HD 4850 X2 11139-00-40R                      73 W         180 W
NVIDIA GeForce 9800 GX2 Reference Design                     85 W         186 W
NVIDIA GeForce GTX 780 Reference Design                      10 W         275 W
NVIDIA GeForce GTX 770 Reference Design                       9 W         256 W
NVIDIA GeForce GTX 280 Reference Design                      35 W         225 W
NVIDIA GeForce GTX 260 (216) Reference Design                42 W         203 W
ATI Radeon HD 4870 Reference Design                          58 W         166 W
NVIDIA GeForce GTX 560 Ti Reference Design                   17 W         199 W
NVIDIA GeForce GTX 460 Reference Design                      18 W         167 W
AMD Radeon HD 6870 Reference Design                          20 W         162 W
NVIDIA GeForce GTX 670 Reference Design                      14 W         167 W
ATI Radeon HD 5850 Reference Design                          24 W         157 W
NVIDIA GeForce GTX 650 Ti BOOST Reference Design              8 W         164 W
AMD Radeon HD 6850 Reference Design                          20 W         139 W
NVIDIA GeForce 8800 GT Reference Design                      31 W         133 W
ATI Radeon HD 4770 RV740 GDDR5 Reference Design              37 W         120 W
ATI Radeon HD 5770 Reference Design                          16 W         122 W
NVIDIA GeForce GTS 450 Reference Design                      22 W         115 W
NVIDIA GeForce GTX 650 Ti Reference Design                   12 W         112 W
ATI Radeon HD 4670 Reference Design                           9 W          70 W

* Results are accurate to within +/- 5W.

The Sapphire Radeon HD 4870 X2 is clearly no stranger to high power bills, having topped our chart for loaded power consumption when this review was written. Even at idle, the Sapphire Radeon HD 4870 X2 video card 100251SR gulps down more watts than any other product on our chart draws at idle. Regardless of consumption, the power requirements for the Radeon HD 4870 X2 consist of one six-pin and one eight-pin PCI-Express power connection to ensure that the twin RV770s receive enough juice to push out the frames in 3D mode. This may leave some middle-market enthusiasts and lower-end gamers in search of a new power supply to force-feed the Radeon HD 4870 X2 the power it needs.

Like NVIDIA's GeForce 9800 GTX+, ATI's 55nm RV770 GPU shows that a smaller process doesn't guarantee energy-efficient operation. Putting two of them together on the same PCB doesn't double the consumption of a single Radeon HD 4870, but it sure does try. The power consumption measured under full load doesn't match the performance, but it certainly matches the heat output. The idle power draw is extremely high, which is uncommon since emphasis is usually placed on idle/standby efficiency and conservation.

Taken as a whole, the idle/standby power consumption is pretty unforgivable, especially since this is the condition your equipment will be in the majority of the time. While the loaded power consumption is the highest we've ever seen, the price paid to your utility company for gaming would be about the same as with just about any other high-end video card. Once upon a time, computers and gaming consoles seemed like an inexpensive alternative to arcade gaming... but that was before energy costs soared through the roof.
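To put the utility-bill talk in perspective, here is a ballpark estimate using the Radeon HD 4870 X2's chart figures and an assumed electricity rate of $0.12 per kWh (your local rate and usage pattern will differ):

    # Ballpark annual energy cost for a video card, using assumed usage.
    RATE = 0.12                  # dollars per kWh (assumed rate)
    idle_w, load_w = 100, 320    # Radeon HD 4870 X2 figures from the chart

    idle_hours = 8 * 365         # desktop/idle use per year (assumed)
    game_hours = 2 * 365         # gaming per year (assumed)

    kwh = (idle_w * idle_hours + load_w * game_hours) / 1000.0
    print(f"~{kwh:.0f} kWh per year, roughly ${kwh * RATE:.0f} annually")

Under those assumptions the card alone costs around $60 a year to feed, and the idle draw is responsible for more than half of it, which is exactly why standby efficiency matters more than the loaded numbers suggest.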

Please continue to the review conclusion in the next section, where I share my final thoughts on ZOTAC's ZONE Edition and the GeForce GTX+ series product offerings.

ZONE Edition Final Thoughts

If you're reading this section, I congratulate and thank you. Not only does it mean you might actually care about ZOTAC's interesting product modification, but it also means you didn't just skip over the charts and call it done. I work too hard for these articles, and the words don't just write themselves. So moving on then...

Liquid-cooled video cards? Three years ago you might have scoffed that video cards hardly generate enough heat to warrant such an extreme cooling solution. In three more years, we might be saying the same thing. For now, however, it's products like the Radeon HD 4870 that remind me how hot metal can get.

As an air-cooling fanatic, I find myself very reluctant to introduce any liquid-based hardware product that could kill my system. In reality though, I had to stop and consider how much damage a liquid-cooled video card could actually do if it sprung a leak. At first I considered the worst-case scenario: motherboard devastation and short-circuited components. But realistically, if a drip were to occur, it would simply create a small pool at the bottom of my case and not harm anything at all. Nevertheless, there are more than a few perks to be had with liquid cooling, specifically with ZOTAC's ZONE Edition products.

Zotac_9800-GTX+_Zone_Kit.jpg

Take for instance several of the motherboards we've tested, namely the Gigabyte GA-EP45T-EXTREME and ASUS Striker II NSE. Both of these motherboards offer a Northbridge cooling component which allows the hardware enthusiast to connect into a liquid-cooling system. But what if you're like me, and don't want to deal with a whole system and the trouble of finding a home for the pump and reservoir? The ZONE Edition is the answer. I'm pretty sure ZOTAC wouldn't suggest it, but why not take about a dollar's worth of rubber hose and tie in the Northbridge? I think that's an idea worth exploring!

Taken all by itself, the ZONE Edition gives ZOTAC an excellent solution to the growing heat-output problem of modern video cards. Some users aren't going to want to worry themselves with potential disasters, which is fine because ZOTAC still offers their standard and AMP! Edition products without the fuss. But for anyone who wants the benefits of liquid cooling without the hassle of a complicated system, the ZONE Edition is really worth more than they're charging.

9800 GTX+ Conclusion

This isn't going to be easy... I would like to begin this section by admitting that I only like half of this product. Liquid cooling, while not something I would normally be a part of, has actually become a very convenient cooling solution because of ZOTAC's design. I could easily add my Northbridge into the cooling loop with very little effort, allowing me to enjoy both a cool video card and an extremely stable motherboard. The only problem: it's using the GeForce 9800 GTX+. Don't get me wrong - the GTX+ is a great product when you weigh it on its own. But when you take a moment to realize that it's built from the same engineering that put the 8800 GT on the map nearly two years ago, you have to wonder if you're getting top performance or just a new name. As our tests would indicate, you're not.

ZOTAC starts off with a very positive first impression. I like color, and the burnt-orange hues used on the retail package are enough to catch my eye. Aside from sharp looks, ZOTAC is also very good at keeping the consumer informed, adding important product details and specifications to the packaging. The retail box offers an inviting design and attractive layout, along with some product data on the back. Like the other ZOTAC products we have reviewed here at Benchmark Reviews, there is an underlying sense that they are in tune with the visual attraction a consumer has to products.

Once you unpack the ZT-98PES2P-WSP, it becomes obvious that the ZONE Edition is not just another video card with a cooling radiator bolted on. The video card itself is extremely light, with any real weight residing in the liquid-filled cooling radiator. The black hoses are not entirely attractive, but they filter out all forms of light and thus reduce organism growth, which in turn cuts maintenance down to zero. I know why ZOTAC painted the aluminum radiator black, but part of me wishes they had picked another color. Lastly, my impression of the overall appearance is very positive, with very little to dislike in the ZONE Edition variation of NVIDIA's GeForce 9800 GTX+.

Zotac_9800-GTX+_Zone_Splash.jpg

In the not-so-distant past I was forced to replace my GeForce 8800 GTX video card because an errant SATA cable swiped off an exposed capacitor. At that very moment, I felt that NVIDIA definitely should have done something more to protect the electronics on their product. Unlike the higher-end 8800-series GeForce products, the 9800 GTX+ does not expose any electronic components, and ZOTAC faithfully adheres to this standard with the ZONE Edition. NVIDIA has engineered the GeForce 9800 GTX+ to sustain above-average abuse, and since there are no exposed electronic components (except on the back side of the PCB), there is very little chance that you'll have to RMA this product because it falls apart on you. The plastic shell covering the 9800 GTX+ will work very well in cramped environments where the video card will be in contact with cables and components, so long as it can fit and there's an open path for the hoses.

When I compare performance for any GeForce video card, I look to the other products that compete against it, both in the GeForce family and from competitor offerings. It amazed me to discover that the 9800 GTX+ is one of sixteen video card models currently offered by NVIDIA, and it holds the #4 position behind the GTX 280, GTX 260, and 9800 GX2. Perhaps this is why I don't feel there's much to be impressed by after testing the 9800 GTX+: there are still twelve more video cards in the GeForce family trailing right behind it. I'm not going to get into the reasons why NVIDIA does what they do, but I will add that it doesn't make consumers as happy as their accountants. So in regards to the performance and functionality of ZOTAC's GeForce 9800 GTX+, I personally feel that the core, shader, and memory clocks could have been higher, since overclocked versions of the 65nm 8800 GT maintain nearly identical specifications. The 9800 GTX+ was practically identical in performance to the Radeon HD 4850, making price the only major factor separating the two.

ZOTAC has just launched the ZONE Edition GeForce 9800 GTX+ video card, which has subsequently dropped pricing on other related products. Pricing for the ZONE Edition is not yet available, but the AMP! Edition of the GeForce 9800 GTX is now selling at NewEgg for $187.99. It's encouraging that ZOTAC would offer liquid cooling on a video card, and considering how easy it is to tie the video card's hoses into the newer liquid-cooled Northbridge cooling systems, there's the potential for serious value if applied in the right combinations.

In conclusion, I feel that the ZOTAC ZONE Edition GeForce 9800 GTX+ has more to offer gamers and enthusiasts than we might first expect. For most video cards, functionality is measured in only one application: video games. However, the 9800 GTX+ can suit more than just one purpose. The ZOTAC GeForce 9800 GTX+ includes native HDMI video output and offers digital audio output through the attached S/PDIF audio cable, making this a fully-functional HDMI output solution for home theater buffs. Collectively rated, the G92 graphics processor offers full HDMI audio and video output, excellent cooling improvements, and triple-SLI functionality. I won't dispute that the results we recorded in most benchmarks were right in line with those of the Radeon HD 4850, which costs about the same, but then again ATI doesn't have a water-cooled version (and they need it). While value is a relative subject, the performance and functionality appear to have some credence in relation to the product cost. If you're a gamer on a very tight budget, the 8800 GT AMP! Edition sells for $119.99 and may be the best solution for you. But if you're considering DirectX 10 gameplay or you plan to use post-processing effects like anti-aliasing, the ZOTAC 9800 GTX/GTX+ series is a great future-ready solution.

Pros:

+ Great AA/AF performance for hardcore gamers
+ Supports DirectX 10 and Shader Model 4.0
+ 740 MHz GPU / 1100 MHz GDDR3 vRAM
+ Integrated ZONE liquid-cooling solution
+ HDMI Audio and Video supported for HDCP output
+ Extremely quiet 120mm radiator fan under normal operation
+ Enables NVIDIA HybridPower technology
+ Easy to integrate other liquid-cooling components into the system
+ 16x Coverage Sampling Antialiasing (CSAA) algorithm
+ Supports triple-SLI functionality
+ Two-year ZOTAC warranty
+ 5 GBps PCI Express 2.0 graphics interface

Cons:

- Performs exactly like other G92-based products
- Expensive enthusiast product
- Cooling radiator requires space for mounting
- Large footprint requires full-ATX form factor case space

Ratings:

  • Presentation: 9.00
  • Appearance: 9.00
  • Construction: 9.50
  • Functionality: 9.25
  • Value: 7.50

Final Score: 8.85 out of 10.

Quality Recognition: Benchmark Reviews Silver Tachometer Award.

Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.

