Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 Video Card
Reviews - Featured Reviews: Video Cards
Written by Olin Coles
Wednesday, 26 March 2008
Gigabyte GeForce 9800 GX2

It seems like it was just yesterday that I bought my first discrete graphics card to outfit an overclocked Cyrix M2-300 6x86MX-based computer. Back in those Windows 98 (first edition) days of 1998, the term GeForce didn't even exist yet, and NVIDIA was referred to as nVidia. So when I bought my first computer late that year, I would never have guessed that Quake II played on my RIVA TNT2 AGP video card would mark the last time I would spend money in an arcade. That was nearly ten years ago, and since that time NVIDIA has developed several successful GeForce product lines, including the newly launched 9th generation.

On the 18th of March 2008, NVIDIA launched the GeForce 9800 GX2 to coincide with their 790i motherboard chipset. Because gamers were teased by NVIDIA's first 9-series release, the GeForce 9600 GT, which barely satisfied the middle market, the discussion over which upcoming product would become the new king of the hill quickly became a heated topic. Since the 8th-generation GeForce series launched with monumental success, starring the still-powerful 8800 GTX and 8800 GTS, most hardware enthusiasts have come to expect the same level of awe from this launch of a new generation of discrete graphics. Few enthusiasts would say NVIDIA has outdone themselves again, while most others will claim that they have disappointed the community. Benchmark Reviews ignores the chatter, and makes a solid case with the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 video card.

The GeForce 9800 GX2 features 256 processor cores, each independently operating at 1,500 MHz. Counting conservatively (2 flops per processor core), this amounts to an unprecedented 768 gigaflops of raw shading horsepower. In texturing performance, it can filter 76.8 billion texels per second, or 190% more than the Radeon HD 3870 X2. In raw specifications across the board, it is vastly improved over its predecessor, the GeForce 8800 Ultra. Yet at $599-$649, it launches at the same price point the GeForce 8800 GTX did. With more than twice the shading power and a vastly improved PureVideo HD engine, the GeForce 9800 GX2 offers peerless 3D performance and great value for the money.
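The arithmetic behind those headline figures is easy to verify. The short Python sketch below reproduces it; the 128 texture filtering units (64 per G92 GPU) are our own assumption based on the G92 architecture, while every other number comes from the specifications quoted above.

```python
# Back-of-the-envelope check of the shading and texturing figures quoted above.

stream_processors = 256        # total across both GPUs
shader_clock_ghz = 1.5         # 1,500 MHz per stream processor
flops_per_core_per_clock = 2   # the conservative count used in the text

gflops = stream_processors * shader_clock_ghz * flops_per_core_per_clock
print(f"Peak shading throughput: {gflops:.0f} GFLOPS")  # 768 GFLOPS

# Texture fill rate: assumed 128 texture filtering units (64 per G92 GPU)
# running at the 600 MHz core clock.
texture_units = 128
core_clock_ghz = 0.6
fill_rate_gtexels = texture_units * core_clock_ghz
print(f"Texture fill rate: {fill_rate_gtexels:.1f} Gtexels/s")  # 76.8
```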
In the past, Benchmark Reviews has compared GeForce 8800 Graphics Performance: GT vs GTS vs GTX. In that article, it was shown that the more affordable 8800 GT could easily beat a heavily-overclocked 8800 GTS and close the gap with the far more expensive 8800 GTX. Not much later we tested the ZOTAC GeForce 8800 GT AMP! Edition HDMI video card, which in many tests performed very near to the more expensive GTX. Several other product reviews from our affiliates eventually discovered the same thing, and our collective results shook the market and announced an affordable high-end solution. But the 8800 is so... last generation, and now we have to determine how the 9800 GX2 fits into all of this.

Since several of the former heavyweight products are now threatened with replacement by the new GeForce 9800 GX2, there is a lot of concern over how well the GX2 performs against the older 8800 GTX and Ultra it supersedes. Gamers want to know if the GX2 is worth the money, or if they should wait. Making this decision a little more difficult is yet another change to the market: as if there wasn't enough competition already in the high-end segment of the 3D graphics market, on April 1st there will be one more addition to the 9th-generation family, named GeForce 9800 GTX.

Powered by the NVIDIA G92 graphics processor originally introduced in the GeForce 8800 GT series, the newly released Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 video card takes gaming performance to an unbelievable level by packaging two GPUs into the same enclosure. By utilizing an integrated SLI configuration to combine the two independent GPUs, NVIDIA has made it possible to fit the equivalent of two 8800 GTs into the same form-factor footprint as the 8800 GTX and Ultra. The reference core clock speed is a modest 600 MHz on both GPUs, and the GDDR3 memory is clocked at 1000 MHz. For an extra performance boost during intense gaming situations, NVIDIA has designed the GX2 to offer a total of 256 stream processors operating at 1500 MHz.

The new PCI Express 2.0 interface sends data to the graphics card's combined 1024 MB of GDDR3 video memory for smooth performance and realistic textures in PC games. The 512 MB of GDDR3 video memory on each PCB communicates with its host graphics processor through its own 256-bit memory interface, and the two halves combine their resources through the integrated PCI Express bridge. Compared to the older PCI Express x16 bus it replaces, the new PCI Express 2.0 interface signals at 5.0 GT/s per lane, which amounts to twice the data throughput of the previous generation. In the new generation of PCI Express 2.0 compatible motherboards, such as the Gigabyte GA-X48T-DQ6 we used for testing, this new technology delivers bleeding-edge graphics while remaining backwards compatible with older PCI Express x16 motherboards.
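For context on that interface claim, here is a minimal sketch of the usable per-direction bandwidth the signaling rate implies, assuming the 8b/10b link encoding both PCI Express generations use (10 bits on the wire per data byte).

```python
# Usable per-direction bandwidth of a PCI Express x16 link, gen 1 vs gen 2.

def pcie_bandwidth_gb_s(transfer_rate_gt_s: float, lanes: int = 16) -> float:
    """Data bandwidth per direction in GB/s, after 8b/10b encoding overhead."""
    return transfer_rate_gt_s * lanes * (8 / 10) / 8  # GT/s -> GB/s of payload

print(f"PCIe 1.x x16: {pcie_bandwidth_gb_s(2.5):.1f} GB/s per direction")  # 4.0
print(f"PCIe 2.0 x16: {pcie_bandwidth_gb_s(5.0):.1f} GB/s per direction")  # 8.0
```

Doubling the signaling rate doubles the payload bandwidth, which is exactly the "twice the data throughput" improvement described above.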
Taken as a whole, the GeForce 9800 GX2 signifies a major step up from NVIDIA's last dual-GPU video card, the 7950 GX2. With better scaling between graphics processors, the SLI package embedded within the 9800 GX2 is far superior to the 7950 GX2, and more efficient to boot. Because the 9800 GX2 works within the ATX form factor's compartmental confinements (right up to the edge, in fact), this video card can be teamed with a second unit in an additional SLI configuration that equates to quad SLI. Previously this kind of configuration would require a massive 1200-watt output from an SLI-certified power supply unit, but with the new 65 nm fabrication process this power demand has dropped to a more realistic level around 800 watts.

But the list of improvements is still not complete (not by far). A few months back we reviewed ZOTAC's GeForce 8800 GT AMP! Edition HDMI video card, which used a DVI-to-HDMI adapter and S/PDIF audio input cable to stream full HDMI audio and video output for the first time in any NVIDIA product. Perhaps impressed with the idea, NVIDIA returned to the drawing board and reconfigured their new 9800 series to offer the same functionality. The improvement with this generation is that the adapter is no longer necessary, since there is now an HDMI port directly beside the two DVI connections, but the S/PDIF audio cable is still a separate input. Benchmark Reviews will test the new Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 discrete graphics card against the most widely used NVIDIA products it competes against. Below is a chart with the most recent high-performance offerings from NVIDIA.
About the company: Gigabyte United Inc. (G.B.T. Inc. USA)
Gigabyte United Inc., established in December 2006, is assuming the GIGABYTE TECHNOLOGY Co., Ltd. brand, which for the past 20 years has been a world-renowned leader in the motherboard industry. Continuing to focus on its core businesses of GIGABYTE-branded motherboards and graphics cards, Gigabyte United Inc. is committed to providing our valued customers with the highest quality products and services featuring the industry's most innovative design. In order to meet the challenges of today's intensely competitive channel market, Gigabyte United Inc. fully utilizes its key assets, including its cutting-edge research and development team as well as its professional sales and marketing resources, to continue to develop technologies to fit a complete range of digital life solutions. Now and for the future, Gigabyte United Inc. will continue to embody the unique spirit and culture which has made Gigabyte one of the foremost brands in the industry. More information about Gigabyte is available by visiting their website.

GeForce 9800 GX2 Features

Backed by NVIDIA's Lumenex Engine, the GeForce 9800 GX2 delivers true 128-bit floating point high dynamic range (HDR) lighting with up to 16x full-screen anti-aliasing. Second-generation NVIDIA PureVideo HD technology with HDCP compliance delivers the ultimate high-definition video viewing experience to the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 graphics card. With hardware decoding for Blu-ray and HD DVD formats, PureVideo HD technology lowers CPU utilization when watching high-definition video formats by decoding the entire video stream in the graphics processor, freeing up the processor for other tasks. In addition to low CPU utilization, PureVideo HD enhances standard-definition video content with de-interlacing and other post-processing algorithms to ensure standard DVD movies look their best on the PC screen and high-definition television sets. High-bandwidth digital content protection, or HDCP, technology ensures a secure connection between the GeForce 9800 GX2 graphics card and an HDCP-capable monitor for viewing protected content such as high-definition Blu-ray or HD DVD movies.

Coupled with PureVideo HD technology, the 9800 GX2 delivers the ultimate multimedia experience. HDMI technology allows users to connect PCs to high-definition television sets with a single cable, delivering high-definition surround sound audio and video with resolutions up to 1080p. PureVideo HD technology scales video in the highest quality up to resolutions of 2560x1600, from standard and high-definition file formats, while preserving the details of the original content. PureVideo HD technology also accelerates high-definition video decode, freeing up CPU cycles while watching high-definition Blu-ray and HD DVD movies or other VC-1 and H.264 encoded file formats.

NVIDIA Unified Architecture
NVIDIA Lumenex Engine
NVIDIA Quantum Effects Technology
NVIDIA SLI Technology
NVIDIA PureVideo™ HD Technology
Advanced Display Functionality
Built for Microsoft Windows Vista
High Speed Interfaces
Operating Systems
API Support
NVIDIA Hybrid SLI Technology

Benchmark Reviews learned of Hybrid SLI during our time with NVIDIA at the 2008 International CES. I thought that seeing the Stereosonic 3D Gaming demonstration would be the highlight of NVIDIA's offerings, but the following morning they proved to have at least one more trick up their sleeve. At CES we were privileged to see Hybrid SLI make its formal debut. NVIDIA announced the industry's first hybrid technology for PC platforms, Hybrid SLI™, which addresses two critical issues: increasing graphics performance and reducing power consumption. NVIDIA Hybrid SLI technology will be incorporated into a wide variety of graphics and motherboard desktop and notebook products that the company is rolling out for both AMD and Intel desktop and notebook computing platforms throughout 2008.

"From the introduction of programmable GPUs to the rapid adoption of our multi-GPU SLI technology, NVIDIA has repeatedly pioneered and innovated to solve difficult problems for the industry. We believe Hybrid SLI technology is one of the most important innovations we've come up with to date," said Jen-Hsun Huang, CEO of NVIDIA. "Hybrid SLI delivers new multi-GPU technology to a large segment of the PC market, delivering consumers a level of PC graphics performance and power efficiency never before seen."
First disclosed in June 2007, NVIDIA Hybrid SLI technology is based on the company's market-leading GeForce graphics processing units (GPUs) and SLI multi-GPU technology. Hybrid SLI enables NVIDIA motherboard GPUs (mGPUs) to work cooperatively with discrete NVIDIA GPUs (dGPUs) when paired in the same PC platform. Hybrid SLI provides two new technologies, GeForce Boost and HybridPower™, that allow the PC to deliver graphics performance for today's applications and games when 3D graphics horsepower is required, or transition to a lower-powered operating state when it is not.

For lower energy consumption and quieter PC operation, HybridPower allows the PC to switch processing from a single GPU or multiple GPUs in SLI configuration to the onboard motherboard GPU. HybridPower is most useful in situations where graphics horsepower is not required, such as high-definition movie playback on a notebook platform or simple e-mail and Internet browsing on a desktop. It is also beneficial for those users who want a quiet operating state with reduced thermals and noise. For notebooks, HybridPower can also dramatically extend battery life, by up to 3 hours. When a game or application is started that requires the additional 3D horsepower, the PC can automatically transition back to the discrete graphics cards and power up the 3D capabilities, all of it transparent to the end user.
For applications where 3D performance is required, GeForce Boost turbo-charges 3D operation by combining the processing power of the traditional NVIDIA GeForce-based graphics card with that of the second GPU integrated into the motherboard core logic. In media-rich applications, both GPUs work in tandem to render the combined images with the end user benefiting from the increase in performance and frame rate. For typical games and 3D applications, GeForce Boost can kick in automatically, resulting in a greatly enhanced consumer experience.
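Taken together, the two modes amount to a simple policy decision made transparently by the driver. The sketch below is purely conceptual (it is not NVIDIA's actual driver logic), but it captures the behavior described above.

```python
# Conceptual model of the two Hybrid SLI modes: HybridPower parks the discrete
# GPU(s) when 3D horsepower isn't needed; GeForce Boost teams the motherboard
# GPU (mGPU) with the discrete GPU (dGPU) when it is.

def select_gpus(needs_3d: bool, hybrid_power: bool, geforce_boost: bool) -> list[str]:
    if not needs_3d and hybrid_power:
        return ["mGPU"]           # discrete GPU powered down: quiet, low power
    if needs_3d and geforce_boost:
        return ["dGPU", "mGPU"]   # both render in tandem for extra frame rate
    return ["dGPU"]               # conventional discrete-only rendering

print(select_gpus(needs_3d=False, hybrid_power=True, geforce_boost=False))  # ['mGPU']
print(select_gpus(needs_3d=True, hybrid_power=False, geforce_boost=True))   # ['dGPU', 'mGPU']
```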
NVIDIA is the recognized market leader for GPU desktop and notebook solutions for both Intel and AMD platforms, and has a full lineup of Hybrid SLI-capable graphics and motherboard products planned for 2008. New Hybrid SLI-capable products include the upcoming NVIDIA nForce 780a SLI, nForce 750a SLI, and nForce 730a media and communication processors (MCPs) for AMD CPUs, which will be released next month, as well as the new GeForce 8200, the industry's first micro-ATX motherboard solution with an onboard Microsoft DirectX 10-compliant motherboard GPU. NVIDIA Hybrid SLI notebooks as well as desktop products designed for Intel CPUs will be available next quarter. Look for Hybrid SLI to make its way into everything NVIDIA produces from this point forward.

GeForce 9800 GX2 Features

Coupled with PureVideo HD technology, the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 graphics card delivers the ultimate multimedia experience.
Bus Support
3D Acceleration
Others
Dual-Stream Decode

Recently, studios have begun taking advantage of the additional space that high-definition media such as Blu-ray and HD DVD discs provide by adding dual-stream picture-in-picture functionality to movies. Often the PiP content is coupled with advanced BD-J (Java) or HDi (XML) features, so taking the processing burden off of the CPU is even more important for titles with these advanced features. The latest PureVideo HD engine now supports dual-stream hardware acceleration, which takes the workload off of the CPU and gives it to the more powerful GPU.

GV-NX98X1GHI-B Specifications:

G92 Graphics Processing Unit
Memory
With two on-board GPUs, a GeForce 9800 GX2-based graphics solution is bar none the fastest graphics card available, and when paired with a 7 Series NVIDIA nForce motherboard, it creates the latest in a line of powerful NVIDIA gaming platforms. Be blown away by scorching frame rates, true-to-life extreme HD gaming, and picture-perfect Blu-ray and HD DVD movies.
GeForce 9800 GX2 Closer Look

Here we are, five pages into the article, and we're just now taking our first look at the new graphics card. Hopefully the specifications and myriad features will interest you enough to use the 9800 GX2 for something other than video games, but we won't hold our breath. So without further delay, we can now concentrate our attention on the newly crowned king of the proverbial graphics performance hill. First, let's state the obvious... the 9800 GX2 looks like a brick. There it is, I said it. NVIDIA holds a wealth of knowledge in areas of both product design and industry marketing, yet for some reason their newest products are beginning to look increasingly less exciting. But rather than reminisce over the similarities between the 9800 GX2 and products like the 8800 Ultra, let's get into our up-close look at the Gigabyte GV-NX98X1GHI-B video card to see what it offers.
Similar to the NVIDIA reference design, Gigabyte has sculpted their version of the GeForce 9800 GX2 with the same overall appearance. Utilizing a metal shell to encase the double-slot-sized GX2, the delicate electronics inside are kept safe from impact damage. In the past, accidental snags have removed critical electronic components from the PCB of older GeForce products, but those worries are all behind us now.
The GV-NX98X1GHI-B graphics card is a performance-optimized high-end card on every level. Power is taken from the PCI Express host bus as well as the 8-pin and 6-pin PCI Express power connectors. Without any auxiliary power provided to the GeForce 9800 GX2 graphics card, an LED on the bracket will shine red and the graphics card will not boot. In addition, any connector that is not adequately powered will turn red. Together, this new functionality offers immediate feedback for enthusiasts concerned about providing adequate power to the GPU.
Look closely at the image above, and you'll notice a very small remnant resembling a PCI Express interface blade at the top of the video card. On the opposite side of the 9800 GX2 is the working PCI Express connection slot. The GeForce 9800 GX2 isn't the first graphics card to utilize the 65nm process, nor the first to support the high-bandwidth PCI Express 2.0 bus, but it is the fastest. The original credit belongs to the 8800 GT, which is actually at the core of the 9800 GX2's architecture. Aside from this similarity, everything else inside is very different in design.
Gigabyte offers few differences from the NVIDIA reference design in their GV-NX98X1GHI-B product. Both offer identical GPU, GDDR3, and shader clock specifications, and both utilize the exact same SmartFan active cooling design. Aside from the colorful themed picture on the top and bottom of the 9800 GX2, they are nearly one and the same. Since the new GX2 was launched only days ago, it's not very surprising that most add-in card partners are following this trend of cloning the reference design. Eventually the market will see factory-overclocked GX2 cards, if NVIDIA allows them, but at this early stage they're nonexistent.
The image above shows the "bottom" side of the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2. While it is practically identical in appearance to the "top" side of the video card, they are in fact slightly different. Along the edge of the bottom side of the card, there is a small removable plastic cover for the SLI bridge. Considering that the 9800 GX2 already uses two GPUs, adding a second card makes for a quad-SLI configuration.
At the tail end of the Gigabyte 9800 GX2 there are two fasteners which are capped with adhesive rubber bumpers. The squared edges just barely keep the profile of this video card within ATX confinements, making the previous generation's GeForce 8800 GTX and Ultra seem small in comparison. Please continue on to the next section as Benchmark Reviews literally uncovers the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 video card for a closer inspection.

GV-NX98X1GHI-B Detailed Features

In our last section, we skimmed over the outer skin of the new GeForce 9800 GX2. With a basic understanding of what you'll get on the outside, we're ready to get inside the product and dissect the technology. This information will be very helpful for those hardware enthusiasts and overclockers willing to void their warranty and potentially ruin their expensive product in order to tweak its electronics. It is provided for information only, and is not a recommendation to disassemble your product.

Before we take apart our brand new 9800 GX2, let's revisit some of the finer functional points introduced with this product, such as the HDMI audio output and power handling. Because the HDMI audio functionality is controlled at a hardware level, there is no need for additional drivers or software. Much like the S/PDIF connection on the back of a motherboard, the video card's audio-out function is plug-and-play. The S/PDIF cable included with the kit connects between the small two-pin port on the 9800 GX2 and the HT Omega Claro Plus+ AD8620BR Op Amp sound card we used for testing. Your setup may be different, so the cable connects between the GX2 and the digital audio header on either your motherboard or sound card. Not all motherboards and sound cards support this option, so make sure it's available before you make your purchase. The 9800 GX2, unlike the older G80-based GeForce cards, is equipped with the second-generation PureVideo HD engine for GPU-assisted decoding of the H.264 and VC-1 CODECs. This is an important point that plays well into our own benchmarks later on in this article.
In regard to the new power requirements, the GeForce 9800 GX2 has two hungry mouths to feed, so you can expect the consumption to be on the high side. Both an eight-pin and a six-pin PCI Express power connection are required to run the 9800 GX2. NVIDIA has designed the G92 graphics processor to be an efficient cornerstone of the 9th generation of GeForce products. Compared to the 8800 GTX, it should please you to learn that the Gigabyte GV-NX98X1GHI-B graphics card consumes almost the same amount of power under high-power full 3D load. In comparison to our (extremely) overclocked G80-based GeForce 8800 GTS 640MB, which consumes 72 additional watts of power when switching from low to high-power mode, the Gigabyte 9800 GX2 increases power demand by 83 watts. Alternatively, the G92-based ZOTAC GeForce 8800 GT AMP! Edition only raises the level 59W under full load.

EDITOR'S NOTE: Some add-in card partners have included power-connection adapters with GX2-series video cards. Benchmark Reviews advises users NOT to use any connection adapters, as they may create a dangerous load on the power supply unit or electrical wall source.
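To make those idle-to-load deltas easier to compare, here they are gathered into one short sketch; the watt figures are the review's own total-system measurements.

```python
# Idle-to-load power deltas quoted above (total-system watts, not card-only draw).

idle_to_load_delta_w = {
    "GeForce 8800 GTS 640MB (heavily overclocked G80)": 72,
    "ZOTAC GeForce 8800 GT AMP! Edition (single G92)": 59,
    "Gigabyte GeForce 9800 GX2 (dual G92)": 83,
}

for card, delta in sorted(idle_to_load_delta_w.items(), key=lambda kv: kv[1]):
    print(f"{card}: +{delta} W from 2D idle to full 3D load")
```

Viewed this way, the dual-GPU card's 83 W swing is remarkably close to single-GPU territory, which speaks to the 65 nm G92's efficiency.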
As I prepared to disassemble the shiny 9800 GX2, there were a few subtle clues that gave away the inner workings of the Gigabyte GV-NX98X1GHI-B before I ever opened up the enclosure. Aside from the trimmed-off PCI Express card slot on the opposite side of the GX2, there was the telltale sign of two GeForce 9800 GX2 entries appearing in the Windows Device Manager, each needing drivers loaded.
I have read a lot of speculation claiming that the GeForce 9800 GX2 is a dual-GPU graphics card, and I believe that depending on the definition there might be some argument over the semantics. There are two GPUs inside the 9800 GX2 enclosure, so that much is true, but they are on two separate printed circuit boards which face toward each other and are then linked by a PCI Express bridge. This assembly combines two tightly-coupled independent graphics cards into one singular video card package. It is, however, very different from the Radeon HD 3870 X2, which places two GPUs directly onto the same PCB. So whether the GX2 counts as a single dual-GPU unit or as two cards in one enclosure comes down to semantics and how you define the technology. The upside to this design is that each GPU dissipates heat onto its own dedicated PCB. On a single-board design, both GPUs dissipate heat onto the same PCB, in effect transmitting heat to each other.
With two separate PCBs facing in toward each other, NVIDIA has designed a single blower fan using SmartFan technology to actively cool the two GPUs inside the 9800 GX2. While air is pulled in from all five sides at the tail end of the graphics card, it is exhausted at each side of the unit and through a small opening of approximately one inch at the header end of the video card. Ideally, all heated air would be exhausted outside of the computer case, but because of the populated I/O header panel design, a single dedicated exhaust air channel is not possible.
By design, the two printed circuit boards are pressed onto each side of the cast-aluminum heatsink sandwiched into the middle. Additional thermally conductive pads are strategically placed between key components, such as the GDDR3 memory modules, and the heatsink. Gigabyte also uses a pre-applied carbon-based Thermal Interface Material (TIM) between each GPU and the copper base inset into the heatsink. Even though NVIDIA ditched the heatpipes last seen in their 8800 GTS/GTX/Ultra reference design, it seems that there might be room for them if an add-in card partner wanted to improve upon the design.
A 256-bit memory bus allows the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 to offer 512 MB for each GPU, for a total realized video frame buffer of 1024 MB (1 GB). Gigabyte does not overclock this portion of the GV-NX98X1GHI-B, primarily because of the delicate synchronization of the two PCB halves through the PCI Express bridge. Benchmark Reviews discovered another good reason, which we share later in our overclocking results section. If money's no object to you, however, the 9800 GX2 could be great news for gamers and hardware enthusiasts wanting to connect two of these cards into a "quad" SLI array on one of the new nForce 790i motherboards.
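As an aside, those memory figures imply a healthy amount of bandwidth per GPU. A minimal sketch, assuming only the standard double-data-rate behavior of GDDR3:

```python
# Per-GPU memory bandwidth implied by a 256-bit bus at 1000 MHz GDDR3.
# GDDR3 transfers data on both clock edges, so 1000 MHz -> 2000 MT/s effective.

bus_width_bits = 256
effective_rate_gt_s = 2.0  # 2000 MT/s

bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_gt_s  # bytes/transfer x rate
print(f"Per-GPU memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 64 GB/s per GPU
```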
This concludes our look at the Gigabyte GV-NX98X1GHI-B, and from here on out the 9800 GX2 must either put up results or be put down. In our next section, Benchmark Reviews begins testing on the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 video card, after we spend some time explaining how it's all done here in our lab.

Video Card Testing Methodology

One day, Benchmark Reviews will be so giant and world famous that I will have multiple combinations of graphics cards available and on hand to test in the same system during our product testing period. I envy the review sites (all three of them) that have twenty other video cards tested in stand-alone, SLI, and CrossFire X arrays for each and every review. Readers can help us grow to that size as they spread the word, but for now we'll have to make do with what our budget can afford. In this article, Benchmark Reviews is going to test and compare the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 against several other products from within the GeForce family.

At the start of all tests, the previous display adapter driver is uninstalled and trace components are removed using Driver Cleaner Pro. We then restart the computer system to establish our display settings and define the monitor. Once the hardware is prepared, we begin our testing. The synthetic benchmark tests in 3DMark06 utilize shader models 2.0 and 3.0. Every test is conducted at the following resolutions: 1600x1200 (20.1/21" standard LCDs), 1280x1024 (19" standard LCD), and 1024x768 (17" standard LCD). Each test program runs after a system restart, and the very first benchmark run for every test is ignored, since it often only caches the test. This process proved extremely important in the World in Conflict and Supreme Commander benchmarks, as the first run served to cache maps, allowing subsequent tests to perform much better than the first. Each test is completed five times, with the average results displayed in our article.

Our website statistics indicate that over 90% of our visitors use their PC for playing video games, and nearly 70% of you are using one of the screen resolutions mentioned above. Since all of the benchmarks we use for testing represent different game engine technology and graphic rendering processes, I feel that this battery of tests will provide a diverse range of results for you to gauge performance on your own computer system. Since most gamers and enthusiasts are still using Windows XP, it was decided that DirectX 9 would be used for all tests until game and driver support improve for Windows Vista.
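Reduced to pseudocode, that run protocol looks something like the sketch below; run_benchmark is a hypothetical stand-in for launching a test and reading back its score, not part of any real tool.

```python
# Minimal sketch of the run protocol described above: one warm-up pass is
# discarded (it mostly caches maps and shaders), then five recorded passes
# are averaged for the charts.
from statistics import mean

def measured_score(run_benchmark, recorded_runs: int = 5) -> float:
    run_benchmark()  # warm-up pass, result ignored
    return mean(run_benchmark() for _ in range(recorded_runs))

# Example with a dummy benchmark callable standing in for the real launcher:
print(measured_score(lambda: 42.0))  # -> 42.0
```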
Using the GPU-Z utility, available for free from our affiliate website techPowerUp!, we verify the manufacturer's specifications against the actual internal specifications. In regard to the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2, it appears that GPU-Z is not quite sure what to make of the new statistics in a few key areas.
The GeForce 8800 GTS is the direct competition for the GeForce 8800 GT 512MB video card. Although NVIDIA released a new 256-bit version of the card, the older 320-bit version (offered with either 640 or 320MB) is still the most widely used video card by PC gamers today. Note that the default GeForce 8800 GTS core clock is 500MHz, while the FOXCONN GeForce 8800 GTS 640MB has been carefully overclocked to 600MHz. The standard GeForce 8800 GTS vRAM speed is 800MHz and has also been overclocked, to 1030MHz, while the shader clock remains at the standard 1200MHz.

Benchmark Applications
Test System
Test Products
3DMark06 Benchmark Results

3DMark is a computer benchmark by Futuremark (formerly MadOnion) that determines the DirectX 3D game performance of graphics cards. 3DMark06 uses advanced real-time 3D game workloads to measure PC performance using a suite of DirectX 9 3D graphics tests, CPU tests, and 3D feature tests. 3DMark06 tests include all-new HDR/SM3.0 graphics tests, SM2.0 graphics tests, AI and physics driven single and multiple core or processor CPU tests, and a collection of comprehensive feature tests to reliably measure next-generation gaming performance today. Here at Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each test run, I believe 3DMark is a very reliable tool for comparing graphics cards against one another.
Low-resolution testing allows the graphics processor to reach its maximum output performance, which thereby shifts demand onto the system components. At the lower resolutions 3DMark will reflect the GPU's top-end speed in the composite score, indicating full-throttle performance with little load. This makes for a less GPU-dependent test environment, and is helpful in measuring the maximum output performance in the test results. Using a 1024x768 resolution as a starting point, the maximum settings were applied, which for these tests includes 8x Anti-Aliasing and 16x Anisotropic Filtering. Without question the GeForce 9800 GX2 outperforms every other competitor by a wide margin. Its SM 2.0 score of 5772 left the GTX's score of 3603 trailing by almost 38%. Even the shader model 3.0 tests rendered a 37% advantage to the 9800 GX2. Conversely, the MSI GeForce 8800 GTX shows only a 5% lead over the AMP!'ed GT in SM 2.0 tests, and barely 11% in the HDR/SM 3.0 tests.
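A note on how these percentages are framed: 38% measures how far the GTX trails the GX2's score; flip the baseline, and the GX2's lead works out closer to 60%. The same two SM 2.0 scores, run through both framings:

```python
# Same two 3DMark06 SM 2.0 scores, two easily-confused baselines.

gx2, gtx = 5772, 3603

trail = (gx2 - gtx) / gx2 * 100  # GTX measured against the GX2
lead = (gx2 - gtx) / gtx * 100   # GX2 measured against the GTX

print(f"GTX trails the GX2 by {trail:.1f}%")  # ~37.6%
print(f"GX2 leads the GTX by {lead:.1f}%")    # ~60.2%
```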
Bumping the GPU strain up a notch at 1280x1024 resolution, the scores remain relatively comparable in terms of performance ratio. While Gigabyte's GeForce 9800 GX2 completely annihilates the competition, the ZOTAC 8800 GT AMP! Edition maintains the same general performance ratio as it displayed in the 1024x768 tests, largely outperforming the GTS but still playing second-best to the GTX. While the entire G92-based 9-series is PCI Express 2.0 compatible, the older G80-based GPUs are not. There doesn't seem to be any immediate advantage shown in our tests using the Gigabyte GA-X48T-DQ6 motherboard.
Finishing up the series of synthetic benchmark tests under heavy load, Gigabyte's mighty 9800 GX2 prevails with a 40% performance lead over the overclocked MSI 8800 GTX. ZOTAC's 8800 GT AMP! Edition video card showed a prevailing strength against the aging GeForce 8800 GTS in the shader model 2.0 tests, but it tapered off in the more demanding SM 3.0 tests. In these 1600x1200 tests the AMP!'ed GT was outperformed by the GTX by 11% in the high dynamic-range / shader model 3.0 tests, but the GX2 cleared the GTX by over 38%. If you take the 3DMark06 tests at face value, the 9800 GX2 obviously out-classes the other competitors, and the 8800 GT falls right between the GTS and GTX. But in our next section we begin real-world testing on a cadre of popular video games known for taxing the graphics processor, and this lineup might change. First up is Call of Duty 4, so please continue on...

Call of Duty 4 Benchmark Results

Call of Duty 4: Modern Warfare runs on a proprietary game engine that Infinity Ward based on the tried-and-true Quake 3 structure. This engine offers features such as true world-dynamic lighting, HDR lighting effects, dynamic shadows, and depth of field. "Bullet penetration" is calculated by the Infinity Ward COD4 game engine, taking into account things such as surface type and entity thickness. Certain objects, such as cars and some buildings, are destructible. This makes distinguishing cover from concealment important, as the meager protection provided by things such as wooden fences and thin walls does not fully shield players from harm as it does in many other games released during the same time period. Bullet speed and stopping power are decreased after penetrating an object, and this decrease is calculated realistically depending on the thickness and surface of the object penetrated. This version of the game also makes use of a dynamic physics engine, a feature which was not implemented in previous Call of Duty titles for Windows PCs. The new in-game death animations are a combination of pre-set static animations and ragdoll physics. Infinity Ward's use of the well-debugged Quake 3 engine along with the new dynamic physics implementation allows Call of Duty 4 to be playable on a wide range of computer hardware systems. The performance may be scaled for low-end graphics cards, up to 4x Anti-Aliasing and 16x tri-linear anisotropic texture filtering.
Before I discuss the results, I would like to take a moment to mention my general opinion on Fraps software when it comes to game performance benchmarking. If you're not familiar with the software, Fraps (derived from "frames per second") is a benchmarking, screen capture, and real-time video capture utility for DirectX and OpenGL applications. Some reviewers use this software to measure video game performance on their Windows system, as well as record gaming footage. My opinion is that it offers a valid third-party, unbiased alternative to in-game benchmarking tools, but there is one caveat: it's not perfect. Because the user must manually begin the test, the starting point may vary from run to run and therefore skew the results.

In my testing with Fraps v2.9.4 build 7039, I used the cut-scene intro to the coup d'etat scene where Al-Asad takes over control. First I allowed the level to load and let the scene begin for a few moments, then I used the escape key to bring up the menu. Once I selected the restart level option, I would immediately press F11 to begin recording the benchmark data. This scene is nearly four minutes long, but I configured Fraps to record the first 180 seconds of it to remain consistent. Once the scene ended, I would repeat the restart process, for a total of five tests. So within a 0.2-second starting-point margin, all benchmark results are comparable, which is probably as good as it can possibly get with this tool.

So now for the results of my hard work: just taking one look at the chart above, you can see that no other single graphics card comes close to offering the same level of performance as the Gigabyte GeForce 9800 GX2. Because our test sample 790i motherboard has not yet arrived here at Benchmark Reviews HQ, we opted to test single-card products on the Gigabyte GA-X48T-DQ6 motherboard. In our frame rate results, all five of the collected test scores were within 0.75 FPS of one another and averaged for the chart. A few questions have been raised regarding the ZOTAC GeForce 8800 GT 512MB AMP! Edition video card's performance compared to the 8800 GTX. We covered this comparison once, and like other web sites in the industry we found that a reference GT could come close to the performance of a mildly overclocked GTX in certain tests. But later, when we reviewed the AMP! GT, we found the results to be alarmingly close, with the GT sometimes beating the GTX outright. So when we test the GX2 in Call of Duty 4, it doesn't really surprise us too much to see it outperforming the older GPUs with less brute force in their arsenals. In our next section, we shall see if the performance-demanding video game Crysis will help strengthen this position.

Crysis Benchmark Results

Crysis uses a new graphics engine: the CryENGINE2, which is the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX 10) framework of Windows Vista, but can also run using DirectX 9, both on Vista and Windows XP. Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multi-core processor architectures, CPU-intensive subsystems of CryENGINE2 such as physics, networking, and sound have been re-written to support multi-threading. Crysis offers an in-game benchmark tool, similar to World in Conflict.
This short test places a high amount of stress on the graphics card, since there are so many landscape features rendered. For benchmarking purposes, Crysis can mean trouble, as it places a high demand on both GPU and CPU resources.
Low-resolution testing allows the graphics processor to reach its maximum output performance, which thereby shifts demand onto the system components. Even so, Crysis appears to have a preference for the new PCI Express 2.0 graphical interface, even if our results only hint at it. Even without Anti-Aliasing turned on, Crysis keeps the top three competitors around 60 FPS. It's clear that the CryENGINE2 is a heavy hitter, as the ZOTAC 8800 GT AMP! Edition outperforms the GeForce 8800 GTS 640MB by over 26% and the GTX by almost 3%. Even with more vRAM available to them, the older 8800 GTS and GTX just cannot offer the performance of the G92 GPU paired with the PCI Express 2.0 graphics bus. What comes as a surprise to me is how close the single-G92 8800 GT can come to matching the performance of the two G92 GPUs inside the 9800 GX2, which did not shine so brightly in this low-resolution test.
At 1280x1024 resolution, the results are still excellent, but nearing the 30 FPS acceptability threshold for the aging G80-based GeForce 8800 GTS unit. All products maintain the same performance ratio, which still gives the 8800 GT a small frame rate improvement over the GTX, but nowhere near the performance of the 9800 GX2, which is beginning to pull away with more than a 15% lead.
Surprisingly, the three GeForce 8800 series products maintained a rather constant performance ratio among one another throughout the Crysis benchmark testing, while the 9800 GX2 actually improved as the demand increased. At the end of our real-world testing, Crysis was given the GPU-thrashing 16x Q AA performance setting, and we watched the G92 youngsters run circles around the ailing G80 generation. ZOTAC's GeForce 8800 GT AMP! Edition outperformed both G80 GPUs by a significant margin, despite all of the previous tests indicating a much smaller gap. Perhaps the new G92 core architecture is to be credited, or the new PCI Express 2.0 interface, which allows twice as much graphics data bandwidth. Or perhaps MSI's 8800 GTX is barely more than an overclocked GTS. Either way, our benchmarks certainly indicate that while the GTX beat the AMP!'ed GT in the other tests, it doesn't come close under the high pressure of Crysis.
Boasting even more success than the 8800 GT is the unfathomable performance exerted by the 9800 GX2. While the extra load did show an impact on the performance results, the dual-G92 video card walked over the competition, with almost a 42% difference between it and the next closest competitor (the 8800 GT). If you want to play Crysis with bells, whistles, and bagpipes, and you're not in a position to use an SLI array, then the 9800 GX2 is a clear winner.

EDITOR'S NOTE: After many months of using the Crysis demo for testing with the MadBoris Benchmark Tool, we recently started using the full retail version. Our initial tests discovered that non-AA results were identical, but the 16x Q AA test produced very different results. All of our previous results are still good for product comparison, but using the patched retail version of the video game (v1.21) has demonstrated that post-processing effects (offered up to 16x Q AA) were not fully incorporated into the demo (limited to 8x AA). We have decided to address this matter in our Crysis performance comparison: demo vs retail forum discussion, and welcome your comments. In our next section, Benchmark Reviews switches to video-output-only benchmarking, and uses Lightsmark for an apples-to-apples comparison of performance.

Lightsmark Frame Rates

Stepan Hrbek is the mastermind behind Lightsmark 2007, a program that allows you to benchmark real-time global illumination. Natural lighting makes artificial graphics life-like and real. Computers get faster, but rendering more polygons doesn't add value if lighting still looks faked, so insiders know that the next big thing is proper lighting, aka real-time global illumination. Typical workloads in real-time rendering will shift, and Lightsmark simulates this. Global illumination renders often take hours, so is your computer fast enough for real-time? Before Lightsmark, real-time global illumination was limited to small scenes, small resolutions, small speeds, and specially crafted scenes with handmade optimizations. Lightsmark breaks all of these limits at once, running in a reasonably sized scene (220,000 triangles) at high resolutions with excellent speed.
Lighting is computed fully automatically in an original unmodified scene from 2007 game World of Padman. This benchmark is not tweaked for Lightsmark, and contains all sorts of geometrical difficulties with extra rooms hidden below the floor.
This scene places medium to low demands on graphics cards and tests the maximum speed with which the scene can be properly displayed at each resolution. Similar to the other low-resolution tests, Lightsmark doesn't favor the goliath 9800 GX2. In fact, our GeForce 9800 GX2 was outperformed in every single test by the snappy ZOTAC GeForce 8800 GT AMP! Edition video card.
After all of the Lightsmark tests were complete, I wasn't sure what to make of the results. Each test was performed with identical variables, on the same day, and in the same system. With the only differences being the driver version (for the GX2) and the video card itself, it's hard for me to understand how the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 was outperformed by the lightweight of the bunch. In the next section we utilize the lightweight Passmark 3D Graphics test to compare our test group of video cards.

Passmark 3D Graphics Results

3D graphics technology has come on in leaps and bounds over the last few years, and this test measures how fast 3D images can be created and displayed. Microsoft provides a set of Application Programming Interfaces (APIs) called DirectX, which allow developers to create games and other high-performance multimedia applications. DirectX provides support for two-dimensional (2D) and three-dimensional (3D) graphics, sound effects, music, input devices, and networked applications such as multiplayer games. The Advanced 3D Graphics Test has been designed to benchmark how well your video card performs when using the most common features of DirectX. It renders a number of spheres to the screen in windowed or full-screen mode. As such, Performance Test requires DirectX version 9 or above. Apart from individual graphics card speeds and abilities, the test illustrates a single video card's drop in performance as the rendered scene becomes more complex. A scene with more objects, more textures, and more DirectX features implemented may well look more impressive, but will more than likely result in a reduction in frame rate.
While I love using Passmark's Performance Test suite to benchmark system memory because of its consistent results, I am not as impressed with the light-load graphics test it offers. Sure, it runs through the various 3D chores a computer might encounter, but these aren't nearly as demanding as what higher-end video games will require. The test does prove one theory I have developed: the G80 GPUs are optimized for higher-volume, low-demand graphical calculations, whereas the G92-based Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 video card is designed to eat up high-demand calculations at a faster rate. Put into a comparative analogy, the G80 is a pickup truck running through fences, and the G92 is a dump truck powering through buildings. In our lightweight Passmark 3D Graphics test, the 9800 GX2 rendered the worst results of the bunch. You'll notice this has been a recurring theme throughout our testing, where the low-resolution advantage is not as broad as in the high-resolution tests. In the next section Benchmark Reviews gives a detailed look into GPU performance in Supreme Commander: Forged Alliance.

Supreme Commander: Forged Alliance Results

Supreme Commander: Forged Alliance is a standalone real-time strategy computer game expansion to Supreme Commander, developed by Gas Powered Games and published by THQ. Because it is a standalone expansion, it is possible to play without owning Supreme Commander. Forged Alliance adds new gameplay features, several new units for the three preexisting factions, and further optimization for increased performance beyond that of the original game. Supreme Commander makes extensive use of two technologies relatively unused in video games prior to its release, namely multi-core processing and multi-monitor displays. When detecting a multi-core processor, the game assigns a specific task, such as AI calculations, to each core, splitting the load between them. Supreme Commander is one of the first games to specifically support dual- and quad-core processors.
Unlike World in Conflict, Supreme Commander: Forged Alliance does not use a short in-game benchmark to determine a score. In these tests, Supreme Commander plays an entire round of the game from start to finish and generates composite scores based on this lengthy test. This composite score is based on two factors: sim and render.
Supreme Commander: Forged Alliance may not offer the first-person shooter experience that many gamers prefer, but its graphics are among the most demanding possible. Even so, a trend begins to emerge: the game places a high demand on the graphics card, as evidenced by uniformly low minimum frame rates. The average frame rate showed promise for the GeForce 9800 GX2, while the 8800 GT and GTX matched each other's performance output. Supreme Commander proved to be a harsh gaming engine for video cards, and Crysis certainly applied some heavy pressure, but let's see how World in Conflict holds up against our performance testing in the next section.

World in Conflict Benchmark Results

The latest version of Massive's proprietary Masstech engine utilizes DX10 technology and features advanced lighting and physics effects, and allows for a full 360-degree range of camera control. Massive's Masstech engine scales down to accommodate a wide range of PC specifications; if you've played a modern PC game within the last two years, you'll be able to play World in Conflict. World in Conflict's FPS-like control scheme and 360-degree camera make its action-strategy gameplay accessible to strategy fans and fans of other genres... if you love strategy, you'll love World in Conflict. If you've never played strategy, World in Conflict is the strategy game to try. World in Conflict offers an in-game benchmark which records the minimum, average, and maximum frame rates during the test. Very recently another hardware review website made the assertion that these tests are worthless, but we couldn't disagree more. When used to compare video cards which are dependent on the same driver, the in-game benchmark works very well, and the comparisons are apples-to-apples.
First tested was the 1024x768 resolution in WiC. Based on the test results charted above, it's clear that WiC doesn't place a limit on the maximum frame rate (which would conserve wasted power), something that is good for full-spectrum benchmarks but bad for electricity bills. The critically important minimum frame rate results indicate a decisive lead for the ZOTAC GeForce 8800 GT AMP! Edition video card, which later tapered down behind the Gigabyte GeForce 9800 GX2 and the overclocked MSI 8800 GTX, respectively. A cautionary word about maximum frame rates is necessary, however. Although these readings are worth noting, the maximum frame rate is nearly worthless in determining GPU power. The reason for this is simple: those maximum frame rates are collected from scenes with little to no movement and practically no graphical processing demand. Obviously this shifts the importance over to the minimum frame rate, which indicates how smooth the performance will remain under heavy demand.
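Why the fixation on minimum frame rates? Frame rate is the reciprocal of frame time, so a dip below 30 FPS means individual frames taking longer than 33 ms, which the eye reads as stutter even when the average looks healthy. A quick sketch of the conversion:

```python
# Convert frame rates to per-frame render times to see why minimums matter.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (25, 30, 60, 100):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
```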
With a balanced demand for CPU and GPU power, the 1280x1024 resolution proved to be the turning point for performance. The MSI 8800 GTX was not powerful enough to outperform the 9800 GX2, but its average frame rate trailed by only 10 FPS, a 17% performance advantage for the GX2. Also notice how the GeForce 9800 GX2 posts minimum frame rates not very much higher than the others, and barely above the 30 FPS mark, which shows just how demanding World in Conflict remains even at moderate settings.
At the highest graphics quality settings, the World in Conflict Masstech engine begins to really strain all of the GeForce products. At 1600x1200 resolution, none of these video cards can deliver a 30 FPS minimum frame rate, which is a little discouraging. Taking a broader look at the average frame rate, the overclocked MSI 8800 GTX trails the GeForce 9800 GX2 by a more substantial 16 frames per second. The GeForce 8800 GTX still proves that it has game, with a 5 FPS advantage over the AMP!'ed 8800 GT. Much like 3DMark06, World in Conflict seems to place the GT firmly between the GTS and GTX. Yet unlike the other tests, where the GX2 usually pulled well ahead of the pack under the most stressful settings, that was not the case for WiC. Please continue on to the next section, where we discover that overclocking the GX2 may not be in your best interest.

9800 GX2 Overclocking Results

Any other graphics card that we wanted to overclock would be put through the steps outlined in our Guide To Overclocking the NVIDIA GeForce Series. But this is no ordinary video card; this is the GeForce 9800 GX2, which mounts two independent GPUs together with a PCI Express bridge to form a single self-contained SLI array. Because of this configuration, it is impossible to flash individual BIOS information to each GPU without modification, so we must rely on third-party software to accomplish our overclocking tasks. We returned to our affiliate techPowerUp! to download the free overclocking utility ATITool, which works on all brands of GPU. Once we established our standard operating temperatures at idle and under load prior to any changes, we had a good idea of the tolerances once the overclocking began.

Once our trial-and-error attempts to squeeze more performance out of the 9800 GX2 were complete, we discovered some incredible results. Using ATITool we overclocked the Gigabyte GV-NX98X1GHI-B GPUs from 600 MHz up to a very stable 715 MHz, which yields an incredible 115 MHz gain. The GDDR3 vRAM was stretched just as far, moving from 1000 MHz to 1115 MHz. While it should be considered common knowledge, the results we attained may not be similar to the results of your own experiments. Benchmark Reviews does not recommend that you risk damage to your product by overclocking beyond the manufacturer's default settings. Furthermore, neither Benchmark Reviews, Gigabyte, nor NVIDIA will honor any product warranty damage claims that result from overclocking.

The smaller resolutions used in our benchmarks are fine for most tests, but for our overclock results we will concentrate on the 1600x1200 resolution. In 3DMark06 we found that our overclocked 9800 GX2 performed 11.14% better in the shader model 2.0 tests, and 12.31% better in the HDR tests. While there is not much of a difference between the two tests, it seems that the 9800 GX2 offers better performance in the newer and more demanding tests. We'll see if this holds true in the remaining benchmarks.
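Worth noting before the game results: a 115 MHz core bump is a 19.2% overclock, yet the 3DMark06 scores improved only 11-12%, a first hint that something other than raw clock speed limits the GX2. The quick arithmetic:

```python
# Clock gains versus the 3DMark06 gains they bought (figures from the text).

core_stock, core_oc = 600, 715   # MHz
mem_stock, mem_oc = 1000, 1115   # MHz

print(f"Core overclock:   {(core_oc - core_stock) / core_stock:.1%}")  # 19.2%
print(f"Memory overclock: {(mem_oc - mem_stock) / mem_stock:.1%}")     # 11.5%
print("3DMark06 gains:   11.14% (SM 2.0), 12.31% (HDR/SM 3.0)")
```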
Running the Crysis demo at 1600x1200 with high-quality settings and 16x Q AA enabled, the 9800 GX2 is going to be earning its keep. Our overclocked results were only 6.33% better than stock, resulting in a meager 3 frames per second improvement. The upside is that regardless of the extreme settings, the 9800 GX2 performs above the acceptable frame rate range of 30 FPS.
Lightsmark is great when you need precision comparisons. Tested at 1600x1200, the GX2 barely offered an advantage. Although a 16 FPS improvement seems decent, it only amounts to 4.34%.
During our benchmarks of Supreme Commander: Forged Alliance, we discovered that many of our tests indicated a frame rate decline with the overclocked 9800 GX2. The minimum frame rate result is the most important, and it dropped by 1 FPS, while the average remained the same. Somehow the GX2 posted an additional frame per second in the impractical maximum frame rate.
One of the few tests that indicated potential gain was Passmark's Performance Test. The 3D Graphics benchmark runs through three different test scenes with different resolutions and screen activity. The final score is represented as a Graphics Mark. While it's not much to talk about elsewhere, here the overclocked GX2 did offer a solid 10% improvement.
Similar to Supreme Commander, our experience with World in Conflict mirrored the previous results. The important minimum frame rate remained the same, but the average dropped 2 FPS. Again, the maximum frame rate improved for low-motion scenes, with an additional two frames. Despite an additional 115 MHz added to both the GPU and vRAM speeds, our overclocking efforts proved inconclusive. Adding a few extra frames onto the maximum frame rate is completely worthless, unless of course you place value on a paused or low-graphics scene. The most critical frame rate is the minimum, and our overclocked 9800 GX2 actually lost frames there. For now, I don't have a validated explanation for this; perhaps someone can offer some tested insight in our Discussion Forum. In the next section, Benchmark Reviews posts the operating temperature results and power consumption figures for the Gigabyte 9800 GX2 video card.

GeForce 9800 GX2 Temperatures

This section is probably the most popular for me as a reviewer. Benchmark tests are always nice, so long as you care about comparing one product to another. But when you're an overclocker, or merely a hardware enthusiast who likes to tweak things on occasion, there's no substitute for good information. Benchmark Reviews has a very popular guide on How To Overclock the NVIDIA GeForce 8800 Series, but it was published shortly after the 8th generation of GeForce products launched. Currently we are preparing a more updated article, with additional information on shader overclocking and temperature control, as the newest 9th-generation GeForce products are made available. Once published, you can expect more detailed information than what is shown below; for now, the temperatures depicted are GPU core temperatures at idle and under load.
To begin my testing, I used ATITool v0.26 to record GPU temperatures at idle and again in high-power 3D mode. The ambient room temperature was a comfortable 20.0°C and the inner-case temperature hovered around 32°C. At default speeds the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 video card recorded 59°C in idle 2D mode, and increased to 78°C in full 3D mode. After recording the GPU temperatures at stock settings, I measured the temperatures reported by ATITool once overclocked. With the extra heat added by the overclock, the 9800 GX2 rested at 62°C in idle 2D mode, and increased to 80°C in full 3D mode. Overall, this increase is negligible. While heated air is still an issue because most of it gets re-circulated back into the case, a small portion is exhausted outside. While 80°C isn't terribly hot, it's not nearly as cool as the temperatures we saw from the ZOTAC GeForce 8800 GT AMP! Edition, which is based on the same GPU. Even so, recirculated air is still exactly that, and the unfortunate truth is that it's exhausted into an area close to the expansion slot mounting plate, where very few fans can cool. Fortunately for me, however, the Lian Li PC-B20A Aluminum Mid-Tower ATX Case I used to record these results includes the optional BS-03 kit, a patented cooling-slot attachment which mounts directly in front of the exhaust port on the graphics card and draws the heated air directly out of the case.

9800 GX2 Power Consumption

Planet Earth needs our help, badly. With forests becoming barren of vegetation and snow-capped poles quickly browning, the technology industry has a new attitude towards becoming "green". I'll spare you the marketing hype that I get to sift through from the various manufacturers, and get right to the point: your CPU does a lot more to save the planet than your GPU does, but NVIDIA is working to change that. While current Intel central processing units use a power-efficient 45nm die process, the graphics processor is a bit slower to catch the technology curve and presently only shrinks to 65nm in the G92. Below is a chart with the total watts consumed by our specified test system:
NVIDIA has designed the G92 graphics processor to be an efficient cornerstone of the 9th generation of GeForce products, which inherently gives it an efficiency advantage, but we have yet to hear of features like SpeedStep-style clock scaling for the GPU. It would also be a nice idea to have Gigabyte's Dynamic Energy Saver (DES) functionality on a graphics card. Please continue to the review conclusion in the next section, where I share my thoughts on the Gigabyte GV-NX98X1GHI-B GeForce 9800 GX2 and give my opinion of the new high-end king.

GeForce 9800 GX2 Final Thoughts

With the introduction of NVIDIA's new GeForce 9800 GX2 top-end graphics card, the deck has been shuffled once again and gamers have a new tool available to help improve their performance. This will undoubtedly cause quite a stir, since the older 8800 GTX and Ultra offer far less performance but still carry the high price tag. While this product series is squarely aimed at the upper high-end performance segment, there is more than enough value to see some enthusiasts purchase two units for a quad-SLI array. Some (but not many) gamers once stepped up to the overpriced GeForce 8800 Ultra only to find themselves disappointed by performance on par with an overclocked GTX. That's not happening this time around; the new 9800 GX2 delivers on the promises it makes.
For most video cards, functionality is measured in only one application: video games. However, in rare cases (this being one of them) the video card can suit more than just one purpose. The GeForce 9800 GX2 includes native HDMI video output and offers digital audio output through the attached S/PDIF audio cable, making this the closest thing to fully-functional native HDMI that NVIDIA offers. Since the days of Battlefield 2 there haven't been very many games that seriously stress mid- and high-performance video cards. Battlefield 2142 was more of a lukewarm crowd-pleaser with nearly no landscape to speak of, and until EA and Crytek came along with Crysis there hadn't been any major milestones for almost three years. Company of Heroes was (and to some players still is) one of the most popular games of 2006, but its scalable Havok game engine allowed just about anyone with a personal computer to play. World in Conflict could very well be characterized as the CoH of 2007, especially since CoH: Opposing Fronts offered almost nothing new to gamers in terms of performance demands. WiC is equally scalable, but its large world-scape can have a greater impact on frame rate. In 2008 it appears that the Quake 3-based game engine in Call of Duty 4 is making headlines with superior gameplay and graphical delivery. When it comes down to PC video games, there are few titles that stand out more than those I have tested in this review. The important message is that the GeForce 9800 GX2 eats them up and spits out nothing less than the highest frame rates possible.

Gigabyte GV-NX98X1GHI-B Conclusion

Gigabyte has convinced me that all products should be as colorful as theirs. The retail box offers an inviting design and attractive layout, along with some product data on the back. Like the other Gigabyte products we have reviewed, there is an underlying sense that they are in tune with the visual attraction a consumer has to products. It's responsible of them to add important consumer details to the packaging to help buyers make informed purchases. When Benchmark Reviews tested the GA-X48T-DQ6 motherboard I was surprised by the myriad of colors popping out at me. With the GV-NX98X1GHI-B, though, I think the NVIDIA reference design had more influence over Gigabyte's product appearance. While I never really considered the entire pre-G92 GeForce 8800 series to be very attractive as a whole, primarily because of the awkward half-covered products, the 9800 GX2 has only slightly improved upon the general appearance with clean corners and flat surfaces. Unlike the past generation of products, this GeForce video card does not use its LED lights as mere accents; they serve as functional indicators of hardware status. In the not-so-distant past I had to replace my GeForce 8800 GTX because an errant SATA cable swiped off one of the capacitors. At that moment, I felt that NVIDIA definitely should have done something more to protect the electronics on their product. Unlike the higher-end 8800-series GeForce products, the 9800 GX2 does not expose any electronic components; it goes one step further and encases them in a metal chassis. Gigabyte has engineered the GV-NX98X1GHI-B GeForce 9800 GX2 to sustain above-average abuse, and since there are no exposed components there is very little chance that you'll have to RMA this product because it falls apart on you.
The fully enclosed 9800 GX2 will work very well in cramped environments where the video card will be in contact with cables and components, so long as it can fit. In regards to performance and functionality, high praise is due to the GeForce 9800 GX2 video card. It has been over one year since NVIDIA last crowned a new performance king, and now the GX2 can replace the 8800 Ultra on the throne. With a combined 1 GB video frame buffer and two high-power G92 GPUs operating at 600 MHz, it doesn't come as a huge surprise that the GeForce 9800 GX2 easily outperforms every other single video card by over 33%. If that wasn't enough, this video card comes ready to support full HDMI audio and video output for your high-definition, copyright-protected material. Even though the GeForce 9800 GX2 product launch is only a few days old, there is plenty of inventory available to retailers. At the time of this writing, the Gigabyte GV-NX98X1GHI-B has just reached the market and is available at NewEgg for $549.99. With some additional searching, we also found this product at ClubIT for $549.99. Prices on the GeForce 9800 GX2 are expected to slowly drop as the product becomes more common on the market. PRICE UPDATE 20 November 2008: Many retailers have stopped selling the GeForce 9800 GX2 video card; however, you can still find bargain prices as low as $299 for this product using our price comparison tool. In summary, the Gigabyte GeForce 9800 GX2 graphics card offers full HDMI audio and video output, nearly 40% performance improvement in Call of Duty 4 and almost 63% in Crysis over an already-overclocked 8800 GTX, good cooling, and "quad" SLI potential. While value is a relative subject, the performance and functionality lend real credence to the product cost. If you're a gamer on a budget, look forward to the upcoming 9800 GTX. But if you're building an extreme gaming system, the Gigabyte GV-NX98X1GHI-B 9800 GX2 is going to be the graphics platform for you.

Pros:
+ Excellent performance for extreme gamers

Cons:
- Heatsink does not completely exhaust outside the case

Ratings:
Final Score: 8.75 out of 10.

Quality Recognition: Benchmark Reviews Silver Tachometer Award.

Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.
Related Articles: