ZOTAC GeForce 9800 GTX 512MB Video Card
Written by Olin Coles
Tuesday, 01 April 2008
ZOTAC GeForce 9800 GTX

On April 1st, 2008 NVIDIA will officially launch the GeForce 9800 GTX. It's been over sixteen months since the GTX series was last launched, and with such a successful debut of the 8800 GTX back in 2006 there is a lot of skepticism surrounding the new 9800 GTX. Enthusiasts may recall that the last time around NVIDIA launched their new generation of graphics cards with the GTX and GTS models, and later followed up with some mid-range offerings and one slightly faster "Ultra". This time around though, the playbook looked a lot different. First came the lower mid-level 9600 GT, and then the ultra-high level GeForce 9800 GX2, which utilized two G92 GPU cores. Benchmark Reviews has been fortunate enough to test the performance of ZOTAC's new GeForce 9800 GTX 512MB video card in this article, model ZT-98XES2P-FSP.

It seems like it was just yesterday that I bought my first discrete graphics card to outfit an overclocked Cyrix M2-300 6x86MX-based computer. Back in those Windows 98 (first edition) days of 1998 the term GeForce wasn't even in existence yet, and NVIDIA was still referred to as nVidia. So when I bought my first computer late that year, I would have never thought Quake II played on my RIVA TNT2 AGP video card would mark the last time I would spend money in an arcade. That was nearly ten years ago, and since then NVIDIA has developed several successful GeForce product lines, including the newly launched 9th generation.
In the past, Benchmark Reviews has compared GeForce 8800 Graphics Performance: GT vs GTS vs GTX. In that article, it was shown that the more affordable 8800 GT could easily beat a heavily-overclocked 8800 GTS and close the gap with the far more expensive 8800 GTX. Not much later we tested the ZOTAC GeForce 8800 GT AMP! Edition HDMI video card, which in many tests performed very near to the more expensive 8800 GTX. But now that the 9800 series has its third product offering, it seems as though the 8800 series is so... last generation. Don't think that the new name will somehow convince us that it is an inherently better product; we still plan to test just how the new 9800 GTX fits into all of this. Since several of the former heavyweight products are now threatened with replacement by the new GeForce 9800 GTX, there is a lot of concern about how well it performs against the older 8800 GTX and Ultra which it supersedes. Gamers want to know if the GX2 is worth the money, or if they should wait. Making this decision a little more difficult is yet another change to the market. As if there wasn't enough competition already in the high-end segment of the 3D graphics market, on April 1st there will be one more addition to the 9th generation family, named the GeForce 9800 GTX.
Powered by the NVIDIA G92 graphics processor originally introduced with the GeForce 8800 GT series, the ZOTAC GeForce 9800 GTX video card takes the GeForce family one step higher. The new PCI Express 2.0 interface sends data at rates up to 5.0 GBps, feeding the 512 MB video frame buffer for smoother performance and realistic textures in PC games. The 1100 MHz GDDR3 video memory on the GTX communicates with the 675 MHz G92 graphics processor through a 256-bit memory interface. For an extra performance boost during intense gaming situations, NVIDIA has designed the GTX to offer 128 stream processors operating at 1688 MHz. Compared to the older PCI Express x16 bus which it replaces, the new PCI Express 2.0 interface delivers twice the data throughput of the previous generation. In the new generation of PCI Express 2.0 compatible motherboards, such as the Gigabyte GA-X48T-DQ6 we used for testing, this technology delivers bleeding-edge graphics while remaining backwards compatible with older PCI Express x16 motherboards.

Many hardware enthusiasts have already read the early leaked reviews surrounding the 9800 GTX, and have been asking some important questions about NVIDIA's newest product. Because the list of improvements is not exactly a major step up from previous products, gamers are wondering why they should make the move. Here's NVIDIA's answer to that question:
A few months back we reviewed ZOTAC's GeForce 8800 GT AMP! Edition HDMI video card, which used a DVI-to-HDMI adapter and S/PDIF audio input cable to stream full HDMI audio and video output for the first time in any NVIDIA product. Then just weeks ago the GeForce 9800 GX2 was launched with the same HDMI functionality and features. HDMI is back again (although not a native interface) in the GeForce 9800 GTX. Benchmark Reviews will test the new ZOTAC GeForce 9800 GTX discrete graphics card (model ZT-98XES2P-FSP) against the most widely used NVIDIA products it competes with. Below is a chart with the most recent high-performance offerings from NVIDIA.
About the Company: ZOTAC International (MCO) Limited
ZOTAC International (MCO) Limited was established in 2006 with a mission to deliver superb-quality NVIDIA graphics solutions to the industry, with strong backing from its parent group, PC Partner Ltd. The company is headquartered in Hong Kong, with a factory in mainland China and regional sales offices in Europe, Asia Pacific and North America. The support ZOTAC provides is currently the largest of its kind around the world. With 40 SMT lines, 6,000 workers and 100,000 square meters of facilities, ZOTAC features a full array of state-of-the-art facilities and machinery. In addition, ZOTAC has over 130 R&D professionals in Hong Kong and China, plus warranty and service centers in strategic countries to enable effective and efficient worldwide as well as localized sales and marketing support. ZOTAC with NVIDIA not only means superb quality, it also means high performance, absolute reliability and great value. In the past year, ZOTAC products were compared and tested by several influential members of the media and have proven to be good-quality, worth-buying graphics cards in the market. With product features such as overclocked performance, excellent cooling properties and unique packaging, ZOTAC products definitely exceed users' expectations. ZOTAC's commitment to our users is to bring the latest products quickly to the market with the best value. Needless to say, ZOTAC is the right choice for those who require high-quality graphics solutions. For additional information please visit the ZOTAC website.

GeForce 9800 GTX Features

Backed by NVIDIA's Lumenex Engine, the GeForce 9800 GTX delivers true 128-bit floating point high dynamic range (HDR) lighting capabilities with up to 16x full-screen anti-aliasing. Second-generation NVIDIA PureVideo HD technology with HDCP compliance delivers the ultimate high-definition video viewing experience to the ZOTAC GeForce 9800 GTX ZT-98XES2P-FSP graphics card. With hardware decoding for Blu-ray and HD DVD formats, PureVideo HD technology lowers CPU utilization when watching high-definition video formats by decoding the entire video stream in the graphics processor, freeing up the processor for other tasks. In addition to low CPU utilization, PureVideo HD enhances standard definition video content with de-interlacing and other post-processing algorithms to ensure standard DVD movies look their best on the PC screen and high-definition television sets. High-definition content protection, or HDCP, technology ensures a secure connection between the GeForce 9800 GTX graphics card and an HDCP-capable monitor for viewing protected content such as high-definition Blu-ray or HD DVD movies. Coupled with PureVideo HD technology, the 9800 GTX delivers the ultimate multimedia experience. HDMI technology allows users to connect PCs to high-definition television sets with a single cable, delivering high-definition surround sound audio and video with resolutions up to 1080p. PureVideo HD technology scales video in the highest quality up to resolutions of 2560x1600 - from standard and high-definition file formats - while preserving the details of the original content. PureVideo HD technology also accelerates high-definition video decode, freeing up CPU cycles while watching high-definition Blu-ray and HD DVD movies or other VC-1 and H.264 encoded file formats.

NVIDIA Unified Architecture
NVIDIA Lumenex Engine
NVIDIA Quantum Effects Technology
NVIDIA Triple-SLI Technology
NVIDIA PureVideo HD Technology
Along with world-class video acceleration, PureVideo HD has been at the forefront of advanced video post-processing. With the R174 series driver, we are introducing new features for PureVideo HD on the GeForce 9800 GTX. These new features, Dynamic Contrast Enhancement and Dynamic Blue, Green, and Skin Tone Enhancements, are extremely computationally intensive and not found on even the most high-end Blu-ray or HD DVD players. But by tapping into the enormous pool of computational power offered by our processor cores, we can now enable post-processing techniques that have yet to be realized in fixed-function video processors.
Advanced Display Functionality
Dynamic Color Enhancement

By analyzing the color components of each frame, we can also isolate and improve the appearance of blue, green, and skin tones, which the human eye is particularly sensitive to. Unlike televisions which have built-in image processors, PC monitors typically display the input picture without any processing, which can result in comparatively dull images. Dynamic blue, green, and skin tone enhancement alleviates this problem by applying correction curves to these sensitive colors. The result is improved tonal balance and clarity, without over-saturation.

Built for Microsoft Windows Vista
High Speed Interfaces
Operating Systems
API Support
NVIDIA Hybrid SLI Technology

Benchmark Reviews learned of Hybrid SLI during our time with NVIDIA at the 2008 International CES. I thought that seeing the Stereoscopic 3D Gaming demonstration would be the highlight of NVIDIA's offerings, but then the following morning they proved to have at least one more trick up their sleeve. At CES we were privileged to see Hybrid SLI make its formal debut. NVIDIA announced the industry's first hybrid technology for PC platforms, Hybrid SLI™, which addresses two critical issues: increasing graphics performance and reducing power consumption. NVIDIA Hybrid SLI technology will be incorporated into a wide variety of graphics and motherboard desktop and notebook products that the company is rolling out for both AMD and Intel desktop and notebook computing platforms throughout 2008. "From the introduction of programmable GPUs to the rapid adoption of our multi-GPU SLI technology, NVIDIA has repeatedly pioneered and innovated to solve difficult problems for the industry. We believe Hybrid SLI technology is one of the most important innovations we've come up with to date," said Jen-Hsun Huang, CEO of NVIDIA. "Hybrid SLI delivers new multi-GPU technology to a large segment of the PC market, delivering consumers a level of PC graphics performance and power efficiency never before seen."
First disclosed in June 2007, NVIDIA Hybrid SLI technology is based on the company's market-leading GeForce graphics processing units (GPUs) and SLI multi-GPU technology. Hybrid SLI enables NVIDIA motherboard GPUs (mGPUs) to work cooperatively with discrete NVIDIA GPUs (dGPUs) when paired in the same PC platform. Hybrid SLI provides two new technologies, GeForce Boost and HybridPower™, that allow the PC to deliver graphics performance for today's applications and games when 3D graphics horsepower is required, or transition to a lower-powered operating state when it is not.

NVIDIA HybridPower Technology

For lower energy consumption and quieter PC operation, HybridPower allows the PC to switch processing from a single GPU or multiple GPUs in SLI configuration to the onboard motherboard GPU. HybridPower is most useful in situations where graphics horsepower is not required, such as high-definition movie playback on a notebook platform or simple e-mail or Internet browsing on a desktop. It is also beneficial for those users who want a quiet operating state with reduced thermals and noise. For notebooks, HybridPower can also dramatically extend battery life by up to 3 hours. When a game or application is started that requires the additional 3D horsepower, the PC can automatically transition back to the discrete graphics cards and power up the 3D capabilities, all transparently to the end user. In applications where 3D performance is required, GeForce Boost turbo-charges 3D operation by combining the processing power of the traditional NVIDIA GeForce-based graphics card with that of the second GPU integrated into the motherboard core logic. In media-rich applications, both GPUs work in tandem to render the combined images, with the end user benefiting from the increase in performance and frame rate. For typical games and 3D applications, GeForce Boost can kick in automatically, resulting in a greatly enhanced consumer experience.
When coupled with a HybridPower-enabled motherboard, the GeForce 9800 GTX can be powered down completely. For everyday computing and watching HD movies, the motherboard GPU is used and the GeForce 9800 GTX can be turned off, consuming no power at all. When an intensive 3D application is engaged, users can turn on the GeForce 9800 GTX for maximum performance. HybridPower works by sending the output of the discrete GPU through the output connector on the motherboard. This allows the system to use both GPUs as it sees fit without physically changing the connector.
NVIDIA is the recognized market leader for GPU desktop and notebook solutions for both Intel and AMD platforms and has a full lineup of Hybrid SLI-capable graphics and motherboard products planned for 2008. New Hybrid SLI-capable products include the upcoming NVIDIA nForce 780a SLI, nForce 750a SLI, and nForce 730a media and communication processors (MCPs) for AMD CPUs, which will be released next month, as well as the new GeForce 8200, the industry's first micro-ATX motherboard solution with an onboard Microsoft DirectX 10-compliant motherboard GPU. NVIDIA Hybrid SLI notebooks as well as desktop products designed for Intel CPUs will be available next quarter. Look for Hybrid SLI to make its way into everything NVIDIA produces from this point forward.

ZT-98XES2P-FSP Specifications

Coupled with PureVideo HD technology, the ZOTAC GeForce 9800 GTX 512MB graphics card delivers an astounding multimedia experience. The GeForce 9800 GTX features two dual-link, HDCP-enabled DVI-I outputs for connection to analog and digital PC monitors and HDTVs, a 7-pin analog video-out port that supports S-Video directly, plus composite and component (YPrPb) outputs via an optional (and included) dongle.
Bus Support
3D Acceleration
Others
Dual-Stream Decode

Recently, studios have begun taking advantage of the additional space high-definition media such as Blu-ray and HD DVD discs provide by adding dual-stream picture-in-picture functionality to movies. Often the PiP content is coupled with advanced BD-J (Java) or HDi (XML) features, so taking the processing burden off of the CPU is even more important for titles with these advanced features. The latest PureVideo HD engine now supports dual-stream hardware acceleration, which takes the workload off of the CPU and gives it to the more powerful GPU.

G92 Graphics Processing Unit
Memory
HDCP over dual-link allows video enthusiasts to enjoy high-definition movies on extreme high-resolution panels such as the 30" Dell 3007WFP at 2560 x 1600 with no black borders. The GeForce 9800 GTX also provides support for HDMI output, using a certified DVI-to-HDMI adaptor in conjunction with the built-in SPDIF audio connector.
Aero with HD DVD and Blu-ray Playback
Until now, users have been unable to take advantage of the Aero user interface in Windows Vista while playing HD video. When this was attempted, Vista would revert back to a basic theme and Aero would be disabled. With HDMI support the 9800 GTX-based graphics solution is among the fastest graphics cards available, and when paired with a 7 Series NVIDIA nForce motherboard, creates the latest in a line of powerful NVIDIA gaming platforms. Be blown away by scorching frame rates, true-to-life extreme HD gaming, and picture-perfect Blu-ray and HD DVD movies.

GeForce 9800 GTX Closer Look

The ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX uses a dual-slot design with improved clearance around the fan for optimal cooling and airflow. The board is cooled with an exceptionally quiet on-board "smart" fan; even when playing the most intensive 3D games, the GeForce 9800 GTX remained whisper quiet. Since I'm sure you just closely read through the myriad of features and specifications, you already know that you can use the 9800 GTX for something other than playing video games. The HDMI functionality is a new direction for NVIDIA graphics cards, and paired with the smart fan design and external exhaust ventilation, the GeForce 9800 GTX will find itself at home in HTPCs too. ZOTAC's 9800 GTX comes in one color: black. This little gem was tough to photograph, so don't be too upset with my images. In contrast to the equally dark Gigabyte 9800 GX2 we just tested, the new 9800 GTX looks a whole lot more exciting. It replaces the 8800 GTX (and Ultra), and measures exactly as long as those video cards. The key differences surround the blower fan region, which is contoured to enhance airflow and reduce obstructions.

Similar to the NVIDIA reference design, ZOTAC has sculpted their version of the GeForce 9800 GTX with the same overall appearance. Utilizing a glossy piano-black shell to encase the G92 GTX GPU, the delicate electronics inside are kept safe from accidental impact damage. I still feel the sting of a past incident where an accidental snag loosened a critical electronic component from the PCB of our older GeForce 8800 GTX in-between tests, which resulted in skewed results and some nasty fan mail. Thanks to the new well-conceived design, those worries are all behind us now.

The ZOTAC ZT-98XES2P-FSP graphics card is a performance-optimized high-end card on every level. Power is taken from the PCI Express host bus as well as from two 6-pin PCI Express power connectors. Without any auxiliary power provided to the GeForce 9800 GTX graphics card, an LED on the bracket will shine red and the graphics card will not boot. In addition, the connector that is not adequately powered will turn red. Together this new functionality offers immediate feedback for enthusiasts concerned about providing adequate power to the GPU. In the past, low/no auxiliary power situations sounded a piezo buzzer which was so loud you could often mislocate the origin of the alarm.
Because the HDMI audio functionality is controlled at a hardware level, there is no need for additional drivers or software. Much like the SPDIF connection on the back of a motherboard, the video card's audio-out function is plug-n-play. The S/PDIF cable included with the kit connects between a small two-pin port on the power-connection side of the unit (near the green GeForce chevrons) and the HT Omega Claro Plus+ AD8620BR Op Amp sound card we used for testing. Your setup may be different, so the cable may connect between the 9800 GTX and either your motherboard or sound card digital input pins. Not all motherboards and sound cards support this option, so make sure it's available before you make your purchase. The 9800 GTX, unlike previous-generation NVIDIA cards, is equipped with the PureVideo 2 engine for GPU-assisted decoding of the H.264 and VC-1 codecs.

In the image above you would be keen to notice two SLI connections. NVIDIA has designed the GeForce 9800 GTX to operate in a 3-way SLI configuration, which they have tested to be faster than a pair of GeForce 9800 GX2 cards in Quad SLI for certain applications and resolutions. The big question gamers and hardware enthusiasts will need to answer for themselves is whether their configuration will support this functionality in terms of power supply, case, and cooling.
At this stage of early product release, ZOTAC doesn't offer any real difference from the NVIDIA reference design in their ZT-98XES2P-FSP product. In actuality, both are virtually identical and offer the same GPU, GDDR3, and shader clock specifications. Aside from the ZOTAC decal (which reminds me of Dural from Virtua Fighter) on the top of the 9800 GTX, they are nearly one and the same. Since NVIDIA first launched the GeForce 9 series, it appears that they have restricted add-in card partners from offering overclocked products. Eventually I suspect that the market will either see a factory-overclocked GTX if NVIDIA allows it, or they might just save their top-binned GPUs and potentially announce the 9800 Ultra.
One of the primary problems encountered with the 8800 GTX and Ultra was the tight confinement they created inside the case, which often led to poor cooling. NVIDIA has redesigned the tail end of the 9800 GTX using contours to help open up air channels. The opening at the rear of the card is hollowed to allow supplemental cooling-air intake, resembling a jet's intake manifold.
Please continue on to the next section where Benchmark Reviews takes a detailed look at the ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX 512MB video card.

ZOTAC 9800 GTX Detailed Features

The GeForce 9800 GTX (G92) GPU is manufactured using 65nm technology, employing a total of 754 million transistors, making it the most complex GPU ever created. Featuring 128 processor cores operating at 1688 MHz, the GPU pushes an astounding 432 GigaFLOPS. Each processor core is capable of being dynamically allocated to vertex, pixel, and geometry operations for the utmost efficiency in GPU resource allocation, and maximum flexibility in load balancing shader programs. Working alongside the processor cores are 64 texturing processors (eight texture processors per shader block), each capable of one addressing and filtering operation per clock. With a peak bilinear fill rate of 43.2 gigatexels per second, it offers unprecedented texturing performance for any GPU. The G92 chip features sixteen render back-end units (ROPs) with full support for 128-bit high-dynamic-range rendering and NVIDIA's exclusive 16x Coverage Sampling Antialiasing (CSAA) algorithm. The ROP compression system has also been enhanced to improve performance at extreme resolutions such as 2560 x 1600. The enhanced compression will help keep memory usage in check and improve performance in high-resolution, antialiased scenarios.
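For readers who like to see where those headline figures come from, they can be reproduced from the clocks and unit counts quoted above. The sketch below is our own back-of-the-envelope arithmetic, and it assumes each stream processor retires one multiply-add (two floating-point operations) per clock; NVIDIA does not spell out its counting method in the specification sheet.

```python
# Reproduce the quoted G92 throughput figures from the published clocks and
# unit counts (our own arithmetic; 2 FLOPs per core per clock is an assumption).
stream_processors = 128
shader_clock_mhz = 1688
flops_per_clock = 2                       # one multiply-add per core (assumed)
gflops = stream_processors * shader_clock_mhz * flops_per_clock / 1000
print(f"Shader throughput: {gflops:.0f} GigaFLOPS")          # ~432

texture_units = 64                        # eight per shader block, as noted above
core_clock_mhz = 675
gigatexels_per_sec = texture_units * core_clock_mhz / 1000
print(f"Peak bilinear fill rate: {gigatexels_per_sec:.1f} gigatexels/s")  # 43.2
```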
In our last section, we skimmed over the outer skin of the new GeForce 9800 GTX. With a basic understanding of what you'll get on the outside, we're ready to get inside the product and dissect the technology. This information will be very helpful for those hardware enthusiasts and overclockers willing to void their warranty and potentially ruin this expensive product in order to tweak its electronics. This information is for entertainment purposes only, and not a recommendation to disassemble your product or perform modifications.
As I prepared to disassemble the shiny new ZOTAC 9800 GTX, I made note of the similarities between this PCB and the 8800 GT. Aside from the dual SLI expansion slots, the design appeared very close to the last generation; if anything, there was very little PCB redesign needed on NVIDIA's part. Once I had carefully removed fourteen spring-loaded screws, three rearward set screws, and two plastic screws on the header panel, the ZOTAC GeForce 9800 GTX came apart with a light pull. The very first thing I noticed was the absurdly large over-application of Thermal Interface Material. My thoughts raced back two years into the past when overheating Apple MacBooks were in every headline. Just to entertain myself I went back and searched out some of those old images, and in the end none of them looked as bad as the generous TIM application depicted below.
Since Benchmark Reviews has just completed extensive testing of 33 different Thermal Interface Materials, the proper application of TIM is a topic we could discuss for days. If anything, the images above and below are perfect examples of how NOT to apply Thermal Interface Material. This much material is probably 10x the amount suggested, and I'm rather astonished that a high-end name like ZOTAC could let something like this slip through their hands. Which brings up an alarming thought...
...What if ZOTAC was following an NVIDIA-directed assembly specification, and the amount of TIM used was per instruction? If this were the case, then all GeForce 9800 GTX video cards could be affected. That's the bad news. The good news comes near the end of this article, where we learn that the 9800 GTX runs on the cool side of temperatures. On a different note, the heatsink design is remarkably similar to the sandwiched version inside the GeForce 9800 GX2. With a slightly lower profile and identical impressions, it doesn't take much imagination to envision a PCB secured to each side of this cooling unit.
Once the PCB received a good cleaning, the 9800 GTX began to look a lot more respectable. Solid-state capacitors and ferrite core chokes were surface mounted in the same fashion as we see Gigabyte tout in their Ultra Durable 2 design. If you're adventurous and decide to open up your own graphics card, which we do not recommend, then you should be careful around these capacitors. They may offer a longer product lifetime over standard capacitors, but they are also very easy to break away from the printed circuit board.

Cleaning up the mess of Thermal Interface Material required the use of Arctic Silver's ArctiClean solution, and a whole lot of it. Once the G92 was "uncovered", I could get a few decent pictures. If anyone's keeping track of NVIDIA manufacturing codes, here's the GPU inside the GeForce 9800 GTX, which bears a G92-420-A2 identification.
754 million transistors are etched through a 65nm process and all live inside the small one-inch-square G92 GPU. A 256-bit memory bus allows the ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX to offer 512 MB to the G92 GPU, and gamers understand the importance of a fast video frame buffer. A total of eight Samsung GDDR3 modules line the outer perimeter of the printed circuit board, bearing the Samsung 807 K4J52324QE-BJ08 part number. Hardware enthusiasts should note that these same vRAM modules were also used on late-edition GeForce 8800 Ultras.
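The memory layout is easy to sanity-check from the clock and bus figures quoted earlier in this review. The short sketch below is our own arithmetic, and it assumes each Samsung module is a 512 Mbit (16M x 32) GDDR3 part, which is what eight chips adding up to 512 MB on a 256-bit bus implies.

```python
# Sanity check of the 9800 GTX memory configuration (our own arithmetic).
chips = 8
bits_per_chip = 512 * 1024**2          # 512 Mbit per module (assumed 16M x 32 parts)
io_width_per_chip = 32                 # bits each chip contributes to the memory bus

frame_buffer_mb = chips * bits_per_chip / 8 / 1024**2
bus_width_bits = chips * io_width_per_chip
print(frame_buffer_mb, "MB frame buffer")       # 512.0 MB
print(bus_width_bits, "bit memory interface")   # 256-bit

# Peak bandwidth from the 1100 MHz (2200 MHz effective, double data rate) GDDR3 clock.
memory_clock_mhz = 1100
bandwidth_gbs = 2 * memory_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gbs:.1f} GB/s peak memory bandwidth")  # ~70.4 GB/s
```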
This concludes our in-depth look into the ZOTAC ZT-98XES2P-FSP, which has revealed several interesting discoveries about the hardware and the assembly process. The 9800 GTX is a good-looking graphics card, but from here on out this product will have to put up some impressive results or be put down. In our next section, Benchmark Reviews begins testing on the GeForce 9800 GTX video card, but only after we spend some time explaining how it's all done here in our lab.

Video Card Testing Methodology

Benchmark Reviews has high hopes that one day we will be so giant and world famous that I will have multiple combinations of the graphics cards available and on-hand for our product testing period. I envy the review sites (all three of them) that have twenty other video cards tested in stand-alone, SLI, and CrossFireX arrays for each and every review. Readers can help us grow to that size as they spread the word, but for now we'll have to make do with what our budget can afford. In this article, Benchmark Reviews is going to test and compare the ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX 512MB graphics card against several other closely-ranked products from within the GeForce family. At the start of all tests, the previous display adapter driver is uninstalled and trace components are removed using Driver Cleaner Pro. We then restart the computer system to establish our display settings and define the monitor. Once the hardware is prepared, we begin our testing. The synthetic benchmark tests in 3DMark06 utilize shader models 2.0 and 3.0. Every test is conducted at the following resolutions: 1600x1200 (20.1/21" standard LCDs), 1280x1024 (19" standard LCD), and 1024x768 (17" standard LCD). Each test program runs after a system restart, and the very first benchmark for every test is ignored since it often only caches the test. This process proved extremely important in the World in Conflict and Supreme Commander benchmarks, as the first run served to cache maps, allowing subsequent tests to perform much better than the first. Each test is completed five times, with the average results displayed in our article. Our website statistics indicate that over 90% of our visitors use their PC for playing video games, and nearly 70% of you are using one of the screen resolutions mentioned above. Since all of the benchmarks we use for testing represent different game engine technology and graphic rendering processes, I feel that this battery of tests will provide a diverse range of results for you to gauge performance on your own computer system. Since most gamers and enthusiasts are still using Windows XP, it was decided that DirectX9 would be used for all tests until game and driver support improve for Windows Vista.
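To illustrate the run-and-average process described above, here is a minimal sketch of how a harness might drive each benchmark: one throwaway warm-up pass to populate caches, followed by five recorded passes that are averaged. The run_benchmark function is a hypothetical stand-in for launching whichever test is being measured; it is not the tooling Benchmark Reviews actually uses.

```python
import random
import statistics

def run_benchmark(name: str, resolution: str) -> float:
    """Hypothetical stand-in for launching a real test; here it simply
    returns a simulated score so the harness logic can be demonstrated."""
    return random.gauss(4500, 25)

def measure(name: str, resolution: str, passes: int = 5) -> float:
    # The very first run only warms caches (maps, textures, shaders) and is
    # discarded, mirroring the behaviour we saw in World in Conflict and
    # Supreme Commander; the next five runs are averaged for the charts.
    run_benchmark(name, resolution)
    scores = [run_benchmark(name, resolution) for _ in range(passes)]
    return statistics.mean(scores)

for res in ("1024x768", "1280x1024", "1600x1200"):
    print(res, round(measure("3DMark06", res), 1))
```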
Test Products
The new 9800 GTX replaces the 8800 GTX from the last generation of GeForce products. This makes it critical that we include the former in our tests, but just as important are the other products which often kept pace with the GTX. Among the most respected is the GeForce 8800 GT, which shares the G92 GPU, and we're fortunate enough to have an overclocked ZOTAC AMP! Edition for that task. Just for good measure, Benchmark Reviews thought it would be interesting to have the 9800 GX2 added into our test results. The GeForce 8800 GTS is the direct competition for the GeForce 8800 GT 512MB video card. Although NVIDIA released a new 256-bit version of the card, the older 320-bit version (offered with either 640 or 320MB) is still the most widely used video card by PC gamers today. Note that the default GeForce 8800 GTS core clock is 500MHz, while the FOXCONN GeForce 8800 GTS 640MB has been carefully overclocked to 600MHz. The standard GeForce 8800 GTS vRAM speed is 800MHz and has also been overclocked to 1030MHz, while the shader clock remains at the standard 1200MHz.
Using the latest GPU-Z utility available for free from our affiliate website techPowerUp!, we verify manufacturers' stated specifications against the actual internal specifications. In regard to the ZOTAC GeForce 9800 GTX, it appears that all specifications match those stated by ZOTAC.

Benchmark Applications
Test System
3DMark06 Benchmark Results

3DMark is a computer benchmark by Futuremark (formerly Mad Onion) used to determine the DirectX 3D game performance of graphics cards. 3DMark06 uses advanced real-time 3D game workloads to measure PC performance using a suite of DirectX9 3D graphics tests, CPU tests, and 3D feature tests. 3DMark06 tests include all-new HDR/SM3.0 graphics tests, SM2.0 graphics tests, AI and physics driven single and multiple core or processor CPU tests, and a collection of comprehensive feature tests to reliably measure next-generation gaming performance today. Here at Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each test run, I believe 3DMark is a very reliable tool for comparing graphics cards against one another.
Low-resolution testing allows the graphics processor to plateau at maximum output performance, which thereby shifts demand onto the system components. At the lower resolutions 3DMark will reflect the GPU's top-end speed in the composite score, indicating full-throttle performance with little load. This makes for a less GPU-dependent test environment, and is helpful in measuring the maximum output performance in the test results. Using a 1024x768 resolution as a starting point, the maximum settings were applied, which for these tests includes 8x Anti-Aliasing and 16x Anisotropic Filtering. Without question the GeForce 9800 GX2 outperforms every other competitor by a wide margin, as it should for a video card housing two G92 GPUs. Since this was the first test we ran on the 9800 GTX, it came as a surprise to see the older 8800 GTX outperform it in both tests. With an SM 2.0 score of 4439, the 9800 GTX was outperformed by the 8800 GTX, which posted a score of 4528 for a 2% lead. The shader model 3.0 tests rendered a larger 9% advantage to the "outdated" 8800 GTX. Conversely, the ZOTAC GeForce 8800 GT 512MB AMP! Edition video card trailed behind the 9800 GTX by only 2% in SM 2.0 tests, and only 4% in the HDR/SM 3.0 tests.
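The percentage leads quoted throughout these results are simple relative differences between composite scores. As a short worked example using the SM 2.0 numbers above (our own arithmetic, expressing the lead relative to the lower score):

```python
# Relative lead of the 8800 GTX over the 9800 GTX in the SM 2.0 test.
score_8800_gtx, score_9800_gtx = 4528, 4439
lead = (score_8800_gtx - score_9800_gtx) / score_9800_gtx * 100
print(f"{lead:.1f}% lead")   # ~2.0%
```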
Bumping the GPU strain up a notch with the 1280x1024 resolution, the scores remain relatively comparable in terms of performance ratio. While Gigabyte's GeForce 9800 GX2 completely annihilates the competition with major GPU muscle, MSI's factory-overclocked GeForce 8800 GTX leads the remainder of the pack. The ZOTAC 8800 GT AMP! Edition maintains the same general performance ratio to the 9800 GTX as it displayed in the 1024x768 tests, but was largely outperformed by the 8800 GTX in the SM 3.0 tests. While the entire G92-based 9-series is PCI Express 2.0 compatible, the older G80-based GPUs are not. There doesn't seem to be any immediate advantage shown in our tests using the Gigabyte GA-X48T-DQ6 motherboard.
Finishing up the series of synthetic benchmark tests under heavy load, the overclocked MSI 8800 GTX narrowly defeats the ZOTAC 9800 GTX in shader model 2.0 testing. ZOTAC's 8800 GT AMP! Edition video card showed a prevailing strength against the aging GeForce 8800 GTS and was behind the GTXs by only 2%, but it tapered off in the more demanding SM 3.0 tests. In these 1600x1200 tests the AMP!'ed GT was outperformed by the 8800 GTX by 11% in the high dynamic-range / shader model 3.0 tests, and the 8800 GTX also prevailed over the 9800 GTX by almost 7%. Already, things are not looking good for the new GeForce 9800 GTX. If you take the 3DMark06 tests at face value, the 9800 GTX struggles to match the product it replaces or beat out the other competitors. It also begins to look like the overclocked (AMP!'ed) 8800 GT may give the new GTX a run for its money. But in our next section we begin real-world testing on a cadre of popular video games known for taxing the graphics processor, and this lineup might change. First up is Call of Duty 4, so please continue on...

Call of Duty 4 Benchmark Results

Call of Duty 4: Modern Warfare runs on a proprietary game engine that Infinity Ward based on the tried-and-true Q3 structure. This engine offers features such as true world-dynamic lighting, HDR lighting effects, dynamic shadows and depth of field. "Bullet Penetration" is calculated by the Infinity Ward COD4 game engine, taking into account things such as surface type and entity thickness. Certain objects, such as cars and some buildings, are destructible. This makes distinguishing cover from concealment important, as the meager protection provided by things such as wooden fences and thin walls does not fully shield players from harm as it does in many other games released during the same time period. Bullet speed and stopping power are decreased after penetrating an object, and this decrease is calculated realistically depending on the thickness and surface of the object penetrated. This version of the game also makes use of a dynamic physics engine, a feature which was not implemented in previous Call of Duty titles for Windows PCs. The new in-game death animations are a combination of pre-set static animations and ragdoll physics. Infinity Ward's use of the well-debugged Quake 3 engine along with the new dynamic physics implementation allows Call of Duty 4 to be playable by a wide range of computer hardware systems. The performance may be scaled from low-end graphics cards up to 4x Anti-Aliasing and 16x trilinear anisotropic texture filtering.
Before I discuss the results, I would like to take a moment to mention my general opinion on Fraps software when it comes to game performance benchmarking. If you're not familiar with the software, Fraps (derived from Frames per second) is a benchmarking, screen capture, and real-time video capture utility for DirectX and OpenGL applications. Some reviewers use this software to measure video game performance on their Windows system, as well as record gaming footage. My opinion is that it offers a valid third-party, unbiased alternative to in-game benchmarking tools; but there is one caveat: it's not perfect. Because the user must manually begin the test, the starting point may vary from run to run and therefore skew the results. In my testing with Fraps v2.9.4 build 7039, I used the cut-scene intro to the coup d'etat scene when Al Asad takes over control. First I allowed the level to load and let the scene begin for a few moments, then I would use the escape key to bring up the menu. Once I selected the restart level option, I would immediately press F11 to begin recording the benchmark data. This scene is nearly four minutes long, but I configured Fraps to record the first 180 seconds of it to remain consistent. Once the scene would end, I would repeat the restart process for a total of five tests. So within a 0.2-second starting point margin, all benchmark results are comparable, which is probably as good as it can possibly get with this tool. In our frame rate results, all five of the collected test scores were within 0.75 FPS of one another and then averaged for the chart you see above. Once the tests had been repeated and the results recorded, it was nice to see the new 9800 GTX finally outperform the older 8800 GTX in a benchmark. Of course the differences are negligible at best, and the AMP!'ed 8800 GT closely trails in performance, which also raises the question of value. One look at the chart and you can also see that no other single graphics card can come close to offering the same level of performance as the GeForce 9800 GX2. It's difficult to deny how much power the factory-overclocked ZOTAC GeForce 8800 GT 512MB AMP! Edition video card offers at a reasonable value. When performance is compared to the 8800 GTX and 9800 GTX, it's easy to understand why so many gamers and enthusiasts flocked to purchase this graphics card. Alternatively, when we tested the GeForce 9800 GX2 in Call of Duty 4 it didn't come as a surprise to us that its pair of G92 GPUs outperformed the other video cards with less brute force in their arsenal. In our next section, we shall see if the performance-demanding video game Crysis will help strengthen this position.

Crysis Benchmark Results

Crysis uses a new graphics engine: the CryENGINE2, which is the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX10) framework of Windows Vista, but can also run using DirectX9, both on Vista and Windows XP. Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multicore processor architectures, CPU-intensive subsystems of CryENGINE 2 such as physics, networking and sound have been re-written to support multi-threading. Crysis offers an in-game benchmark tool, which is similar to World in Conflict.
This short test places a high amount of stress on a graphics card, since there are so many landscape features rendered. For benchmarking purposes, Crysis can mean trouble as it places a high demand on both GPU and CPU resources.
Low-resolution testing allows the graphics processor to plateau at maximum output performance, which thereby shifts demand onto the system components. Even still, Crysis appears to have a preference for the new PCI Express 2.0 graphical interface found on G92 graphics processors, even if it only hints at it in our results. Even without Anti-Aliasing turned on, Crysis keeps the top four competitors around 60 FPS. It's clear that the CryENGINE2 is a heavy hitter, as the insanely overclocked ZOTAC 8800 GT AMP! Edition outperforms the GeForce 8800 GTS 640MB by over 26% and the 8800 GTX by almost 3%. The most surprising result of all is the 9800 GTX, which outperformed everything else at this resolution and leads the 8800 GTX by nearly 11%, the 9800 GX2 by 6%, and the 8800 GT by 8%. Even with more vRAM available to them, the older 8800 GTS and GTX just cannot offer the performance of the G92 GPU paired with the PCI Express 2.0 graphics bus. What comes as a surprise to me is how the single-G92 9800 GTX was able to out-perform the two G92 GPUs inside the GeForce 9800 GX2, which did not shine so brightly in this low-resolution test.
At the 1280x1024 resolution which mirrors what gamers using a 19" LCD monitor would experience, the results are still excellent but also beginning to near the 30 FPS threshold for acceptable performance. The aging G80-based GeForce 8800 GTS unit takes a pretty heavy hit, and drops clear out of the desired range. In terms of general performance, all of these products maintain the same performance ratio as before. This gives the 9800 GTX a 10% frame rate improvement over the 8800 GTX, but nowhere near the performance of the 9800 GX2 which is beginning to pull away from the pack with more than a 22% lead.
Surprisingly, the three GeForce 8800 series products maintained a rather constant performance ratio between one another throughout the Crysis benchmark tests. While the ZOTAC GeForce 9800 GTX narrowly outperformed the G80-based GTX by only one frame per second on average, the GeForce 9800 GX2 actually improved as the demand increased, resulting in a 35% gain. But in the end, these were all tests with no post-processing effects; and who buys a high-end GPU to play PC video games without AA anyway? Crysis offers a very wide range of settings beyond the basic "High" setting, which allows Anti-Aliasing steps of: No AA, 2x AA, 4x AA, 8x AA, 8x Q AA, 16x AA, and finally 16x Q AA. At the end of our real-world testing Crysis was given the GPU-thrashing 16x Q AA performance setting, and we watched the G92 youngsters run circles around the aging G80 generation. Most enthusiasts have lost sight of the fact that the G80 GPU doesn't offer the post-processing memory optimizations and compression available to the new G92 series. This functionality makes all the difference when AA is turned up. Thanks to the AA optimizations in the G92, ZOTAC's GeForce 8800 GT AMP! Edition outperformed both G80 GPUs by a very significant margin despite all of the previous tests indicating a much smaller disparity. It's obvious that the new 65nm G92 core architecture is to be credited, and perhaps the new PCI Express 2.0 interface which allows twice as much graphics data bandwidth also played its part. Then again, maybe MSI's factory-overclocked 8800 GTX isn't built to keep pace when the post-processing is maxed out. Either way, our benchmarks below certainly indicate that while the 8800 GTX matched the 9800 GTX in other tests without AA, it doesn't even come close in a high-pressure Crysis.
With an incredible performance difference of 40% launching the 9800 GTX over the 8800 GTX, the difference between generations becomes clear. Even ZOTAC's 8800 GT AMP! Edition video card boasts a 37% improvement over the more expensive 8800 GTX. Ultimate praise is due for the unfathomable performance exerted by the 9800 GX2, which snatched a 38% lead over the G92 GTX. While the extra load did show an impact on the performance results, all of the G92 video cards walked over the competition with an uncanny difference between them and the G80 versions. If you want to play Crysis with bells, whistles, and bagpipes, and you're not in a position to use an SLI array, then the 9800 GX2 is the obvious choice; but that ZOTAC 8800 GT AMP! Edition video card wins the price-to-performance nomination hands-down. EDITOR'S NOTE: After many months of using the Crysis demo for testing with the MadBoris Benchmark Tool, we recently started using the full retail version. Our initial tests have discovered that non-AA results were identical, but the 16x Q AA test produced very different test results. All of our previous results are still good for product comparison, but using the patched retail version of the video game (v1.21) has demonstrated that post-processing effects (offered up to 16x Q AA) were not fully incorporated into the demo (limited to 8x AA). We have decided to address this matter in our Crysis performance comparison: demo vs retail forum discussion and welcome your comments. In our next section, Benchmark Reviews switches to video-output-only benchmarking, and uses Lightsmark for an apples-to-apples comparison of performance.

Lightsmark Frame Rates

Stepan Hrbek is the mastermind behind Lightsmark 2007, a program that allows you to benchmark real-time global illumination. Natural lighting makes artificial graphics life-like and real. Computers get faster, but rendering more polygons doesn't add value if lighting still looks faked, so insiders know that the next big thing is proper lighting; aka real-time global illumination. Typical workloads in real-time rendering will shift, and Lightsmark simulates it. Global illumination renders often take hours, so is your computer fast enough for real-time? Before Lightsmark, real-time global illumination was limited to small scenes, small resolutions, small speeds, and specially crafted scenes with handmade optimizations. Lightsmark breaks all limits at once, running in a reasonably sized scene (220,000 triangles) at high resolutions and excellent speed.
Lighting is computed fully automatically in an original unmodified scene from the 2007 game World of Padman. This benchmark is not tweaked for Lightsmark, and contains all sorts of geometrical difficulties, with extra rooms hidden below the floor.
This scene places medium to low demands on graphics cards and tests the maximum speed with which the scene can be properly displayed at each resolution. Similar to the low resolution tests, Lightsmark doesn't favor the goliath 9800 GX2, or any particular GPU generation more than another. In fact, our GeForce 9800 GX2 was outperformed in every single Lightsmark test by the snappy AMP!'ed GT and 9800 GTX video cards.
After all of the Lightsmark tests were complete, I wasn't sure what to make of the results. Each test was performed with identical variables on the same day and in the same system. With the only differences being the time of day and the video card used, it's hard for me to understand how the GeForce 9800 GX2 was outperformed by the ZOTAC 9800 GTX, let alone the lightweight of the bunch. At any rate, the 9800 GTX prevailed across the board in our tests, with a near 11% lead over the factory-overclocked 8800 GTX. In the next section we utilize the lightweight Passmark 3D graphics test to compare our test group of video cards.

Passmark 3D Mark Results

3D graphics technology has come on in leaps and bounds over the last few years, and this test measures how fast 3D images can be created and displayed. Microsoft provides a set of Application Programming Interfaces (APIs) called DirectX, which allow developers to create games and other high-performance multimedia applications. DirectX provides support for two-dimensional (2-D) and three-dimensional (3-D) graphics, sound effects, music, input devices, and networked applications such as multiplayer games. The Advanced 3D Graphics Test has been designed to benchmark how well your video card performs when using the most common features of DirectX. It renders a number of spheres to the screen in windowed or full-screen mode. As such, Performance Test requires DirectX version 9 or above. Apart from individual graphics card speeds and abilities, the test illustrates a single video card's drop in performance as the rendered scene becomes more complex. A scene with more objects, more textures and more DirectX features implemented may well look more impressive, but will more than likely result in a reduction in frame rate.
While I love using Passmark's Performance Test suite to benchmark system memory because of its consistent results, I am not as impressed with the light-load graphics test it offers to video cards. Sure, it runs through the various 3D chores a computer might encounter, but these aren't nearly as demanding as what higher-end video games will require. The test does prove one theory I have developed: the G80 GPUs are optimized for higher-volume, low-demand graphical calculations, whereas the G92-based GPU in the GeForce 9800 GTX and 9800 GX2 video cards is designed to eat up high-demand calculations at a faster rate, thanks to NVIDIA's ROP compression system which has been enhanced to improve performance at extreme resolutions such as 2560 x 1600. It appears that the enhanced compression will help keep memory usage lower than previous G80 products, and help performance in high-resolution, antialiased scenarios. In the next section Benchmark Reviews gives a detailed look into GPU performance in Supreme Commander: Forged Alliance.

Supreme Commander: Forged Alliance Results

Supreme Commander: Forged Alliance is a standalone real-time strategy computer game expansion to Supreme Commander, developed by Gas Powered Games and published by THQ. Because it is a standalone expansion, it is possible to play without owning Supreme Commander. Forged Alliance adds new gameplay features, several new units for the three preexisting factions, and is further optimized for increased performance beyond that of the original game. Supreme Commander makes extensive use of two technologies relatively unused in video games prior to its release, namely multi-core processing and multi-monitor displays. When detecting a multi-core processor, the game assigns a specific task, such as AI calculations, to each core, splitting the load between them. Supreme Commander is one of the first games to specifically support dual- and quad-core processors.
Supreme Commander: Forged Alliance may not offer the first-person shooter experience that many gamers prefer, but the graphics are among the most demanding possible. Even so, a trend begins to show just how much demand is placed on the graphics card, as evidenced by the uniformly low minimum frame rates. The average frame rate showed that the G80-based 8800 GTX is still up for a fight, while performance was practically matched between the 8800 GT and the 9800 GTX. The only real stand-out was the GeForce 9800 GX2, which rendered the majority of frames much faster than the others.
Unlike World in Conflict, Supreme Commander: Forged Alliance does not use a short in-game benchmark to determine a score. In these tests, Supreme Commander plays an entire round of the game from start to finish and generates composite scores based on this lengthy test. This composite score is based on two factors: sim and render. The AMP!'ed 8800 GT couldn't catch the 8800 GTX, which also fell behind the 9800 GTX. Of course the 9800 GX2 bullied its way to the top of the SupComMark composite score. Supreme Commander proved to be a harsh gaming engine for video cards, and Crysis certainly applied some heavy pressure, but let's see how World in Conflict holds up against our performance testing in the next section.

World in Conflict Benchmark Results

The latest version of Massive's proprietary Masstech engine utilizes DX10 technology and features advanced lighting and physics effects, and allows for a full 360-degree range of camera control. Massive's Masstech engine scales down to accommodate a wide range of PC specifications; if you've played a modern PC game within the last two years, you'll be able to play World in Conflict. World in Conflict's FPS-like control scheme and 360-degree camera make its action-strategy gameplay accessible to strategy fans and fans of other genres... if you love strategy, you'll love World in Conflict. If you've never played strategy, World in Conflict is the strategy game to try. World in Conflict offers an in-game benchmark which records the minimum, average, and maximum frame rates during the test. Very recently another hardware review website made the assertion that these tests are worthless, but we couldn't disagree more. When used to compare video cards which are dependent on the same driver, the in-game benchmark works very well and the comparisons are apples-to-apples.
First tested was the 1024x768 resolution in WiC, which relates to gamers using a 17" LCD monitor. Based on the test results charted above, it's clear that WiC doesn't place a limit on the maximum frame rate (to conserve wasted power), which is good for full-spectrum benchmarks but bad for electricity bills. The critically important minimum frame rate results indicate a shared lead between the 8800 GT AMP! Edition and ZOTAC 9800 GTX video cards, which also carried over to higher average frame rates. A cautionary word about maximum frame rates is necessary, however. Although these readings are worth noting, the maximum frame rate is nearly worthless in determining GPU power. The reason for this is simple: those maximum frame rates are collected from scenes with little to no movement and practically no graphical processing demand. Obviously this shifts the importance over to the minimum frame rate, which will indicate how smooth the performance will remain under heavy demand.
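To make the distinction concrete, the snippet below computes minimum, average, and maximum frame rates from a run of per-frame render times. The frame-time values are invented for illustration; the point is that a single long frame drags the minimum down even when the average looks healthy, which is why we weight the minimum so heavily.

```python
# Per-frame render times in milliseconds (invented sample data); one 45 ms
# hitch among otherwise quick frames.
frame_times_ms = [16.7, 15.9, 17.2, 16.1, 45.0, 16.5, 15.8, 16.9, 16.3, 16.0]

fps = [1000.0 / t for t in frame_times_ms]                       # instantaneous frame rates
minimum = min(fps)
average = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)   # true average FPS
maximum = max(fps)

print(f"min {minimum:.1f} FPS, avg {average:.1f} FPS, max {maximum:.1f} FPS")
# The 45 ms stutter caps the minimum near 22 FPS even though the average sits near 52 FPS.
```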
With a balanced demand for CPU and GPU power, the 1280x1024 resolution proved to be the turning point for performance. MSI's overclocked 8800 GTX was beginning to gain its second wind, and recorded higher frame rates across the board than the GeForce 8800 GT and 9800 GTX that had previously beaten it. Also notice how the GeForce 9800 GX2 posts minimum frame rates not very much higher than the others and barely above the 30 FPS mark, which proves that even at moderate settings World in Conflict is still incredibly demanding.
At the highest graphics quality settings the World in Conflict Masstech engine begins to really strain all of the GeForce products. At 1600x1200 resolution, none of these video cards can deliver a 30 FPS minimum frame rate, which is a little discouraging. Taking a broader look at the average frame rate, the overclocked MSI 8800 GTX still proves that it has game and maintains a credible advantage over the GeForce 9800 GTX and 8800 GT in all tests. Conversely, the GeForce 8800 GTX is barely able to nose out the AMP!'ed 8800 GT, with a meager 2 FPS advantage in average frame rates. With mixed results rearing their ugly head again, World in Conflict seems to place the 8800 GTX ahead of the 8800 GT and 9800 GTX. The GX2 pulls well ahead of the pack in the more stressful tests, which was not the case in WiC's minimum frame rates. In the next section, Benchmark Reviews posts the operating temperature results and power consumption figures for the ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX 512MB video card.

GeForce 9800 GTX Temperatures

This section is probably the most popular for me, as a reviewer. Benchmark tests are always nice, so long as you care about comparing one product to another. But when you're an overclocker, or merely a hardware enthusiast who likes to tweak things on occasion, there's no substitute for good information. Benchmark Reviews has a very popular guide written on How To Overclock the NVIDIA GeForce 8800 Series, but it was published shortly after the 8th generation of GeForce products was launched. Currently we are preparing a more updated article, with additional information on shader overclocking and temperature control, as the newest 9th generation GeForce products are made available. Once published you can expect more detailed information than what is shown below; for now, the temperatures depicted are GPU core temperatures at idle and under load.
To begin my testing, I used ATITool v0.26 to record GPU temperatures at idle and again in high-power 3D mode. The ambient room temperature was a comfortable 19.0°C and the inner-case temperature hovered around 32°C. The ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX 512MB video card recorded 60°C in idle 2D mode, and increased to only 69°C in full 3D mode. Overall, this increase is negligible and it actually leaves a lot of room for potential overclocks. The most favored feature of the newly contoured design is the one-direction heat exhaust. Heated air recirculating inside the computer case is no longer an issue like it is with the 8800 GT or 9800 GX2, because the warm air is exhausted out of the case. While 69°C isn't terribly hot under full load, there's always room to make it operate cooler. But here's a little fact you probably didn't know (and won't read elsewhere): the G92 GPU is designed to operate safely up to its 105°C thermal threshold. What happens after that? Believe it or not, if the GPU exceeds this temperature the clock speed will automatically be dialed down. This "speed stepping" technology ties into our discussion on power consumption, which follows below.

9800 GTX Power Consumption

Planet Earth needs our help, badly. With forests becoming barren of vegetation and snow-capped poles quickly turning brown, the technology industry has a new attitude towards becoming "green". I'll spare you the marketing hype that I get to sift through from the various manufacturers and get right to the point: your CPU does a lot more to save the planet than your GPU does, but NVIDIA is working to change that. While current Intel central processing units use a power-efficient 45nm die process, the graphics processor is a bit slower to catch the technology curve and presently only shrinks to 65nm in the G92. NVIDIA states that the maximum TDP board power consumption for the GeForce 9800 GTX is 160 watts. Below is a chart with the total watts consumed by our specified test system:
In regard to power requirements, the GeForce 9800 GTX has the same hunger that haunted the older 8800 GTX and requires two 6-pin PCI-Express power connections for proper operation. NVIDIA has designed the G92 graphics processor to be an efficient cornerstone for the 9th generation of GeForce products, which does give it an inherent efficiency advantage, but we have heard little about proactive features like speed-step scaling for the GPU. The technology exists in the G92 architecture, yet it may be a while before we can tell whether NVIDIA's Enthusiast System Architecture can control this function. ESA is NVIDIA's first open-standard PC monitoring and control protocol for real-time communication and control of system thermal, electrical, acoustic, and operating characteristics. Ideally it would also be nice to have functionality like Gigabyte's Dynamic Energy Saver (DES) on a graphics card, so that power could be saved without interrupting performance when it is actually needed.
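To picture how temperature-driven speed stepping might behave in practice, here is a deliberately simplified sketch. The 105°C limit matches the thermal threshold discussed above and 675 MHz is the 9800 GTX reference core clock, but the 25 MHz step size and the control loop itself are inventions for illustration, not NVIDIA's actual algorithm.

    # Simplified illustration of temperature-based GPU speed stepping.
    # The 105 C limit and 675 MHz stock clock match figures discussed in this
    # review; the step size and control logic are invented for illustration.
    THERMAL_LIMIT_C = 105
    STOCK_CLOCK_MHZ = 675
    STEP_MHZ = 25

    def adjust_clock(clock_mhz, core_temp_c):
        """Dial the clock down above the thermal limit, recover toward stock."""
        if core_temp_c > THERMAL_LIMIT_C:
            return max(clock_mhz - STEP_MHZ, STOCK_CLOCK_MHZ // 2)
        return min(clock_mhz + STEP_MHZ, STOCK_CLOCK_MHZ)

    # Hypothetical temperature trace: load pushes past the limit, then cools.
    clock = STOCK_CLOCK_MHZ
    for temp in (92, 101, 106, 108, 107, 103, 98):
        clock = adjust_clock(clock, temp)
        print("%3d C -> %d MHz" % (temp, clock))

The point is simply that the card protects itself by trading clock speed for temperature, which is why the 36°C of headroom we measured between full load and the thermal limit matters to overclockers.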
Compared to MSI's factory-overclocked 8800 GTX, it should please you to learn that the ZOTAC ZT-98XES2P-FSP graphics card consumes 51 watts less power under high-power full 3D load. In fact, compared to our (extremely) overclocked G80-based GeForce 8800 GTS 640MB, the GeForce 9800 GTX consumes 3 watts less power under full load. At the other end of the spectrum, the GeForce 9800 GTX uses only 18 watts less than the 9800 GX2, which means that either the 9800 GTX isn't as energy efficient as it could be, or the 9800 GX2 is very efficient. Please continue to the review conclusion in the next section, where I share my thoughts on the ZOTAC ZT-98XES2P-FSP GeForce 9800 GTX and give my opinion of the new high-level offering.

GeForce 9800 GTX Final Thoughts

As of 1 April 2008, NVIDIA lists their top-to-bottom GeForce product line-up as follows below. With only the GeForce 8800 GTS follow-up product release scheduled in the next few months, you can expect the 9800 GTX to remain in the #2 position of this ranking for quite some time. It really surprises me to see some of the older items still clinging to the list, such as the 7300 GS, but every step in the ladder serves a purpose. This brings us to the purpose of the GeForce 9800 GTX.
So is the GeForce 9800 GTX worth the investment? Early rumors proved to be nonsense, with price estimates listing absurd amounts. Since the actual launch price starts around $299, however, the motivation isn't all that hard to muster. We haven't repeatedly mentioned the ZOTAC GeForce 8800 GT 512MB AMP! Edition video card in this article by mistake; since it presently sells for $259.99 at NewEgg, its price-to-performance ratio is almost the same as ZOTAC's vanilla GeForce 9800 GTX. The primary reason to justify the 9800 GTX isn't speed, especially since we've already shown that it's on par with older or less expensive offerings; instead, it's the functionality that makes all the difference:
Which leaves us with one final question to answer: is the 9800 GTX better than the version it replaces? This really depends on your needs and the hardware you already have, but in most cases the answer is going to be "yes". There are specific lessons to be learned from the G92 architecture, especially when compared to the older G80. The primary reason to support my answer has already proven itself; just take a look back at the Crysis benchmarks. If you're a low-demand gamer who doesn't use high-resolution displays or enable post-process effects such as anti-aliasing or anisotropic filtering, then the 9800 series is probably not your best investment. But as the tests show, whenever post-process effects were included at high resolution the 9800 GTX (and GX2) performed well ahead of the competition. NVIDIA's 9800 series products are squarely aimed at the upper high-end segment of performance gamers, and for most hard-core enthusiasts there is more than enough value to realistically afford three units for a triple-SLI array.
But even still, let's pretend you're already using a G80-based 8800 series graphics card, because there are some additional benefits worth considering. To begin with, you're probably only going to barely enjoy the latest video games at their lower post-processing settings, with AA and AF turned down or off. Beyond this, you may also be unprepared for that day in the near future when DirectX 10.1 (or the upcoming DX11) resides on your operating system. Finally, there's the potential for using this HDMI-ready solution for something other than video games, such as a home theater PC. There's a longer list of reasons to justify replacing an older G80 video card with the 9800 GTX, even beyond using it inside an HTPC for your home theater. Since the days of Battlefield 2 there haven't been very many games to seriously stress mid- and high-performance video cards. Battlefield 2142 was more of a lukewarm please-all with nearly no landscape to speak of, and until EA and Crytek GmbH came along with Crysis there hadn't been any major milestones to speak of for almost three years. Company of Heroes was (and to some players still is) one of the most popular games of 2006, but its scalable Havok game engine allowed just about anyone with a personal computer to play the game. World in Conflict could very well be characterized as the CoH of 2007, especially since CoH: Opposing Fronts offered almost nothing new to gamers in regard to performance. WiC is equally scalable, but the large world-scape can have a greater impact on frame rate. In 2008 it appears that the Quake 3-derived gaming engine in Call of Duty 4 is making headlines with superior game play and graphical delivery. When it comes down to PC video games, there are only a handful of titles that stand out more than those I have tested here in this review. The important message is that the GeForce 9800 GTX handles them all very well and delivers high frame rates across the board, right in step with its predecessor. If you're using a GeForce 8-series or older video card you may not be prepared for the future of PC video games, which is already moving into DirectX 10.1 and quickly tuning the mechanics of DirectX 11.

ZOTAC ZT-98XES2P-FSP Conclusion

I might be a little too easy to please when it comes to retail packaging and graphics. I like color, but at the same time I want excitement. ZOTAC already has an advantage in that their color preferences align with some of my particular favorites. Since that alone isn't enough to win me over, they are also very good at keeping the consumer informed by adding important product details and specifications to the packaging. The retail box offers an inviting design and attractive layout, along with some product data on the back. Like the other ZOTAC products we have reviewed here at Benchmark Reviews, there is an underlying sense that they are in tune with the visual attraction a consumer has to a product. When Benchmark Reviews tested the GeForce 9800 GX2, the boxed-up NVIDIA reference design was not incredibly appealing to me. Apparently I just needed to wait for the 9800 GTX design before I would see curves influence the product's appearance. While I never really considered the entire pre-G92 GeForce 8800 series to be very attractive as a whole, primarily because of the awkward half-covered products, the 9800 GTX has finished what was started.
Unlike the past generation of products, this GeForce video card does not offer LED lights as mere accents; they are included as a functional indication of hardware status. In the not-so-distant past I had to replace my GeForce 8800 GTX because an errant SATA cable swiped off one of the capacitors. At that moment, I felt that NVIDIA definitely should have done something more to protect the electronics on their product. Unlike the higher-end 8800 series GeForce products, the 9800 GTX does not expose any electronic components. NVIDIA has engineered the GeForce 9800 GTX to sustain above-average abuse, and since there are no exposed electronic components (with the exception of the back side of the PCB) there is very little chance that you'll have to RMA this product because it falls apart on you. The plastic shell covering the 9800 GTX will work very well in cramped environments where the video card will be in contact with cables and components, so long as it can fit.

In regard to performance and functionality, ZOTAC's GeForce 9800 GTX really does deserve its #2 position in NVIDIA's video card product lineup. Although I personally feel that the core, shader, and memory clocks could have been higher, the post-process compression truly does optimize the 512 MB of video frame buffer for high-resolution gaming. It doesn't come as a huge surprise that the GeForce 9800 GTX outperforms the older 8800 GTX in 1600x1200 Crysis with 16x Q AA by over 40%, since the G92 was built for intense gaming effects. If that wasn't enough, this video card comes ready to support full HDMI audio and video output for your high-definition copyright-protected material. ZOTAC's ZT-98XES2P-FSP GeForce 9800 GTX has launched with a NewEgg price of $177.99. Considering that their least expensive 8800 GTX costs $339.99, the new ZOTAC 9800 GTX is starting to become more of a bargain.

In conclusion, I feel that the ZOTAC GeForce 9800 GTX has a lot more to offer gamers and enthusiasts than we might first expect. For most video cards, functionality is measured in only one application: video games. However, in rare cases (this being one of them) the 9800 GTX can suit more than just one purpose. The ZOTAC GeForce 9800 GTX includes native HDMI video output and offers digital audio output through the attached S/PDIF audio cable, making this the closest thing to fully-functional native HDMI that NVIDIA offers. Collectively rated, the G92 graphics processor offers full HDMI audio and video output, nearly 40% performance improvement in Crysis over an already-overclocked 8800 GTX (when AA is maxed), excellent cooling improvements, and triple-SLI functionality. I won't dispute that the results we recorded in most benchmarks were right in line with those of the 8800 GTX when lower post-processing effects were used, but then again we are just now seeing high-demand video games reach the market with newly developed core gaming engines. While value is a relative subject, the performance and functionality appear to have some credence in relation to the product cost. If you're a gamer on a very tight budget, then the 8800 GT may be the best solution for you. But if you're considering DirectX 10 game play or you plan to use post-processing effects like anti-aliasing, the ZOTAC 9800 GTX is a great future-ready solution.

Pros:
+ Great AA/AF performance for hard core gamers

Cons:
- Excessive Thermal Interface Material between GPU and heatsink
- Expensive enthusiast product
- Lower clock speeds than other G92 products
- Large footprint requires full ATX form factor VGA space

Ratings:
Final Score: 8.85 out of 10.
Quality Recognition: Benchmark Reviews Silver Tachometer Award.

Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.