
How To Overclock the NVIDIA GeForce 8800 Series
Written by Olin Coles   
Sunday, 18 February 2007
Table of Contents:
- How To Overclock the NVIDIA GeForce 8800 Series
- Programming the BIOS
- Cooling Improvements

Overclocking the GPU

EDITOR'S NOTE: This instructional article is available as an archived reference. The updated replacement is titled Overclocking the NVIDIA GeForce Video Card and is now available. The new article will receive periodic updates and become part of a larger series focused on optimizing, tuning, and overclocking the computer system.

Overclocking can take on many forms, and the practice ranges from minor product improvements to a total re-engineering project that completely alters the hardware. In this article, I will concentrate my efforts on achieving the most gain with the least amount of effort.

Some industry voices have called overclocking a hobby, while others have compared it to product misuse. However, I believe that if you are reading this article, you are probably one of the many computer enthusiasts who believe it is perfectly acceptable to get something more out of a product without it costing more money. When I think about it, everyone enjoys getting something for nothing; it's human nature.

Additionally, it is human nature to blame someone else if something goes wrong. This is where I warn you, the reader of this article, that neither I nor Benchmark Reviews recommend that you overclock your video card. This article explains how I conducted these experiments on my own property, and the author of this article will not be responsible for damages or injury resulting from experiments you choose to conduct on your own property. If you read beyond this point, you are accepting responsibility for your own actions and hold Olin Coles and the staff of Benchmark Reviews harmless.

NVIDIA has recently released their GeForce 8800 Ultra video card, which is now the crown jewel of gaming performance. However, computer hardware and gaming enthusiasts have learned that it is really just an overclocked GeForce 8800 GTX. Although this guide teaches you how to overclock all NVIDIA video cards, I have selected the FOXCONN NVIDIA GeForce 8800 GTS as my test subject. Nearly all of the older GeForce 6 and 7 generation video cards may also be overclocked using the same methods discussed here. In this how-to article you will learn how to turn your GeForce 8800 GTX into an Ultra, or overclock any NVIDIA video card for better performance; and do it all for free!

 FOXCONN GeForce 8800 GTS Top View

Presently, the 640MB version of the GeForce 8800 GTS is the third-best video card available on the market, after the Ultra and GTX. Hardcore gamers and computer enthusiasts alike have already speculated on how the GTS could be made to perform at the same level as the GeForce 8800 GTX with some tweaking. Unfortunately, this just isn't possible because of hardware differences: the GTS ships with fewer stream processors and a narrower memory bus than the GTX. What is possible, though, is taking a great product and making it perform better; which is exactly what I did.

By default, this particular FOXCONN 8800 GTS operates with a 575MHz G80 GPU core clock speed, a 1188MHz shader clock, and a 900MHz (1800MHz effective) GDDR3 RAM speed. I will use the free overclocking utility ATITool to search out the best clock speeds and simulate heavy graphics loads to establish stability. I will then use NiBiTor, another free tool, to build a custom flashable video BIOS. Finally, I will use yet another free application, nvFlash, to program the new custom video BIOS onto the GeForce 8800 GTS.
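The doubled RAM figure in parentheses comes from GDDR3's double-data-rate signaling, which transfers data on both clock edges. A quick sketch illustrates the relationship (the helper function name is my own; the clock values are this card's stock settings from above):

```python
# GDDR3 moves data on both the rising and falling clock edges, so the
# effective (advertised) speed is twice the actual memory clock.
def effective_ddr_speed(mem_clock_mhz: int) -> int:
    """Return the effective data rate for double-data-rate memory."""
    return mem_clock_mhz * 2

# Stock clocks for this FOXCONN GeForce 8800 GTS:
core_mhz, shader_mhz, mem_mhz = 575, 1188, 900

print(effective_ddr_speed(mem_mhz))  # 900MHz GDDR3 -> 1800
```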

 ATITool 3D View

Sure, the name says "ATI" Tool, but the author has made this a great tool for both ATI and NVIDIA products for several years now. I have personally used many different overclocking tools in the past, but this has proven to be the best for all my needs. After a straightforward installation and reboot (to complete the installation of the driver-level service), I open ATITool and am confronted with a lot of options. Try not to be overwhelmed; the sheer number of settings could strike fear into the hearts of the inexperienced.

Getting Started

Before I begin, it should be noted that SLI sets should not be overclocked or tested together. Doing so may be possible, but the results will be inaccurate and usually much lower. It is best to remove one video card while overclocking and testing the other, and then vice versa. This allows each video card to reach its highest possible individual overclock settings.

The first screen displayed will be the only screen that is really needed. While the memory clock speed copies its adjustments across all three performance levels (2D, 3D low power, and 3D performance), the software allows the user to make individual speed settings to the GPU core clock speed. Initially, the default values are saved in the profile named "default", but as you make changes you may save and delete profiles as needed. For this project, I kept raising my speeds and saving them into a profile named "MAX".

 ATITool Artifact Scan

To begin the overclocking process, I start by raising the temperature of the GPU core by using the "Show 3D View" button to display a rotating fuzzy cube. It is critical that the video card attain the highest temperature possible prior to overclocking, because the results of a cold overclock may prove unstable during gaming conditions. After reaching the loaded operating temperature, I change to the "Scan for Artifacts" view by pressing the button. Although I could have used the "Find Max Core & Mem" buttons to have ATITool automatically work out the best settings, I choose to manually test each incremental improvement on my own.

Find the Best Speeds

Experience has taught me that overclocking the memory is the best starting point, since it has a very small impact on operating temperatures when compared to overclocking the GPU. I have also learned that clock speed improvements can be made in larger steps on the RAM (10MHz steps) than on the GPU (3MHz steps). It is recommended that each clock be adjusted in small increments, one after the other. Do not find the maximum RAM clock and then set out to find the best GPU clock; this will give skewed results at either end.
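The incremental search described above amounts to a simple loop: raise the clock one step, test for stability, and back off when artifacts appear. Here is a minimal sketch of that process; the `is_stable` callback stands in for ATITool's artifact scan and is purely hypothetical:

```python
def find_max_clock(default_mhz, step_mhz, is_stable):
    """Raise the clock in fixed steps until the stability test fails,
    then return the last known-good speed."""
    clock = default_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz          # this speed passed; save it to the profile
    return clock                   # highest speed that passed the scan

# Example: pretend artifacts first appear above 1060MHz on the RAM,
# stepping up in 10MHz increments from the 900MHz default.
max_ram = find_max_clock(900, 10, lambda mhz: mhz <= 1060)
print(max_ram)  # 1060
```

In practice each "step" means minutes of artifact scanning, which is why the RAM's larger 10MHz steps make it the faster clock to dial in first.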

In my testing, I found that this particular card could operate at a maximum RAM clock speed of 1060MHz (2120MHz) with the default GPU core clock. Alternatively, the maximum GPU clock could be raised up to 618MHz while maintaining the default RAM clock speed. However, the best combination of the two yielded a stable 600MHz GPU core clock (a 25MHz improvement) with a final RAM clock speed of 1030MHz (a 130MHz improvement). These settings were then saved to the profile I named "MAX", and tested for a ten-minute duration using the "Scan for Artifacts" function. After stability was successfully tested, I played some of my favorite video games for an hour to confirm real-world stability.
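Put as percentages, those final numbers work out to roughly a 4% core gain and a 14% memory gain over stock. The arithmetic is trivial but worth sketching (the function name is my own):

```python
def gain_percent(stock_mhz, oc_mhz):
    """Overclock headroom expressed as a percentage of the stock speed."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(gain_percent(575, 600), 1))   # core:   4.3
print(round(gain_percent(900, 1030), 1))  # memory: 14.4
```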

Once the maximum stable speeds for both GPU and RAM have been found and tested, it is time to make a big decision: do I keep using the ATITool software to overclock my video card, or should I program the new settings into the video card BIOS and make the changes permanent? Since I will eventually use this video card as part of an SLI set, I will need to flash the settings to the video BIOS of each card for best results.




Comments

InterestingSteve (2012-10-25): Right now I have my 8800 GTS clocked at 670MHz on the core, and 1030MHz on the memory. I hit about 86°C max with FurMark. Was yours not stable at those speeds? Mine has been rock solid through hours of games, even with a 350W power supply.

miguel (2013-12-12): Okay, I've been using my GeForce 8800 GTS without overclocking and it's decent, but would I fry it out if I overclocked it?

