How To Overclock the NVIDIA GeForce 8800 Series
Articles - Featured Guides
Written by Olin Coles
Sunday, 18 February 2007
Overclocking the GPU

EDITORS NOTE: This instructional article is available as an archived reference. The updated replacement is titled: Overclocking the NVIDIA GeForce Video Card and is now available. The new article will receive periodic updates and become part of a larger series focused on optimizing, tuning, and overclocking the computer system.

Overclocking can take on many forms, and the practice can range from minor product improvements to a total re-engineering project which completely alters the hardware. In this article, I will concentrate my efforts on achieving the most gain with the least amount of effort. Some industry voices have called overclocking a hobby, while others have compared it to product misuse. However, I believe that if you are reading this article, you are probably one of the many computer enthusiasts who believe it is perfectly acceptable to get something more out of a product without it costing more money. When I think about it, everyone enjoys getting something for nothing; it's human nature. Additionally, it is human nature to blame someone else if something goes wrong.

This is where I warn you, the reader of this article, that neither I nor benchmarkreviews.com recommend that you overclock your video card. This article explains how I conducted these experiments on my own property. Benchmarkreviews.com and the author of this article will not be responsible for damages or injury resulting from experiments you choose to conduct on your own property. If you read beyond this point, you are accepting responsibility for your own actions and hold Olin Coles and the staff of benchmarkreviews.com harmless.

NVIDIA has recently released their GeForce 8800 Ultra video card, which is now the crown jewel in gaming performance. However, computer hardware and gaming enthusiasts have learned that it is really just an overclocked GeForce 8800 GTX.
Although this guide teaches you how to overclock all NVIDIA video cards, I have selected the FOXCONN NVIDIA GeForce 8800 GTS as my test subject. Nearly all of the older GeForce 6 and 7 generation video cards may also be overclocked using the same methods discussed here. In this how-to article you will learn how to turn your GeForce 8800 GTX into an Ultra, or overclock any NVIDIA video card for better performance; and do it all for free!

Presently, the 640MB version of the GeForce 8800 GTS is the third-best video card available on the market, after the Ultra and GTX. Hardcore gamers and computer enthusiasts alike have already speculated on how the GTS could be made to perform at the same level as the GeForce 8800 GTX with some tweaking. Unfortunately, this just isn't possible because of the architecture. What is possible, though, is taking a great product and making it perform better; which is exactly what I did. By default, this particular FOXCONN 8800 GTS operates with a 575MHz G80 GPU core clock speed, a 1188MHz shader clock, and a 900MHz (1800MHz effective) GDDR3 RAM speed.

I will utilize the free overclocking utility ATITool to search out the best clock speeds and simulate heavy graphics loading to establish stability. I will then make use of NiBiTor, another free tool, to program a custom flashable video BIOS. After creating the new custom video BIOS with NiBiTor, I will use yet another free application, nvFlash, to program the new custom video BIOS onto the GeForce 8800 GTS.

Sure, the name says "ATI" Tool, but the author has made this a great tool for both ATI and NVIDIA products for several years now. I have personally used many different tools for overclocking in the past, but this has proven to be the best tool for all my needs. After a straightforward installation and reboot (to complete the installation of the driver-level service), I open ATITool and see a lot of options confronting me.
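A quick note on the memory numbers quoted above: GDDR3 transfers data on both the rising and falling clock edges, which is why a 900MHz memory clock is advertised as 1800MHz. A minimal sketch in Python (the clock values come from this card; the helper function is my own):

```python
def effective_gddr3_rate(mem_clock_mhz: int) -> int:
    """GDDR3 is double data rate: two transfers per clock cycle."""
    return mem_clock_mhz * 2

# Stock memory clock for this FOXCONN 8800 GTS
print(effective_gddr3_rate(900))  # 1800
```

The same doubling applies to any overclocked memory speed, which is why vendors quote both figures.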
Try not to be overwhelmed, since all of these options could strike fear into the hearts of the inexperienced.

Getting Started

Before I begin, it should be noted that SLi sets should not be overclocked or tested together. Doing so may be possible, but the results will be very inaccurate and usually much lower. It is best to remove one video card while overclocking and testing the other, and then vice versa. This allows each video card to achieve the highest possible individual overclock settings.

The first screen displayed will be the only screen that is really needed. While the memory clock speed copies its adjustments across all three performance levels (2D, 3D low power, and 3D performance), the software allows the user to make individual speed settings for the GPU core clock. Initially, the default values are saved in the profile named "default", but as you make changes you may save and delete profiles as needed. For this project, I kept raising my speeds and saving them into a profile named "MAX".

To begin the overclocking process, I start by raising the temperature of the GPU core by using the "Show 3D View" button to display a rotating fuzzy cube. It is critical that the video card attain the highest temperature possible prior to overclocking, because the results of a cold overclock may prove unstable during gaming conditions. After reaching the loaded operating temperature, I change to the "Scan for Artifacts" view by pressing the button. Although I could have used the "Find Max Core & Mem" buttons to have ATITool automatically work out the best settings, I choose to manually test each incremental improvement on my own.

Find the Best Speeds

Experience has taught me that overclocking the memory is the best starting point, since it has a very small impact on operating temperatures when compared to overclocking the GPU.
I have also learned that clock speed improvements can be made in larger steps on RAM (10MHz steps) than they can be made on the GPU (3MHz steps). It is recommended that each clock be adjusted in small increments, one after the other. Do not find the maximum RAM clock and then set out to find the best GPU clock; this will give skewed results at either end. In my testing, I found that this particular card could operate at a maximum RAM clock speed of 1060MHz (2120MHz) with the default GPU core clock. Alternatively, the maximum GPU clock could be raised up to 618MHz while maintaining the default RAM clock speed. However, the best combination of the two yielded a stable 600MHz GPU core clock (a 25MHz improvement) with a final RAM clock speed of 1030MHz (a 130MHz improvement). These settings were then saved to the profile I named "MAX", and tested for a ten-minute duration using the "Scan for Artifacts" function. After stability was successfully tested, I played some of my favorite video games for an hour to confirm real-world stability.

Once the maximum stable speeds for both GPU and RAM have been found and tested, it is time to make a big decision: do I keep using the ATITool software to overclock my video card, or should I program the new settings into the video card BIOS and make the changes permanent? Since I will eventually use this video card as part of an SLi set, I will need to flash the settings to the video BIOS of each card for best results.

Discuss this item in the Benchmark Reviews Forum...

Programming the BIOS

Welcome to page two of my guide. On the first page, I discovered the maximum potential of my video card.
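The incremental search described on this page boils down to a simple loop: raise a clock one step, stress test, and keep the last speed that passed. Here is a Python sketch of that procedure; the step sizes (10MHz for RAM, 3MHz for the GPU core) come from this article, while the `is_stable` check is a hypothetical stand-in for a real stress test such as ATITool's "Scan for Artifacts" run at each candidate speed:

```python
def find_max_clock(start_mhz, step_mhz, is_stable):
    """Raise the clock one step at a time, keeping the last stable value.

    is_stable(mhz) stands in for a real stress test (e.g. ATITool's
    artifact scan) run at the candidate speed; it must return True
    only while the card renders without artifacts or crashes.
    """
    clock = start_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Hypothetical example: pretend this card artifacts above 1060MHz RAM
max_ram = find_max_clock(900, 10, lambda mhz: mhz <= 1060)
print(max_ram)  # 1060
```

In practice each "step" takes minutes rather than microseconds, which is why the automated "Find Max Core & Mem" buttons exist; manual stepping simply gives you more control over how long each candidate speed is tested.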
Since I have already tested my overclocking results with both artificial loading and real-world usage, it is now safe enough for me to modify the video card BIOS file with my new overclocked settings and make the FOXCONN 8800 GTS operate with enhanced performance without additional software. This will make the card identical to factory-overclocked versions.

Unfortunately, part of this process requires that the system boot into MS-DOS from a 3.5" floppy disk drive, which I normally don't have installed in my computer because it is considered obsolete legacy hardware. The USB flash drive and recordable optical drives have proven themselves very good replacements, so locating a 3.5" floppy disk drive may only become more difficult in the future. For the remainder of this project, a bootable USB flash drive or a properly created bootable CD/DVD could have been substituted, but I chose to avoid reinventing the wheel and retained the use of a spare floppy disk drive to complete this project.

To begin the (simple) process of creating a custom BIOS file, I utilized the free NiBiTor program to save a copy of my original BIOS. There are two steps for this process:
Save a backup

I should only continue after creating a backup of the original video BIOS, since this file will allow the opportunity to return my settings back to the factory defaults if it is ever required. It is highly recommended that this backup BIOS file be copied and renamed so there will be both a working copy and the original backup file available. I named the original BIOS file "BACKUP.ROM", and then created a copy of it which I named "8800GTS2.ROM". Additionally, the good people at MVKTech who helped build NiBiTor have also created a BIOS repository, where they kindly request that you upload your original BIOS file. This could be useful later if you happen to misplace your backup, or want to see what other manufacturers have done to tweak their BIOS configurations.

Up to this point, I have saved my working copy of the video BIOS "8800GTS2.ROM" onto one floppy disk. I will then save the nvFlash program (which is available for free download) onto this same floppy disk. On a second floppy disk I will use Windows XP to format and create an MS-DOS startup disk. It may not be required to split the project files and nvFlash from the MS-DOS startup disk because of file sizes and available space, but this is a safe practice which also decreases the chance of possible media problems.

Now that a backup of the video BIOS has been copied and stored for safekeeping, the next step of reading the working file and making changes begins with: File → Open BIOS (choose the renamed copy "8800GTS2.ROM" of the video BIOS here).

Once again, the novice could become very concerned about the many tabs and options available in NiBiTor, but for my purposes I will only use the Clockrates tab to change the values for 3D speeds. Using the values discovered and tested to be stable in my previous steps, I apply these values to the appropriate 3D fields, replacing the original values.
Once I have typed in the new Core and Memory values, I save the modified video BIOS file by choosing: File → Save BIOS (save this modified file onto a formatted floppy disk and name it something simple with fewer than eight characters, such as "8800GTS2.ROM").

Flash the BIOS with nvFlash v5.40

Now that I have prepared the new modified video BIOS file and saved it to a floppy disk, I am ready to make my video card operate as if it came from the factory with my new settings. I will now flash my working copy of the modified video BIOS "8800GTS2.ROM" onto the FOXCONN 8800 GTS using nvFlash.

Flash the new BIOS

Flashing a video BIOS is a very simple process; yet extra care and precaution must be taken, or the hardware being flashed may be rendered non-operational. I have taken steps to ensure my computer system's stability will not be compromised by removing any system component overclocking (CPU, RAM, bus speeds), and have placed my system on a 1500VA UPS battery backup. With stability ensured, I am ready to move forward and flash the video BIOS by booting from the MS-DOS startup disk and running nvFlash against the modified BIOS file.
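A typical nvFlash session from the MS-DOS boot disk looks something like the sketch below. The exact switches vary between nvFlash versions and I am quoting them from memory, so treat every flag here as an assumption and verify it against the readme text file bundled with your nvFlash download before running anything:

```
REM Boot from the MS-DOS startup disk, then swap to the floppy
REM holding nvFlash and the modified BIOS file.

REM Save one more copy of the card's current BIOS before flashing
REM (switch is illustrative -- confirm it in your version's readme)
nvflash -b backup.rom

REM Program the modified BIOS onto the card, then reboot
nvflash 8800gts2.rom
```

If the flash is interrupted by a power loss or a crash, the card may no longer POST, which is exactly why the UPS and the removal of all other overclocking matter at this step.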
That's it! The hard part is done. Once the system reboots after the successful BIOS flash, the video card is programmed with the new enhanced performance settings. But am I finished?

Cooling Improvements

With all this new power comes increased heat output, which is something the GeForce 8800 series already knows plenty about. The GeForce 8800 GTS already runs close to 90° C under full load, so I have taken an extra step to make sure my temperatures don't turn this product into a personal space heater. Using the free NVIDIA nTune Performance Application, I can adjust the fan speed from the default 60% output up to the desired 100% output. Alternatively, I could use NiBiTor to set the blower fan to 100% output full time. However, since the added fan noise might become a problem when I am not playing video games, nTune may be preferred.

Don't be fooled

A closer look at the nTune utility will reveal that it offers the opportunity to overclock the video card through the GPU clock settings interface; but I had to discover the hard way that this is a very unstable and unsafe method which always resulted in system crashes. I have since avoided every feature offered in this utility except the GPU fan settings feature.

If I could find a better program with a smaller footprint which would enable me to manually adjust 8800 series blower fan speeds, I would be using it. But since this is the only one I am aware of, it is a necessary evil. EDIT: Some readers have pointed out that RivaTuner has the same fan control ability.
While I have confirmed this function on GeForce 6 and 7 series video cards, I have not tested this software with the GeForce 8 series. It was mentioned that the author of RivaTuner is planning an updated release in the future. With the NVIDIA nTune utility, I can manually raise (or lower) the fan controller output. The blower fan on the GeForce 8800 series is relatively quiet at the default 60% output, but it gets humming when set up to 100% output, which means that noise may become a concern.

Conclusion

For the cost of the product and about an hour of time spent, I was able to take my FOXCONN 8800 GTS and overclock the G80 GPU up to 600MHz (575MHz default) along with a GDDR3 RAM speed increase up to 1030MHz (900MHz default). This amounts to a 25MHz GPU improvement and a 130MHz RAM improvement; and it was all free! Sure, these results did not transform my 8800 GTS into a GTX, but as I mentioned before, that is just plain impossible because of the architecture. When all was said and done, I did make enough of an improvement to keep the 8800 GTS more competitive with the 8800 GTX. My video card was already a product pushed close to the limit from the factory, and now I have it operating and performing as well as it can.

Now just imagine what could be done with the GeForce 8800 GTX! The overpriced GeForce 8800 Ultra has just been released, and it seems like NVIDIA has opened the floodgates to overclockers wanting to turn their GTX into an Ultra. It sure seems possible; all it would take is following these directions and a little time. With this in mind, there could very well be more performance available for the taking in other NVIDIA video cards. Just remember, what you do with your property is your own business. At least now you know how I did it with mine.
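For perspective on those final numbers, the gains work out as follows (a quick calculation of my own, using the stock and overclocked speeds reported in this article):

```python
def percent_gain(stock_mhz, overclocked_mhz):
    """Percentage improvement of an overclocked speed over stock."""
    return (overclocked_mhz - stock_mhz) / stock_mhz * 100

print(f"GPU core:  {percent_gain(575, 600):.1f}%")   # 4.3%
print(f"GDDR3 RAM: {percent_gain(900, 1030):.1f}%")  # 14.4%
```

A roughly 4% core and 14% memory gain will not close the gap to a GTX, but it is a measurable, free improvement on a card that shipped near its limits.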
Comments
Was yours not stable at those speeds? Mine has been rock solid through hours of games, even with a 350W power supply.