Just over a month after NVIDIA stole the show with the introduction of their G92-based 8800GT, they have re-introduced the 8800GTS, now also based on the G92, to up the ante with ATI once again. Today, we will take the 8800GTS offerings from Asus and Inno3D for a spin and see how much of a difference the 16 additional Unified Shader Processors make.
NVIDIA has a pretty anal way of naming their graphics accelerators, you’ve got to admit. First, they dump out a dung load of numbers for every series, and then they add a couple of letters behind those numbers to confuse you. Just look at the existing GeForce 8 series. There’s already the 8400, 8500, 8600, 8800 and the rumoured 8900 on the list, with more to come as NVIDIA milks the old cow dry. NVIDIA knows that numbers alone don’t cut it for being a pain in the ass, so they tack nonsensical letters onto the numbers to irritate consumers. Think GS, GT, GTS — suffixes that don’t ring a bell even in that vast acronym dictionary of yours. For those with deeper pockets, NVIDIA makes it easier to remember their range: the largest number is married to a corny "Ultra" right at the top.
Why put out a paragraph of rant right on a beautiful first page? It’s got nothing to do with a review, does it? In truth, the 8800GTS was already reviewed on our pages many moons ago. This is not a re-review of the same item, but rather a look at a brand new offering from the graphical acceleration giant. You geeks have that wondrous geek brain of yours to thank for remembering every card in the GeForce series, and a couple of FireGLs and Radeons besides. The 8800GTS we have with us today isn’t that G80 with a 320/640MB buffer you used to know. This 8800GTS is based on the all-new G92 graphics chipset that made its debut only a while ago in the guise of the 8800GT. You could call the G92 8800GTS a beefed-up 8800GT, or you could look at the 8800GT as a castrated G92 8800GTS. Either way, the two 8800s are very similar, sharing more than just the graphics chipset.
We received 8800GTS cards from both Asus and Inno3D!
The G92 is the result of what the industry calls a "die-shrink." You don’t have to mourn over a die-shrink, because it usually means that things are getting better. Die-shrinks have had their fair share of eventful history and are generally known for a number of improvements, assuming that nothing goes terribly wrong in the shrinking process. You know the QX9650 was a good shrink because it clocks better than the QX6850 and consumes less power. You know Prescott was a bad shrink because it ran hotter than a Northwood and killed many motherboards meant for the latter. For graphics chipsets, as for CPUs, a die-shrink usually means a combination of things: lower power consumption, less heat produced, a smaller core size, room to squeeze more complex circuitry into the same-sized device, and faster-switching transistors that translate to higher clockspeeds. And since a single wafer can now pack in more chips, it also means lower cost per chip, so kudos to NVIDIA for the effort.
The new G92 is built on a 65nm process, just like the Core 2 Duo Conroe CPUs on the market right now. You’ll see the 8800GTS in 512MB versions because changes have been made to the memory controller onboard the G92 chipset. The memory now runs on a 256-bit bus, unlike the 320/384-bit memory bus of the G80, so you won’t see the 8800GT and 8800GTS offered with 320/384MB or 640/768MB video buffers. The G92 on an 8800GTS has 128 Stream Processors, 16 (possibly one cluster) more than the 8800GT. The display chipset has also been embedded into the G92 silicon, so there’s no need for a stray mBGA device to create the DVI outputs. PCIe 2.0 is supported on the 8800GTS, as is the new video processing engine for H.264 decoding. High-definition video alongside high-definition gaming seems to be what the 8800GTS is out to satisfy. With LCD prices on the cheap, HD is indeed the way to go for the most immersive experience.
For today, we’ll pit the two G92 brothers against each other in the arena using the Forceware 169.06 drivers. The prizes for the winner? Umm, we’ll talk about that later!