nVidia GeForce 8800 GT video card: specifications, comparison with competitors, and reviews
The advent of the 8800 GTX was a landmark event in the history of 3D graphics. It was the first card to support DirectX 10 and its associated unified shader model, which significantly improved image quality over previous generations, and in terms of performance it remained without competitors for a long time. Unfortunately, all that power came at a correspondingly high cost. With competition from ATI expected and cheaper mid-range models based on the same technology on the way, the GTX was regarded as a card aimed only at enthusiasts who wanted to be at the cutting edge of modern graphics processing.
To remedy this situation, nVidia released the GTS 640MB card from the same line a month later, followed a couple of months after that by the GTS 320MB. Both offered performance close to the GTX at a much more reasonable price. However, at around $300-350, they were still too expensive for gamers on a limited budget: these were not mid-range but high-end models. Looking back, it can be said that the GTS cards were worth every cent, since what followed them during the rest of 2007 brought one disappointment after another.
First came the supposedly mid-range 8600 GTS and GT, heavily cut-down versions of the 8800 series. They were smaller and quieter and had new HD video processing features, but their performance was lower than expected, so buying them made little sense even though they were relatively inexpensive. The alternative, ATI's Radeon HD 2900 XT, matched the GTS 640MB in speed, but under load it consumed a tremendous amount of power and was too expensive for the mid-range. Finally, ATI tried to launch a DX10 series in the form of the HD 2600 XT and Pro, whose multimedia capabilities were even better than those of the nVidia 8600, but they lacked the power to win over gamers who had already bought previous-generation video cards such as the 7900 GS.
And now, a year after the 8800 GTX went on sale, the release of the 8800 GT brought the first true update to the DirectX 10 line-up. It took a long time, but the nVidia GeForce 8800 GT offered GTS-level specifications at a price of $200-250, finally reaching the mid-range price bracket everyone had been waiting for. But what made the card so special?
More does not mean better
As technology advances and transistor counts in CPUs and GPUs grow, there is a natural push to shrink their size. This leads to lower power consumption, which in turn means less heat. More chips fit on a single silicon wafer, which reduces their cost and, in theory, lowers the price of the hardware built from them. However, changing production processes carries high business risks, so it is customary to launch a completely new architecture on existing, proven technology, as was the case with the 8800 GTX and HD 2900 XT. As the architecture matures, a transition is made to a less power-hungry process, on which a new design is then based in turn.
The 8800 series followed this path: the G80 cores in the GTX and GTS were produced on a 90 nm process, while the nVidia GeForce 8800 GT is based on the G92 chip, built on a 65 nm process. Although the change does not look dramatic, it equates to a 34% reduction in die size, or a 34% increase in the number of chips per silicon wafer. As a result, the electronic components become smaller, cheaper, and more economical, which is an extremely positive change. However, the G92 core has not simply shrunk; there is something else to it.
First of all, the VP2 video processing engine used in the 8600 series now appears in the GeForce 8800 GT 512MB, so high-definition video can be enjoyed without bogging down the system. The display engine, which on the 8800 GTX is handled by a separate chip, is also integrated into the G92. As a result, the chip carries 73 million more transistors than the 8800 GTX (754 million versus 681 million), even though it has fewer stream processors, less texture processing capacity, and fewer ROPs than the more powerful model.
The new version of nVidia's transparency anti-aliasing algorithm, added to the GeForce 8800 GT's arsenal, is designed to noticeably improve image quality while maintaining high performance. Beyond that, the new processor adds no new graphics capabilities.
The company had apparently been thinking for a long time about which features of the previous 8800-series cards were underused and could be cut, and which should stay. The result is a GPU design that sits somewhere between the GTX and the GTS in performance, but with GTS-class functionality. As a consequence, the 8800 GTS card has become completely redundant. The 8800 Ultra and GTX still deliver higher graphics performance, but with fewer features, at a much higher price, and with higher power consumption. Against this background, the GeForce 8800 GT 512MB really took a strong position.
The GeForce 8800 GT uses the same unified architecture that nVidia introduced with the G80. The G92 consists of 754 million transistors and is manufactured on TSMC's 65 nm process. The die measures about 330 mm², and although it is noticeably smaller than the G80, it is still far from a small piece of silicon. In total there are 112 scalar stream cores, which run at 1500 MHz in the standard configuration. They are grouped into 7 clusters, each with 16 stream processors sharing 8 texture address units, 8 texture filtering units, and an independent cache. This is the same shader-cluster configuration nVidia used in the G84 and G86 chips, but the G92 is a far more complex GPU than either of them.
Each shader processor can issue two instructions per clock, a MADD and a MUL, and the units handle all shader operations and calculations in both integer and floating-point form. Curiously, although the stream processors are the same as the G80's (apart from their number and frequency), nVidia rates the chip at up to 336 GFLOPS, whereas counting both the MADD and the MUL would give 504 GFLOPS. As it turns out, the manufacturer took a conservative approach to quoting computing power and did not count the MUL in the overall figure. At briefings and round tables, nVidia representatives said that architectural improvements should let the chip approach its theoretical maximum throughput. In particular, the task manager, which distributes and balances the data coming down the pipeline, has been improved. nVidia has announced double-precision support for future GPUs, but this chip only emulates it, owing to the need to follow IEEE standards.
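Both figures follow directly from the unit counts quoted above; a minimal sketch, counting a MADD as 2 flops and the co-issued MUL as 1:

```python
# Peak shader throughput of the 8800 GT, from the figures in the text:
# 112 stream processors at 1500 MHz, each able to issue a MADD (2 flops)
# plus a MUL (1 flop) per clock.
def peak_gflops(cores: int, clock_mhz: float, flops_per_clock: int) -> float:
    """Peak arithmetic throughput in GFLOPS."""
    return cores * clock_mhz * flops_per_clock / 1000.0

madd_only = peak_gflops(112, 1500, 2)  # nVidia's conservative figure: 336.0
madd_mul = peak_gflops(112, 1500, 3)   # counting the MUL as well: 504.0
print(madd_only, madd_mul)
```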
The ROP structure in the G92 is similar to that of any other eighth-generation GeForce GPU: each partition has its own second-level cache and is tied to a 64-bit memory channel. In total there are 4 ROP partitions and a 256-bit memory interface. Each partition can process 4 pixels per clock when each pixel is defined by four parameters (RGB color and Z); when only the Z component is present, each partition can handle 32 pixels per clock.
The ROPs support all the common anti-aliasing formats used in previous eighth-generation GeForce GPUs. Since the chip has a 256-bit GDDR interface, nVidia made some improvements to ROP compression efficiency to reduce bandwidth and graphics memory usage with anti-aliasing enabled at 1600x1200 and 1920x1200 resolutions.
As in the original G80 architecture, the texture address and filtering units, as well as the ROP partitions, run at a clock speed different from that of the stream processors; nVidia calls this the core clock. For the GeForce 8800 GT, that core clock is 600 MHz. In theory this gives a fill rate of 9.6 gigapixels per second (Gp/s) and a bilinear texture fill rate of 33.6 gigatexels per second (GT/s). According to users, the clock frequency is quite low, and the increase in transistor count does not guarantee that functionality was added or even preserved. When the company moved from 110 nm to 90 nm technology, optimization cut the transistor count by 10%, so it would not be surprising if at least 16 stream processors on this chip are disabled in this product.
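Those two headline rates can be reconstructed from the counts given earlier: 4 ROP partitions at 4 pixels per clock, and 7 shader clusters with 8 filtering units each, all at the 600 MHz core clock. A quick sketch:

```python
CORE_MHZ = 600                     # GeForce 8800 GT core clock

rop_pixels_per_clock = 4 * 4       # 4 ROP partitions, 4 pixels each
texture_filter_units = 7 * 8       # 7 shader clusters, 8 filtering units each

pixel_fill = rop_pixels_per_clock * CORE_MHZ / 1000  # gigapixels/s
texel_fill = texture_filter_units * CORE_MHZ / 1000  # bilinear gigatexels/s
print(pixel_fill, texel_fill)      # 9.6 and 33.6
```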
The reference design runs the core, shader unit, and memory at 600 MHz, 1500 MHz, and 1800 MHz respectively. The 8800 GT has a single-slot cooling system, and the glossy black metal casing almost completely hides the front side of the board. The 50 mm fan matches the design of the radial coolers on the top models and performs its duties very quietly in all operating modes. Whether the computer is idling at the Windows desktop or running a favorite game, it is barely audible over the other noise sources in a PC case. It is worth noting, though, that the first time you power on a computer with the new video card, it can give you a fright: the fan howls while the graphics processor spins up at full power, but the noise subsides before the desktop even appears.
The metal front panel attracts fingerprints, but this is of little concern, since they will be invisible once the card is installed. According to user feedback, the cover also helps protect components on the front of the card, such as capacitors, from accidental damage. The green printed circuit board combined with the black heatsink shroud makes the 8800 GT easy to recognize. The model is marked with the GeForce logo along the top edge of the front panel. Mark Rein, the company's vice president, told reporters that this was an extra expense, but a necessary one to help users figure out which video card is at the heart of a system at LAN parties.
Under the heatsink sit eight 512-megabit graphics memory chips, for a total of 512 MB of storage. This is GDDR3 DRAM rated at an effective frequency of up to 2000 MHz. The graphics processor supports both GDDR3 and GDDR4, but the latter was never used in this series.
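At the reference 1800 MHz effective memory clock, the 256-bit interface determines the card's peak memory bandwidth; a small sketch of the arithmetic:

```python
BUS_BITS = 256            # memory interface width
EFFECTIVE_MHZ = 1800      # reference effective clock; the chips are rated to 2000

# bytes per transfer across the full bus, times mega-transfers per second
bandwidth_gb_s = BUS_BITS / 8 * EFFECTIVE_MHZ / 1000
print(bandwidth_gb_s)     # 57.6 GB/s at reference clocks
```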
Heat and power consumption
The nVidia GeForce 8800 GT is a genuinely good-looking card. Its design is simply pleasing to the eye and, given the internal changes in the G92, it comes across as a mature design.
More important than aesthetics, however, according to users, is that the manufacturer managed to fit all this power into a single-slot device. This is not just a welcome change; it is a pleasant surprise. The GeForce 8800 GT's specifications would lead you to expect a two-slot cooler. The reason nVidia could afford such a slim design is the new production process, which cut heat output to a level a low-profile fan can handle. In fact, thermals have dropped so much that even the relatively small cooler does not have to spin very fast, so the card remains virtually silent even in demanding games. The board temperature still rises significantly, however, so a good amount of airflow is needed to prevent overheating. Thanks to the process shrink, the GeForce 8800 GT 512MB consumes only 105 W even under full load, so a single six-pin power connector suffices. This is another nice change.
This is the first card to support PCIe 2.0, which allows it to receive up to 150 watts through the slot. However, the company decided that for backward compatibility it was simpler to limit slot power to 75 watts. This means that whether the card is plugged into a PCIe 1.1 or a PCIe 2.0 motherboard, only 75 watts come through the slot, with the rest of the power delivered through the auxiliary connector.
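Putting the power figures from the last two paragraphs together shows why one 6-pin plug is enough; a quick check (the 75 W figures are the limits stated above, the 105 W is the quoted full-load draw):

```python
SLOT_W = 75        # slot draw, capped at 75 W for PCIe 1.1 compatibility
SIX_PIN_W = 75     # one auxiliary 6-pin PCIe power connector
CARD_MAX_W = 105   # GeForce 8800 GT full-load consumption quoted above

available = SLOT_W + SIX_PIN_W
headroom = available - CARD_MAX_W
print(available, headroom)   # 150 W available, 45 W of headroom
```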
Speaking of HDCP signal support, it is worth touching on the new-generation video processor nVidia built into the G92. VP2 is a single programmable SIMD processor, flexible enough to be extended in the future. It handles very intensive processing of H.264-encoded video, shifting the load from the CPU to the GPU. In addition to VP2, there is also an H.264 stream processor and an AES128 decoder. The former is specifically designed to accelerate the CAVLC and CABAC coding schemes, tasks that are extremely CPU-intensive in a purely software setup. AES128 enables faster processing of the encryption required by video content protection schemes such as AACS and Media Foundation, both of which require video (compressed or uncompressed) to be encrypted when transferred over buses like PCI Express.
nVidia has been working hard to improve the transparency anti-aliasing technique that first appeared in the GeForce 7 series. Multisampling costs the card little performance, but in most cases it is not very effective. Supersampling, on the other hand, delivers much better and more stable image quality, but at the cost of speed: it is an incredibly resource-intensive smoothing method.
The drivers that ship with the video card contain a new multisampling algorithm. The differences are quite significant, but the final decision is left to the user. The good news is that since this is a driver-level change, any hardware that supports transparency anti-aliasing can use the new algorithm, including cards released since the GeForce 7800 GTX. To activate the new mode, you only need to download the latest updates from the manufacturer's website.
According to user reviews, driver updates for the GeForce 8800 GT are not hard to find. Although the video card's web page contains only links to files for Windows Vista and XP, a search from the main page turns up what you need. For the nVidia GeForce 8800 GT, Windows 7-10 drivers are installed by the GeForce 342.01 Driver package, a 292 MB download.
The nVidia GeForce 8800 GT's output connectors are quite standard: two dual-link DVI-I ports with HDCP support, suitable for both analog and digital monitor and TV interfaces, and a 7-pin analog video port providing the usual composite and component output. The DVI connectors can be used with DVI-VGA and DVI-HDMI adapters, so any connection option is possible. Nevertheless, nVidia still leaves audio over HDMI as an optional feature for third-party manufacturers; there is no audio processor inside the VP2, so sound is routed through an S/PDIF connector on the board. This is disappointing, because such a thin and quiet card would be ideal for a gaming home theater.
The GeForce 8800 GT is the first graphics card compatible with PCI Express 2.0, which means it can access memory at 16 GB/s, twice as fast as the previous standard. While this may be useful for workstations and heavy computation, it does little for the ordinary gamer. In any case, the standard is fully compatible with all previous versions of PCIe, so there is nothing to worry about.
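The 16 GB/s figure can be reconstructed from the lane arithmetic; a sketch, assuming the usual PCIe 2.0 parameters (5 GT/s per lane with 8b/10b encoding) and that the quoted total counts both directions of an x16 link:

```python
LANES = 16           # x16 slot
GT_PER_S = 5.0       # PCIe 2.0 signaling rate per lane, in gigatransfers/s
ENCODING = 8 / 10    # 8b/10b line coding: 8 data bits per 10 transferred

per_direction_gb_s = LANES * GT_PER_S * ENCODING / 8  # bits -> bytes
both_directions = per_direction_gb_s * 2
print(per_direction_gb_s, both_directions)  # 8.0 GB/s each way, 16.0 total
```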
Partner companies nVidia offer overclocked versions of the GeForce 8800 GT, as well as packages with games.
BioShock from 2K Games
BioShock was one of the best games around at the time of the card's release. It is a "genetically modified" first-person shooter set in the underwater city of Rapture, built at the bottom of the Atlantic Ocean by a man named Andrew Ryan as the realization of his dream in the art deco style of the 1930s. 2K Boston and 2K Australia licensed Epic Games' Unreal Engine 3 to achieve the best effect and also used some DirectX 10 features, all controlled through an option in the game's control panel.
BioShock's setting forced the developers to use a lot of water shaders. DirectX 10 helped improve the ripples as characters move through water, and pixel shaders were used extensively to create wet objects and surfaces. In addition, the DX10 version of the game uses a depth buffer to create "soft" particle effects that interact with their surroundings and look more realistic.
The nVidia GeForce 8800 GT, whose specifications let it show its strength in BioShock, is only slightly behind the GTX at a resolution of 1680x1050. Raising the resolution widens the gap between the cards, but not by much. The reason is probably that the game did not support transparency anti-aliasing, which makes the 8800 GTX's massive memory bandwidth advantage moot.
According to user feedback, the 8800 GT also does quite well with SLI enabled. Although its capabilities do not quite match the GTX, it competes with the Radeon HD 2900 XT 512MB in a CrossFire configuration. Perhaps even more interesting is that at 1920x1200 the 8800 GT is almost as fast as the GTS 640MB!
Crysis Single Player Demo from Electronic Arts
This game will make any graphics card literally cry! Its big surprise was the graphics, which surpassed everything seen in computer games before it. Testing with the built-in GPU benchmark runs much faster than actual gameplay. Around 25 fps in the benchmark is enough for a comfortable frame rate; unlike other games, a low frame rate here still looks fairly smooth.
The nVidia GeForce 8800 GT, whose specifications in Crysis allow a sufficient frame rate at 1680x1050 with high details under DirectX 10, is not as fast as the GTX, but it is noticeably more productive than the Radeon HD 2900 XT and the 8800 GTS 640MB. The GTS 320MB can barely cope with Crysis, and most settings will have to be dropped to medium to get above 25 fps even at 1280x1024.
As expected, the 8800 GTX remains unbeaten, but overall the GeForce 8800 GT outperforms the GTS in most tests. At the highest resolutions and anti-aliasing settings, the GT's reduced memory bandwidth lets it down and the GTS pulls ahead from time to time. Given the price difference and its other advantages, however, the 8800 GT is still the better choice. Conversely, the GeForce 8800 GTX versus 8800 GT comparison confirms each time why the former is so expensive. While other models slow down significantly as the pixel count grows and transparency anti-aliasing and anisotropic filtering are applied, the 8800 GTX continues to post excellent results. In particular, Team Fortress 2 at 1920x1200 with 8xAA and 16xAF runs twice as fast on the 8800 GTX as on the GT. For the most part, though, the GeForce 8800 GT graphics card performs well, if you discount the incredibly low frame rate in Crysis.
Although the GeForce 8800 GT's specifications do not exceed those of the series flagship, the 8800 GTX, it delivers close performance at a fraction of the price and also includes many additional features. Add the small size and quiet operation, and the model seems simply phenomenal.