GeForce 3 series

[Image: NVIDIA logo]

The GeForce 3 (codenamed NV20) was NVIDIA's third-generation GeForce chip. The range included three related chips: the original GeForce 3, followed later by the GeForce 3 Ti500 and the GeForce 3 Ti200. The professional version of the GeForce 3 was the Quadro DCC. The NV2A graphics chip in the Xbox is closely related to the GeForce 3, but it has two vertex shader units rather than the single unit found in the GeForce 3 line.

The first GeForce 3 chips were released in March 2001, three months after Nvidia bought out the near-defunct 3dfx. The chip differed from the previous-generation GeForce 256 and GeForce 2 in three main areas. The first was the addition of programmable vertex and pixel shaders: specialised units, required under the DirectX 8.0 specification, that execute small custom transform-and-lighting and per-pixel programs, greatly increasing the card's flexibility. (The GeForce 2 had non-programmable pixel shaders, which were little used outside of tech demos.) The second was the Lightspeed Memory Architecture (LMA), which excluded overdrawn objects (those obscured from view) from processing, conserved memory bandwidth by compressing the Z-buffer (depth buffer), and managed the memory bus more efficiently; earlier GeForce chips lacked hardware memory management. The third was a change in anti-aliasing from super-sampling to multi-sampling, which is more efficient because each pixel's colour is shaded only once rather than once per sample.
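To make the idea of a programmable vertex shader concrete, the sketch below is a rough Python illustration of the kind of per-vertex work such a program performed: transforming the position by a combined model-view-projection matrix and computing a simple diffuse lighting term. The names (vertex_program, mvp, light_dir, base_color) are invented for this example; actual DirectX 8 vertex shaders were short vs.1.1 assembly programs executed by the hardware vertex unit.

    # Illustrative sketch only; ordinary Python, not actual shader code.
    def vertex_program(position, normal, mvp, light_dir, base_color):
        # 4x4 matrix times the homogeneous position: the "transform" half of T&L.
        vec = (position[0], position[1], position[2], 1.0)
        pos = [sum(mvp[row][col] * vec[col] for col in range(4))
               for row in range(4)]
        # Lambertian diffuse term: the "lighting" half of T&L.
        ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
        color = [c * ndotl for c in base_color]
        return pos, color

    # Example: identity transform, light pointing along +z, white vertex colour.
    identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
    pos, col = vertex_program((1.0, 2.0, 3.0), (0.0, 0.0, 1.0),
                              identity, (0.0, 0.0, 1.0), (1.0, 1.0, 1.0))
    print(pos, col)   # [1.0, 2.0, 3.0, 1.0] [1.0, 1.0, 1.0]

On real hardware the same computation would compile to only a handful of vs.1.1 instructions (dp4 for the four matrix rows, dp3 and max for the diffuse term), executed once for every vertex submitted to the card.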

In terms of performance, the GeForce 3 sometimes lost to the GeForce2 Ultra, which was clocked higher (250 MHz core/460 MHz memory, against the GeForce 3's 200 MHz core/460 MHz memory). However, the GeForce 3 led whenever anti-aliasing was enabled, and its vertex and pixel shaders, aimed at future games, allowed it to replace the GeForce 2 Ultra as Nvidia's top-of-the-line product. The GeForce 3 easily outperformed the GeForce 2, Radeon and Voodoo 5 cards, and it probably would have outperformed the never-released Voodoo 5 6000 as well, especially with anti-aliasing enabled.

The second revisions, the GeForce 3 Ti500 and Ti200, were released in October 2001, around the same time as ATI's top-of-the-line Radeon 8500 and the Radeon 7500. The Ti500 had higher core and memory clocks (250 MHz core/500 MHz memory) and was designed to outperform the Radeon 8500. Initially the Ti500 held the upper hand, but driver improvements eventually allowed the 8500 to match and in some cases surpass it. In addition, the 8500 was significantly cheaper than the Ti500 and offered dual-monitor support, which the entire GeForce 3 series lacked. As a result, the Ti500 never became widespread, and in spring 2002 it was replaced by the GeForce 4 Ti line, which performed slightly better, included dual-monitor support and, most importantly, could be produced at lower cost.

[Image: GeForce3 Ti 200 GPU]

The Ti200 was a cheaper chip (175 MHz core/400 MHz memory) clocked lower than the original GeForce 3 (200 MHz core/460 MHz memory), and it easily surpassed the Radeon 7500 in speed and feature set, apart from dual-monitor support. Some say the Ti500's yields were unexpectedly poor and that the Ti200 was a way of recouping the cost of making the chips, though this has never been confirmed. At half the price of the Ti500, with the same features and much of the performance, the Ti200 proved popular with enthusiasts, as it could be overclocked to original GeForce 3 speeds (and in some cases Ti500 speeds). ATI rolled out the Radeon 8500LE in early 2002 to compete in the niche occupied by the Ti200.

The GeForce 2 and GeForce 3 lines were replaced in early 2002 by the GeForce 4 MX and Ti lines, respectively. However, the GeForce 3 Ti200 was kept in production for a short while, as it occupied a spot between the Ti 4200 and the MX 460 in performance. The discontinuation of the GeForce 3 Ti200 and Radeon 8500LE disappointed many enthusiasts, because the performance-oriented Ti 4200 had not yet fallen to midrange prices, while the mass-market Radeon 9000 was not as fast as the Ti200 and 8500LE.

The GeForce 3 remained at or near the top of the performance rankings throughout its lifetime. Unlike most Nvidia graphics products, it was always aimed at the high-end and upper-midrange gaming market, and at no time was there a cheap, entry-level version of it; nor was there any need for one, as the numerous GeForce 2 variants were well placed to serve the mass market. Even the Ti200, released relatively late in the chip's life, was priced squarely at the upper end of the mid-range and had performance to match.

See also

* Comparison of NVIDIA Graphics Processing Units