GeForce 2 series
The GeForce 2 was the second generation of GeForce graphics cards by NVIDIA Corporation.
The first model, the GeForce 2 GTS, was named for its texel fillrate of 1.6 billion texels per second ("GigaTexel Shader"). Thanks to a second TMU (texture map unit) on each of its four pixel pipelines and a higher core clock (200 MHz vs. 120 MHz), the GTS's texel fillrate is roughly 3.3 times that of its predecessor, the GeForce 256 (480 Mtexels/s). Other hardware enhancements included an upgraded video-processing pipeline called HDVP (High-Definition Video Processor). HDVP supported motion-video playback at HDTV resolutions (MP@HL), although playback of high-resolution video still required a powerful CPU.
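The GigaTexel figure falls directly out of the pipeline arithmetic. A minimal sketch in Python, using only the pipeline, TMU, and clock numbers quoted in the paragraph above:

```python
# Texel fillrate = pixel pipelines x TMUs per pipeline x core clock (MHz).
def texel_fillrate_mtexels(pipelines, tmus_per_pipeline, core_clock_mhz):
    """Theoretical peak texel fillrate in Mtexels/s."""
    return pipelines * tmus_per_pipeline * core_clock_mhz

geforce_256 = texel_fillrate_mtexels(4, 1, 120)   # 480 Mtexels/s
geforce2_gts = texel_fillrate_mtexels(4, 2, 200)  # 1600 Mtexels/s = 1.6 GTexels/s

print(geforce2_gts / geforce_256)  # ~3.33, the "3.3 times" figure above
```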
In 3D benchmarks and gaming applications, the GTS outperformed its predecessor (the GeForce 256) by up to 30%, and the DDR-SDRAM GeForce 256 by only about 10%. In OpenGL games such as Quake III, the GTS outperformed the ATI Radeon and 3dfx Voodoo 5 cards in both 16 bpp and 32 bpp (true-color) display modes, but in Direct3D games the Radeon was sometimes able to take the lead in 32-bit color modes. This was attributed to the Radeon's faster memory and superior memory controller. The state of PC gaming software and the newness of the DirectX 7 API likely limited the amount of game software able to take advantage of hardware multitexturing; most games emphasized single-layer artwork, which did not benefit from the multitexturing hardware found in the GeForce 2 or Radeon.
There were three more revisions of the GeForce 2 GTS core. The first was the GeForce 2 Ultra, launched in late 2000. Architecturally identical to the GTS, the Ultra shipped with higher core and memory clocks, putting a definite lead between it and the Radeon and Voodoo 5; it even outperformed the first GeForce 3 products. (Only the GeForce 3 Ti500, in late 2001, finally overtook the GeForce 2 Ultra.) Some speculate the Ultra was intended to defeat 3dfx's Voodoo 5 6000, and while later tests did show the Ultra outperforming it, the Voodoo 5 6000 never reached the consumer market. The other two GTS revisions were the GeForce 2 Pro and the GeForce 2 Ti (for "titanium"). Both parts fell between the GTS and the Ultra, and were positioned as cheaper, but less advanced (feature-wise), alternatives to the high-end GeForce 3, which lacked a mass-market version.
Finally, the most successful GeForce 2 part was the budget-model GeForce 2 MX. In terms of sales volume, no graphics processor before or since has matched the GeForce 2 MX (and its variants). The MX retained the GTS's core 3D architecture and feature set, but removed two of the four pixel pipelines and halved the memory bandwidth. NVIDIA also added true dual-display support to the MX. (The GTS and subsequent models could drive a separate TV encoder, but this second display was always tied to the primary desktop.)
The MX performed well enough to be a viable mainstream alternative to the GTS (and its later revisions). Among the gamer community, the MX effectively replaced the older TNT2 cards. NVIDIA eventually split the MX product line into a "performance-oriented" MX400 and a "cost-oriented" MX200. The MX200 had a 64-bit SDR memory bus, greatly limiting its gaming potential. The MX400, like the original MX, had a 128-bit SDR memory bus, which could also be configured as 64-bit DDR; the two configurations yield the same peak bandwidth, as the sketch below shows.
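The bus-width options are easiest to compare as bandwidth arithmetic. A minimal sketch, assuming the roughly 166 MHz memory clock commonly cited for the MX line (the clock figure is an assumption, not taken from this article):

```python
# Peak bandwidth = bus width (bytes) x memory clock (MHz) x transfers/clock.
# The ~166 MHz memory clock is an assumed, commonly cited figure for the MX.
def bandwidth_gbs(bus_width_bits, clock_mhz, ddr=False):
    """Theoretical peak memory bandwidth in GB/s."""
    transfers_per_clock = 2 if ddr else 1
    return bus_width_bits / 8 * clock_mhz * transfers_per_clock / 1000

mx200 = bandwidth_gbs(64, 166)            # ~1.3 GB/s: 64-bit SDR
mx400_sdr = bandwidth_gbs(128, 166)       # ~2.7 GB/s: 128-bit SDR
mx400_ddr = bandwidth_gbs(64, 166, True)  # ~2.7 GB/s: 64-bit DDR, same peak

gts = bandwidth_gbs(128, 166, True)       # ~5.3 GB/s: the GTS's 128-bit DDR
# The MX's ~2.7 GB/s is half the GTS's ~5.3 GB/s, matching the
# "halved the memory bandwidth" claim above.
```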
Models
(Performance ranking, slowest to fastest)
- GeForce 2 MX200
- GeForce 2 MX
- GeForce 2 MX400
- GeForce 2 GTS
- GeForce 2 Pro
- GeForce 2 Ti
- GeForce 2 Ultra
Chipset table
GeForce2 chipset | Triangles per second (millions) | Pixel fillrate (Gpixels/s) | Memory bandwidth (GB/s)
---|---|---|---
Ultra | 31 | 1.0 | 7.3
Ti | 31 | 1.0 | 6.4
Pro | 25 | 0.8 | 6.4
GTS | 25 | 0.8 | 5.3
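The fillrate and bandwidth columns follow directly from each part's core and memory clocks. A minimal sketch reproducing them; the clock figures below are commonly cited specifications, not taken from this article, so treat them as assumptions:

```python
# Pixel fillrate = 4 pipelines x core clock; bandwidth = 16-byte bus x DDR rate.
# (core clock MHz, memory clock MHz) -- assumed, commonly cited figures.
chips = {
    "Ultra": (250, 230),
    "Ti":    (250, 200),
    "Pro":   (200, 200),
    "GTS":   (200, 166),
}

for name, (core_mhz, mem_mhz) in chips.items():
    gpixels = 4 * core_mhz / 1000   # 4 pixel pipelines
    gbs = 16 * mem_mhz * 2 / 1000   # 128-bit (16-byte) DDR bus
    print(f"{name}: {gpixels:.1f} Gpixels/s, {gbs:.1f} GB/s")

# Note: the Ultra works out to 7.36 GB/s; the table truncates this to 7.3.
```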