GeForce 2 series
The GeForce 2 (codenamed NV15) was the second generation of GeForce graphics cards by NVIDIA Corporation.
The first model, the GeForce 2 GTS, was named for its texel fill rate of 1.6 billion texels per second ("GigaTexel Shader"). Due to the addition of a second TMU (texture mapping unit) to each of its four pixel pipelines, and a higher core clock (200 MHz vs. 120 MHz), the GTS's texel fill rate is roughly 3.3 times that of its predecessor, the GeForce 256 (480 Mtexels/s). Other hardware enhancements included an upgraded video-processing pipeline called HDVP (High-Definition Video Processor). HDVP supported motion-video playback at HDTV resolutions (MP@HL), although playback of high-resolution video still required a powerful CPU.
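The arithmetic behind these figures can be checked directly. A minimal sketch, using only the pipeline counts and clock speeds quoted above:

```python
# Peak texel fill rate = pixel pipelines x TMUs per pipeline x core clock.
# All figures are the nominal rates quoted in this article.

def texel_fill_rate(pipelines: int, tmus_per_pipe: int, core_mhz: int) -> int:
    """Peak texel fill rate in megatexels per second."""
    return pipelines * tmus_per_pipe * core_mhz

gts = texel_fill_rate(pipelines=4, tmus_per_pipe=2, core_mhz=200)         # 1600 Mtexels/s
geforce256 = texel_fill_rate(pipelines=4, tmus_per_pipe=1, core_mhz=120)  # 480 Mtexels/s

print(f"GeForce 2 GTS: {gts / 1000:.1f} Gtexels/s")   # 1.6
print(f"GeForce 256:   {geforce256} Mtexels/s")       # 480
print(f"Ratio:         {gts / geforce256:.1f}x")      # ~3.3x
```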
In 3D benchmarks and gaming applications, the GTS outperformed its predecessor (the GeForce 256) by up to 30%, though it was only about 10% faster than the DDR-SDRAM GeForce. In OpenGL games (such as Quake III), the GTS outperformed the ATI Radeon and 3dfx Voodoo 5 cards in both 16 bpp and 32 bpp (true-color) display modes. In Direct3D games, however, the Radeon was sometimes able to take the lead in 32-bit color modes; this was attributed to the Radeon's faster memory and superior memory controller. The state of PC gaming software, and the newness of the DirectX 7 API, likely limited the amount of game software able to take advantage of hardware multitexturing. Most games emphasized single-layer artwork, which did not benefit from the multitexturing hardware found in the GeForce 2 or Radeon.
There were three more revisions of the GeForce 2 GTS core. The first was the GeForce 2 Ultra, launched in late 2000. Architecturally identical to the GTS, the Ultra shipped with higher core and memory clocks. This gave it a definite lead over the Radeon and Voodoo 5, and it even outperformed the first GeForce 3 products. (The GeForce 3 Ti500, in late 2001, finally overtook the GeForce 2 Ultra.) Some speculate that the Ultra was intended to defeat 3dfx's Voodoo 5 6000; later tests showed the Ultra outperforming it, but the Voodoo 5 6000 never reached the consumer market. The other two GTS revisions were the GeForce 2 Pro and the GeForce 2 Ti (for "titanium"). Both parts fell between the GTS and the Ultra in performance. They were positioned as cheaper, but less advanced (feature-wise), alternatives to the high-end GeForce 3, which lacked a mass-market version.
Finally, the most successful GeForce 2 part was the budget-model GeForce 2 MX. In terms of sales volume, no graphics processor before or since has matched the GeForce 2 MX and its variants, partly because the GeForce 3 never had a budget model: the GeForce 2 MX family remained NVIDIA's mainstream processor for two and a half years, compared with the more usual one year.

The MX retained the GTS's core 3D architecture and feature set, but removed two of the four pixel pipelines and half of the GTS's memory bandwidth. NVIDIA also added true dual-display support to the MX. (The GTS and subsequent models could drive a separate TV encoder, but this second display was always tied to the primary desktop.) The MX's efficient design - about 4 watts of power consumption, versus 8 watts for the GeForce 2 GTS and 16 for the old GeForce 256 - enabled it to be easily adapted to the laptop market as the GeForce 2 Go. ATI's competing Radeon VE (later Radeon 7000) did not offer hardware T&L, while the Radeon SDR was released late and was still too expensive. In addition to arriving early and offering the best price/performance ratio, the MX and the rest of the GeForce 2 line were backed by a single, reliable driver, unlike ATI, whose products suffered from unreliable drivers.
The MX performed well enough to be a viable mainstream alternative to the GTS and its later revisions. Among gamers, the MX effectively replaced the older TNT2 cards. NVIDIA eventually split the MX product line into performance-oriented (MX400) and cost-oriented (MX200) versions. The MX200 had a 64-bit SDR memory bus, which greatly limited its gaming potential. The MX400, like the original MX, had a 128-bit SDR memory bus that could also be configured as 64-bit DDR.
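As a rough sketch of what those bus widths mean for peak bandwidth (the 166 MHz memory clock below is an illustrative assumption, not a figure given in this article): bandwidth is bus width times effective transfer rate, so a 64-bit DDR bus is roughly equivalent to a 128-bit SDR bus, while the MX200's 64-bit SDR bus delivers half as much.

```python
# Peak memory bandwidth = (bus width in bytes) x (memory clock) x (transfers per clock).
# NOTE: the 166 MHz clock is an illustrative assumption, not a spec from this article.

def bandwidth_gb_s(bus_bits: int, clock_mhz: int, ddr: bool) -> float:
    """Peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    transfers = 2 if ddr else 1
    return (bus_bits / 8) * clock_mhz * 1e6 * transfers / 1e9

mx_sdr = bandwidth_gb_s(bus_bits=128, clock_mhz=166, ddr=False)  # MX/MX400, 128-bit SDR
mx_ddr = bandwidth_gb_s(bus_bits=64,  clock_mhz=166, ddr=True)   # alternative 64-bit DDR
mx200  = bandwidth_gb_s(bus_bits=64,  clock_mhz=166, ddr=False)  # MX200, 64-bit SDR

print(f"128-bit SDR: {mx_sdr:.1f} GB/s")  # ~2.7 GB/s
print(f" 64-bit DDR: {mx_ddr:.1f} GB/s")  # ~2.7 GB/s (equivalent)
print(f" 64-bit SDR: {mx200:.1f} GB/s")   # ~1.3 GB/s (half)
```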
The GeForce 2 MX's successor was the GeForce 4 MX. Although many disappointed enthusiasts described the GeForce 4 MX as a GeForce 2 MX with a better (128-bit DDR) memory controller, the GeForce 4 MX also owes a good deal of its design heritage to NVIDIA's high-end CAD products. As a result, the GeForce 4 MX (with the exception of the MX420) efficiently achieves performance similar to or better than that of the "brute-force" GeForce 2 Ultra.
Models
(Performance ranking, slowest to fastest)
- GeForce 2 MX200
- GeForce 2 MX
- GeForce 2 MX400
- GeForce 2 GTS
- GeForce 2 Pro
- GeForce 2 Ti
- GeForce 2 Ultra
Chip table
| GeForce 2 chip | Triangles per second (millions) | Pixel fill rate (Gpixels/s) | Memory bandwidth (GB/s) |
|---|---|---|---|
| Ultra | 31 | 1.0 | 7.3 |
| Ti | 31 | 1.0 | 6.4 |
| Pro | 25 | 0.8 | 6.4 |
| GTS | 25 | 0.8 | 5.3 |
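These figures follow from each chip's clock speeds. A sketch of the derivation, assuming commonly cited core and memory clocks (which are not listed in the table) and a per-clock triangle rate inferred from the table itself:

```python
# Reconstructing the chip table from clock speeds. The core/memory clocks below
# are commonly cited specifications, assumed here; they do not appear in the table.

CHIPS = {
    # name: (core clock MHz, memory clock MHz); all four parts use 128-bit DDR memory
    "Ultra": (250, 230),
    "Ti":    (250, 200),
    "Pro":   (200, 200),
    "GTS":   (200, 166),
}

for name, (core_mhz, mem_mhz) in CHIPS.items():
    tris_mps   = core_mhz * 0.125         # ~0.125 triangles/clock, inferred from the table
    pixels_gps = 4 * core_mhz / 1000      # 4 pixel pipelines, 1 pixel each per clock
    bw_gb_s    = 16 * mem_mhz * 2 / 1000  # 128 bits = 16 bytes; DDR doubles the rate
    print(f"{name:5s}: {tris_mps:.0f} Mtris/s, {pixels_gps:.1f} Gpixels/s, {bw_gb_s:.2f} GB/s")
```

Under these assumptions the output matches the table, except that the Ultra works out to 7.36 GB/s, which the table rounds down to 7.3.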