GeForce 2 series: Difference between revisions
{{hatnote|For GeForce cards with a model number of 2X0, see [[GeForce 200 series]]. For GeForce cards with a model number of 20X0, see [[GeForce 20 series]].}}
{{Use mdy dates|date=October 2018}}
{{Short description|Series of GPUs by Nvidia}}
{{Infobox GPU
| name = GeForce 2 series
| image = [[File:Geforce2logo.png]]<br/>[[File:NVIDIA@180nm@Fixed-pipeline@NV15@GeForce2 GTS@F30213.01 0031A3 DSC02339 (26941631266).jpg|300px]]
| caption = '''Top''': Logo of the GeForce 2 series<br/>'''Bottom''': Nvidia GeForce 2 GTS ([[Asus]]-branded) with its cooler removed, showing the NV15 die
| codename = NV11, NV15, NV16
| created = mid-May, {{start date and age|2000}}<ref>{{cite web |author=Ross, Alex |url=http://www.sharkyextreme.com/hardware/articles/nvidia_geforce2_gts_guide/26.shtml |title=NVIDIA GeForce2 GTS Guide |publisher=SharkyExtreme |date=April 26, 2000 |archive-url=https://web.archive.org/web/20040823113943/http://www.sharkyextreme.com/hardware/articles/nvidia_geforce2_gts_guide/26.shtml |archive-date=August 23, 2004 }}</ref>
| model1 = GeForce MX series
| model2 = GeForce GTS series
| model3 = GeForce Pro series
| model4 = GeForce Ti series
| model5 = GeForce Ultra series
| entry = MX
| midrange = GTS, Pro
| highend = Ti, Ultra
| openglversion = [[OpenGL#OpenGL 1.2|OpenGL 1.2]] ([[Transform, clipping, and lighting|T&L]])
| d3dversion = [[Microsoft Direct3D#Direct3D 7.0|Direct3D 7.0]]
| predecessor = [[GeForce 256]]
| successor = [[GeForce 3 series]]
| support status = Unsupported
| architecture = [[Celsius (microarchitecture)|Celsius]]
}}

The '''GeForce 2 series''' (NV15) is the second generation of [[Nvidia]]'s [[GeForce]] line of [[graphics processing unit]]s (GPUs). Introduced in 2000, it is the successor to the [[GeForce 256]].
The GeForce 2 family comprised a number of models: ''GeForce 2 GTS'', ''GeForce 2 Pro'', ''GeForce 2 Ultra'', ''GeForce 2 Ti'', ''GeForce 2 Go'' and the ''GeForce 2 MX'' series. In addition, the GeForce 2 architecture is used for the [[Nvidia Quadro|Quadro]] series on the Quadro 2 Pro, 2 MXR, and 2 EX cards with special drivers meant to accelerate [[computer-aided design]] applications.

==Architecture==
[[Image:GeForce2 Ultra GPU.jpg|thumb|GeForce2 Ultra GPU]]
[[File:NVIDIA@180nm@Fixed-pipeline@NV15@GeForce2 GTS@F30213.01 0031A3 Stack-DSC03235-DSC03247 - ZS-DMap (26370843253).jpg|thumb|Die shot of a GeForce 2 GPU]]

The GeForce 2 architecture is similar to that of the previous GeForce 256 line, but with various improvements. Compared to the 220 [[nanometre|nm]] GeForce 256, the GeForce 2 is built on a 180 nm manufacturing process, making the silicon denser and allowing for more transistors and a higher clock speed. The most significant change for 3D acceleration is the addition of a second [[texture mapping unit]] (TMU) to each of the four [[Graphics pipeline#Scan conversion or rasterization|pixel pipeline]]s. Some say{{who|date=January 2013}} the second TMU was already present in the original GeForce 256 but dual-texturing was disabled due to a hardware bug; the NSR's unique ability to do single-cycle trilinear texture filtering supports this suggestion. The second TMU doubles the texture [[fillrate]] per clock compared to the previous generation and is the reasoning behind the GeForce 2 GTS's naming suffix: GigaTexel Shader (GTS). The GeForce 2 also formally introduces the NSR (Nvidia Shading Rasterizer), a primitive type of programmable pixel pipeline that is somewhat similar to later pixel [[shader]]s; this functionality is also present in the GeForce 256 but was unpublicized. Another hardware enhancement is an upgraded video processing pipeline, called ''HDVP'' (high-definition video processor). HDVP supports motion video playback at [[High-definition television|HDTV]] resolutions (MP@HL).<ref>{{cite web |author=Lal Shimpi, Anand |url=http://www.anandtech.com/showdoc.aspx?i=1231&p=2 |title=NVIDIA GeForce 2 GTS |publisher=Anandtech |date=April 26, 2000 |pages=2 |access-date=July 2, 2009}}</ref>

In 3D benchmarks and gaming applications, the GeForce 2 GTS outperforms its predecessor by up to 40%.<ref>{{cite web |author=Lal Shimpi, Anand |url=http://www.anandtech.com/showdoc.aspx?i=1231&p=10 |title=NVIDIA GeForce 2 GTS |publisher=Anandtech |date=April 26, 2000 |access-date=June 14, 2008}}</ref> In [[OpenGL]] games (such as [[Quake III Arena|Quake III]]), the card outperforms the [[Radeon R100|ATI Radeon DDR]] and [[3dfx]] [[Voodoo 5|Voodoo 5 5500]] cards in both 16 [[color depth|bpp]] and 32 bpp display modes. However, in [[Microsoft Direct3D|Direct3D]] games running 32 bpp, the Radeon DDR is sometimes able to take the lead.<ref>{{cite web |author=Witheiler, Matthew |url=http://www.anandtech.com/showdoc.aspx?i=1281 |title=ATI Radeon 64MB DDR |publisher=Anandtech |date=July 17, 2000 |access-date=June 14, 2008}}</ref>

The GeForce 2 architecture is quite memory bandwidth constrained.<ref>{{cite web |author=Lal Shimpi, Anand |url=http://www.anandtech.com/showdoc.aspx?i=1298&p=2 |title=NVIDIA GeForce 2 Ultra |publisher=Anandtech |date=August 14, 2000 |access-date=June 14, 2008}}</ref> The GPU wastes memory bandwidth and pixel fillrate due to unoptimized [[z-buffer]] usage, drawing of [[Hidden surface determination|hidden surfaces]], and a relatively inefficient RAM controller. The main competition for the GeForce 2, the ATI Radeon DDR, has hardware functions (called [[HyperZ]]) that address these issues.<ref>{{cite web |author=Lal Shimpi, Anand |url=http://www.anandtech.com/showdoc.aspx?i=1230&p=5 |title=ATI Radeon 256 Preview (HyperZ) |publisher=Anandtech |date=April 25, 2000 |pages=5 |access-date=June 14, 2008}}</ref> Because of this inefficiency, the GeForce 2 GPUs could not approach their theoretical performance potential, and the Radeon, even with its significantly less powerful 3D architecture, offered strong competition. The later NV17 revision of the NV11 design used in the [[GeForce 4 series#GeForce4 MX|GeForce4 MX]] was more efficient.
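The per-clock fillrate figures behind the "GigaTexel Shader" name follow from simple arithmetic. The sketch below is illustrative only, not Nvidia code; the helper names are ours, and the clock and pipeline counts are those given in this article:

```python
# Peak theoretical fillrates: one pixel per pipeline per clock, and one
# texel per TMU per clock. Illustrative helpers; names are ours.

def pixel_fillrate_mpixels(core_mhz, pipelines):
    return core_mhz * pipelines

def texel_fillrate_mtexels(core_mhz, pipelines, tmus_per_pipeline):
    return core_mhz * pipelines * tmus_per_pipeline

# GeForce 2 GTS: 200 MHz core, 4 pipelines, 2 TMUs per pipeline
gts = texel_fillrate_mtexels(200, 4, 2)        # 1600 MTexels/s = 1.6 GTexels/s

# GeForce 256: 120 MHz core, 4 pipelines, 1 TMU per pipeline
geforce256 = texel_fillrate_mtexels(120, 4, 1)  # 480 MTexels/s

print(gts, geforce256)  # 1600 480
```

The 1,600 MTexels/s result matches the GTS row of the models table and is the "gigatexel" throughput the GTS suffix refers to.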
===Releases===

The first models to arrive after the original GeForce 2 GTS were the ''GeForce 2 Ultra'' and ''GeForce 2 MX'', launched on September 7, 2000.<ref>{{cite web|url=http://www.nvidia.com/object/IO_20010618_6038.html|title=Press Release-NVIDIA|website=www.nvidia.com|access-date=April 22, 2018}}</ref> On September 29, 2000, Nvidia started shipping graphics cards with 16 and 32 MB of video memory.

Architecturally identical to the GTS, the Ultra simply has higher core and memory clock rates. The Ultra model actually outperforms the first [[GeForce 3]] products in some cases, because initial GeForce 3 cards have a significantly lower fillrate. However, the Ultra loses its lead when anti-aliasing is enabled, because of the GeForce 3's new memory bandwidth/fillrate efficiency mechanisms; moreover, the GeForce 3 has a superior next-generation feature set, with programmable vertex and pixel shaders for DirectX 8.0 games.

The ''GeForce 2 Pro'', introduced shortly after the Ultra, was an alternative to the expensive top-line Ultra and is faster than the GTS.

In October 2001, the ''GeForce 2 Ti'' was positioned as a cheaper and less advanced alternative to the GeForce 3. Faster than the GTS and Pro but slower than the Ultra, the GeForce 2 Ti performed competitively against the [[Radeon R100|Radeon 7500]], although the 7500 had the advantage of dual-display support. This mid-range GeForce 2 release was replaced by the [[GeForce 4 series#GeForce4 MX|GeForce4 MX]] series as the budget/performance choice in January 2002.

On its 2001 product web page, Nvidia initially placed the Ultra as a separate offering from the rest of the GeForce 2 lineup (GTS, Pro, Ti); however, by late 2002, with the GeForce 2 considered a discontinued product line, the Ultra was included alongside the GTS, Pro, and Ti on the GeForce 2 information page.
==GeForce 2 MX==

[[Image:GeForce2MX200AGP.JPG|thumb|GeForce 2 MX200 AGP]]
[[File:NVIDIA@180nm@Fixed-pipeline@NV11@GeForce2 MX400@Q26257.1 0315B3 Stack-DSC01049-DSC01078 - ZS-DMap (26336569015).jpg|thumb|Die shot of the MX400 GPU]]

Since the previous GeForce 256 line shipped without a budget variant, the [[RIVA TNT2]] series was left to fill the "low-end" role—albeit with a comparably obsolete feature set. To create a better low-end option, Nvidia created the GeForce 2 MX series, which offered the set of standard features common to the entire GeForce 2 generation, limited only by categorical tier. The GeForce 2 MX cards had two 3D pixel pipelines removed and reduced available memory bandwidth. The cards utilized either SDR SDRAM or DDR SDRAM with memory bus widths ranging from 32 to 128 bits, allowing circuit board cost to be varied. The MX series also provided dual-display support, something not found in the regular GeForce 256 and GeForce 2.

The prime competitors to the GeForce 2 MX series were ATI's [[Radeon R100|Radeon VE / 7000]] and [[Radeon R100|Radeon SDR]] (which, along with the other R100-based cards, was later renamed as part of the 7200 series). The Radeon VE had the advantage of somewhat better dual-monitor display software, but it did not offer hardware [[Transform and lighting|T&L]], an emerging 3D rendering feature of the day that was the major attraction of Direct3D 7. Further, the Radeon VE featured only a single rendering pipeline, causing it to produce a substantially lower fillrate than the GeForce 2 MX. The Radeon SDR, equipped with SDR SDRAM instead of the DDR SDRAM found in its more expensive brethren, was released some time later, and exhibited faster 32-bit 3D rendering than the GeForce 2 MX.<ref>{{cite web |author=FastSite |url=http://www.xbitlabs.com/articles/video/display/ati-radeon-sdr.html |title=ATI RADEON 32MB SDR Review |publisher=X-bit labs |date=December 27, 2000 |access-date=June 14, 2008 |archive-url=https://web.archive.org/web/20080725085837/http://www.xbitlabs.com/articles/video/display/ati-radeon-sdr.html |archive-date=July 25, 2008 |url-status=dead }}</ref> However, the Radeon SDR lacked multi-monitor support and debuted at a considerably higher price point than the GeForce 2 MX. 3dfx's Voodoo4 4500 arrived too late, and was both too expensive and too slow to compete with the GeForce 2 MX.

Members of the series include ''GeForce 2 MX'', ''MX400'', ''MX200'', and ''MX100''. The GPU was also used as an integrated graphics processor in the [[nForce]] chipset line and as a mobile graphics chip for notebooks called ''GeForce 2 Go''.
==Successor==

The successor to the GeForce 2 (non-MX) line is the [[GeForce 3 series|GeForce 3]]. The non-MX GeForce 2 line was reduced in price and saw the addition of the GeForce 2 Ti, in order to offer a mid-range alternative to the high-end GeForce 3 product.

Later, the entire GeForce 2 line was replaced with the [[GeForce 4 series#GeForce4 MX|GeForce4 MX]].

== Models ==

{{main|List of Nvidia graphics processing units#GeForce2_series}}
{{Further|Celsius (microarchitecture)}}

* All models support TwinView Dual-Display Architecture and Second Generation Transform and Lighting (T&L)
* GeForce2 MX models support Digital Vibrance Control (DVC)
{{Row hover highlight}}
{| class="mw-datatable wikitable sortable" style="font-size:85%; text-align:center;"
|-
! rowspan="2" style="vertical-align: bottom"|Model
! rowspan="2" style="vertical-align: bottom"|Launch
! rowspan="2" {{Vert header|[[Code name]]}}
! rowspan="2" style="vertical-align: bottom" {{Vert header|[[Semiconductor device fabrication|Fab]] ([[nanometer|nm]])<ref name="vintage3d">{{cite web |title=3D accelerator database |url=http://vintage3d.org/dbn.php |website=Vintage 3D |access-date=30 August 2024 |archive-url=https://web.archive.org/web/20181023222614/http://www.vintage3d.org/dbn.php |archive-date=23 October 2018 |url-status=live }}</ref>}}
! rowspan="2" {{Vert header|Transistors (million)}}
! rowspan="2" {{Vert header|Die size (mm<sup>2</sup>)}}
! rowspan="2" {{Vert header|[[Computer bus|Bus]] [[I/O interface|interface]]}}
! rowspan="2" {{Vert header|Core clock ([[Hertz|MHz]])}}
! rowspan="2" {{Vert header|Memory clock ([[Hertz|MHz]])}}
! rowspan="2" {{Vert header|Core config{{efn|name=geforce 2 1|[[Pixel pipeline]]s: [[texture mapping unit]]s: [[render output unit]]s}}}}
! colspan="4" |[[Fillrate]]
! colspan="4" |Memory
! rowspan="2" {{Vert header|Performance ([[FLOPS|GFLOPS]]<br/>[[FP32]])}}
! rowspan="2" {{Vert header|TDP (Watts)}}
|-
!{{Vert header|MOperations/s}}
!{{Vert header|MPixels/s}}
!{{Vert header|MTexels/s}}
!{{Vert header|MVertices/s}}
!{{Vert header|Size ([[Megabyte|MB]])}}
!{{Vert header|Bandwidth ([[Gigabyte|GB]]/s)}}
!{{Vert header|Bus type}}
!{{Vert header|Bus width ([[bit]])}}
|-
! style="text-align:left" |GeForce2 MX IGP + nForce 220/420
|June 4, 2001
| rowspan="4" |NV1A (IGP) / NV11 (MX)
| rowspan="6" |[[TSMC]]<br />[[180 nm]]
| rowspan="4" |20<ref>{{cite web|title=NVIDIA GeForce2 MX PCI Specs|url=https://www.techpowerup.com/gpu-specs/geforce2-mx-pci.c791|access-date=2024-08-30|website=TechPowerUp|language=en}}</ref>
| rowspan="4" |64
|FSB
| rowspan="3" |175
|133
| rowspan="4" |2:4:2
| rowspan="3" |350
| rowspan="3" |350
| rowspan="3" |700
| rowspan="8" |0
|Up to 32 (system RAM)
|2.128<br />4.256
|DDR
|64<br />128
|rowspan="3"|0.700
|3
|-
! style="text-align:left" |GeForce2 MX200
|March 3, 2001
| rowspan="3" |{{nowrap|AGP 4x}}, PCI
| rowspan="2" |166
| rowspan="6" |32<br />64
|1.328
| rowspan="2" |SDR
|64
|1
|-
! style="text-align:left" |GeForce2 MX
|June 28, 2000
|2.656
|128
|4
|-
! style="text-align:left" |GeForce2 MX400
|March 3, 2001
| rowspan="3" |200
|166, 200 (SDR)<br />166 (DDR)
|400
|400
|800
|1.328<br />3.200<br />2.656
|SDR<br />DDR
|64/128 (SDR)<br />64 (DDR)
|0.800
|5
|-
! style="text-align:left" |GeForce2 GTS
|April 26, 2000
| rowspan="4" |NV15
| rowspan="4" |25<ref>{{Cite web|url=https://www.techpowerup.com/gpu-specs/nvidia-nv15.g339|title=NVIDIA NV15 GPU Specs {{!}} TechPowerUp GPU Database|access-date=30 August 2024}}</ref>
| rowspan="4" |88
| rowspan="4" |AGP 4x
|166
| rowspan="4" |4:8:4
| rowspan="2" |800
| rowspan="2" |800
| rowspan="2" |1,600
|5.312
|rowspan="4" |DDR
|rowspan="4" |128
|rowspan="2"|1.600
|6
|-
! style="text-align:left" |GeForce2 Pro
|December 5, 2000
| rowspan="2" |200
| rowspan="2" |6.4
|?
|-
! style="text-align:left" |GeForce2 Ti
|October 1, 2001
|TSMC<br />150 nm
|rowspan="2" |250
|rowspan="2" |1,000
|rowspan="2" |1,000
|rowspan="2" |2,000
|rowspan="2"|2.000
|?
|-
! style="text-align:left" |GeForce2 Ultra
|August 14, 2000
|TSMC<br />180 nm
|230
|64
|7.36
|?
|}
{{notelist}}
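The bandwidth column in the table above follows directly from memory clock, bus width, and memory type. The following is a minimal sketch of that calculation; the function name and structure are ours, not from any Nvidia source:

```python
# Memory bandwidth in GB/s: bytes per transfer (bus width / 8) times the
# effective transfer rate (memory clock, doubled for DDR), divided by 1000.
# Illustrative helper only.

def memory_bandwidth_gb_s(mem_clock_mhz, bus_width_bits, ddr=False):
    transfers_per_clock = 2 if ddr else 1
    bytes_per_transfer = bus_width_bits // 8
    return mem_clock_mhz * transfers_per_clock * bytes_per_transfer / 1000

print(memory_bandwidth_gb_s(166, 128, ddr=True))  # GeForce2 GTS: 5.312
print(memory_bandwidth_gb_s(230, 128, ddr=True))  # GeForce2 Ultra: 7.36
print(memory_bandwidth_gb_s(166, 64))             # GeForce2 MX200 (SDR): 1.328
```

The same formula reproduces the other rows, e.g. the 64-bit SDR MX200's 1.328 GB/s versus the 128-bit DDR GTS's 5.312 GB/s, which is why the MX cards are described as having reduced available memory bandwidth.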
===GeForce2 Go mobile GPU series===

* Mobile GPUs are either soldered to the mainboard or to some [[Mobile PCI Express Module]] (MXM).
* All models are manufactured on a 180 nm process

{{Row hover highlight}}
{| class="mw-datatable wikitable sortable" style="font-size:85%; text-align:center;"
|-
!rowspan=2|Model
!rowspan=2|Launch
!rowspan=2 {{Vert header|[[Code name]]}}
!rowspan=2 {{Vert header|[[Computer bus|Bus]] [[I/O interface|interface]]}}
!rowspan=2 {{Vert header|Core clock ([[Hertz|MHz]])}}
!rowspan=2 {{Vert header|Memory clock ([[Hertz|MHz]])}}
!rowspan=2 {{Vert header|Core config{{efn|name=GF2GCoreConfig}}}}
!colspan=4|[[Fillrate]]
!colspan=4|Memory
|-
!{{Vert header|MOperations/s}}
!{{Vert header|MPixels/s}}
!{{Vert header|MTexels/s}}
!{{Vert header|MVertices/s}}
!{{Vert header|Size ([[Megabyte|MB]])}}
!{{Vert header|Bandwidth ([[Gigabyte|GB]]/s)}}
!{{Vert header|Bus type}}
!{{Vert header|Bus width ([[bit]])}}
|-
!style="text-align:left"|GeForce2 Go 100
|February 6, 2001
| rowspan="3" |NV11M
| rowspan="3" |AGP 4x
|125
|332
| rowspan="3" |2:0:4:2
|250
|250
|500
| rowspan="3" |0
|8, 16
|1.328
|DDR
|32
|-
!style="text-align:left"|GeForce2 Go
|November 11, 2000
| rowspan="2" |143
|166<br />332
| rowspan="2" |286
| rowspan="2" |286
| rowspan="2" |572
| rowspan="2" |16, 32
| rowspan="2" |2.656
|SDR<br />DDR
|128<br />64
|-
!style="text-align:left"|GeForce2 Go 200
|February 6, 2001
|332
|DDR
|64
|}
{{notelist|refs=
{{efn|name=GF2GCoreConfig|[[Pixel shader]]s: [[vertex shader]]s: [[texture mapping unit]]s: [[render output unit]]s}}
}}
== Discontinued support ==

[[File:NVIDIA GeForce2 Ultra.jpg|thumb|Nvidia GeForce2 Ultra]]

Nvidia has ceased driver support for the GeForce 2 series, ending with the GTS, Pro, Ti and Ultra models in 2005, and with the MX models in 2007.

===Final drivers===

'''GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ti and GeForce 2 Ultra:'''

* Windows 9x & Windows Me: 71.84 released on March 11, 2005; [https://www.nvidia.com/Download/driverResults.aspx/12/en-us/ Download]; [https://web.archive.org/web/20190723034006/http://www.nvidia.com/object/71.84_9x_supported.html Product Support List Windows 95/98/Me – 71.84].
* Windows 2000 & 32-bit Windows XP: 71.89 released on April 14, 2005; [http://www.nvidia.com/object/winxp_2k_71.89.html Download]; [https://web.archive.org/web/20190722160740/http://www.nvidia.com/object/71.84_geforcetnt2quadro_supported.html Product Support List Windows XP/2000 - 71.84].
* Linux 32-bit: 71.86.15 released on August 17, 2011; [https://www.nvidia.com/Download/driverResults.aspx/35626/en-us/ Download]

'''GeForce 2 MX & MX x00 Series:'''

* Windows 9x & Windows Me: 81.98 released on December 21, 2005; [http://www.nvidia.com/object/win9x_81.98.html Download]; [https://web.archive.org/web/20181016125108/http://www.nvidia.com/object/81.98_9x_supported.html Product Support List Windows 95/98/Me – 81.98].
** Driver version 81.98 for Windows 9x/Me was the last driver version Nvidia ever released for these systems; no official releases were made for them afterwards.
* Windows 2000, 32-bit Windows XP & Media Center Edition: 93.71 released on November 2, 2006; [https://www.nvidia.com/download/driverResults.aspx/5/ Download]. (Products supported list also on this page)
** For Windows 2000, 32-bit Windows XP & Media Center Edition, beta driver 93.81 (released on November 28, 2006) is also available; [http://www.geforce.com/Drivers/Results/585 ForceWare Release 90 Version 93.81 - BETA].
* Linux 32-bit: 96.43.23 released on September 14, 2012; [https://www.nvidia.com/Download/driverResults.aspx/48996/en-us/ Download]

The drivers for Windows 2000/XP may be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the [[Windows Aero|Aero]] effects of these operating systems.

[http://www.nvidia.com/object/win9x_archive.html Windows 95/98/Me Driver Archive]<br/>
[http://www.nvidia.com/object/winxp-2k_archive.html Windows XP/2000 Driver Archive]

<!-- Corrected most links to the latest available drivers for all the old GeForce2 series. Also added links for separate GeForce2 and GeForce2 MX series. Please do not delete or change any of the links. All have been double checked and all are correct. Mr-Encyclopedia-Man, May 11, 2009 -->
== Competing chipsets ==

*[[3dfx]] [[Voodoo 5]]
*[[ATI Technologies|ATI]] [[Radeon R100|Radeon]]
*[[PowerVR#Series3 (STMicro)|PowerVR Series 3]] (Kyro)

== See also ==

*[[Graphics card]]
*[[Graphics processing unit]]

== References ==

{{Reflist}}

== External links ==

{{Commons category|Nvidia GeForce 2 series video cards|GeForce 2 series}}
* [https://web.archive.org/web/20180808195422/http://www.nvidia.com/page/geforce2.html Nvidia: GeForce2 leading-edge technology]
* [https://web.archive.org/web/20180706023054/http://www.nvidia.com/page/geforce2go.html Nvidia: GeForce2 Go]
* [https://web.archive.org/web/20190808195333/http://www.nvidia.com/object/win9x_71.84.html ForceWare 71.84 drivers, Final Windows 9x/ME driver release for GeForce 2 GTS/Pro/Ti/Ultra]
* [http://www.nvidia.com/object/win9x_81.98.html ForceWare 81.98 drivers, Final Windows 9x/ME driver release for GeForce 2 MX series]
* [http://www.nvidia.com/object/winxp_2k_71.89.html ForceWare 71.89 drivers, Final Windows XP driver release for GeForce 2 GTS/Pro/Ti/Ultra]
* [https://web.archive.org/web/20200704183348/http://www.nvidia.com/object/winxp_2k_94.24_2.html ForceWare 94.24 drivers, Final Windows XP driver release for GeForce 2 MX series]
* [http://www.tomshardware.com/reviews/vga-charts-i,453.html Tom's Hardware VGA Charts (w/ GF2)]
* [http://www.techpowerup.com/gpudb techPowerUp! GPU Database]

{{Nvidia}}

{{DEFAULTSORT:Geforce 2 series}}
[[Category:Computer-related introductions in 2000]]
[[Category:GeForce series|2 series]]
[[Category:Graphics cards]]
Latest revision as of 09:48, 31 August 2024
Release date | mid-May, 2000[1] |
---|---|
Codename | NV11, NV15, NV16 |
Architecture | Celsius |
Models |
|
Cards | |
Entry-level | MX |
Mid-range | GTS, Pro |
High-end | Ti, Ultra |
API support | |
DirectX | Direct3D 7.0 |
OpenGL | OpenGL 1.2 (T&L) |
History | |
Predecessor | GeForce 256 |
Successor | GeForce 3 series |
Support status | |
Unsupported |
The GeForce 2 series (NV15) is the second generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in 2000, it is the successor to the GeForce 256.
The GeForce 2 family comprised a number of models: GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ultra, GeForce 2 Ti, GeForce 2 Go and the GeForce 2 MX series. In addition, the GeForce 2 architecture is used for the Quadro series on the Quadro 2 Pro, 2 MXR, and 2 EX cards with special drivers meant to accelerate computer-aided design applications.
Architecture
[edit]The GeForce 2 architecture is similar to the previous GeForce 256 line but with various improvements. Compared to the 220 nm GeForce 256, the GeForce 2 is built on a 180 nm manufacturing process, making the silicon more dense and allowing for more transistors and a higher clock speed. The most significant change for 3D acceleration is the addition of a second texture mapping unit to each of the four pixel pipelines. Some say[who?] the second TMU was there in the original Geforce NSR (Nvidia Shading Rasterizer) but dual-texturing was disabled due to a hardware bug; NSR's unique ability to do single-cycle trilinear texture filtering supports this suggestion. This doubles the texture fillrate per clock compared to the previous generation and is the reasoning behind the GeForce 2 GTS's naming suffix: GigaTexel Shader (GTS). The GeForce 2 also formally introduces the NSR (Nvidia Shading Rasterizer), a primitive type of programmable pixel pipeline that is somewhat similar to later pixel shaders. This functionality is also present in GeForce 256 but was unpublicized. Another hardware enhancement is an upgraded video processing pipeline, called HDVP (high definition video processor). HDVP supports motion video playback at HDTV-resolutions (MP@HL).[2]
In 3D benchmarks and gaming applications, the GeForce 2 GTS outperforms its predecessor by up to 40%.[3] In OpenGL games (such as Quake III), the card outperforms the ATI Radeon DDR and 3dfx Voodoo 5 5500 cards in both 16 bpp and 32 bpp display modes. However, in Direct3D games running 32 bpp, the Radeon DDR is sometimes able to take the lead.[4]
The GeForce 2 architecture is quite memory bandwidth constrained.[5] The GPU wastes memory bandwidth and pixel fillrate due to unoptimized z-buffer usage, drawing of hidden surfaces, and a relatively inefficient RAM controller. The main competition for GeForce 2, the ATI Radeon DDR, has hardware functions (called HyperZ) that address these issues.[6] Because of the inefficient nature of the GeForce 2 GPUs, they could not approach their theoretical performance potential and the Radeon, even with its significantly less powerful 3D architecture, offered strong competition. The later NV17 revision of the NV11 design used in the GeForce4 MX was more efficient.
Releases
[edit]The first models to arrive after the original GeForce 2 GTS was the GeForce 2 Ultra and GeForce2 MX, launched on September 7, 2000.[7] On September 29, 2000 Nvidia started shipping graphics cards which had 16 and 32 MB of video memory size.
Architecturally identical to the GTS, the Ultra simply has higher core and memory clock rates. The Ultra model actually outperforms the first GeForce 3 products in some cases, due to initial GeForce 3 cards having significantly lower fillrate. However, the Ultra loses its lead when anti-aliasing is enabled, because of the GeForce 3's new memory bandwidth/fillrate efficiency mechanisms; plus the GeForce 3 has a superior next-generation feature set with programmable vertex and pixel shaders for DirectX 8.0 games.
The GeForce 2 Pro, introduced shortly after the Ultra, was an alternative to the expensive top-line Ultra and is faster than the GTS.
In October 2001, the GeForce 2 Ti was positioned as a cheaper and less advanced alternative to the GeForce 3. Faster than the GTS and Pro but slower than the Ultra, the GeForce 2 Ti performed competitively against the Radeon 7500, although the 7500 had the advantage of dual-display support. This mid-range GeForce 2 release was replaced by the GeForce4 MX series as the budget/performance choice in January 2002.
On their 2001 product web page, Nvidia initially placed the Ultra as a separate offering from the rest of the GeForce 2 lineup (GTS, Pro, Ti), however by late 2002 with the GeForce 2 considered a discontinued product line, the Ultra was included along the GTS, Pro, and Ti in the GeForce 2 information page.
GeForce 2 MX
[edit]Since the previous GeForce 256 line shipped without a budget variant, the RIVA TNT2 series was left to fill the "low-end" role—albeit with a comparably obsolete feature set. In order to create a better low-end option, Nvidia created the GeForce 2 MX series, which offered a set of standard features, specific to the entire GeForce 2 generation, limited only by categorical tier. The GeForce 2 MX cards had two 3D pixel pipelines removed and a reduced available memory bandwidth. The cards utilized either SDR SDRAM or DDR SDRAM with memory bus widths ranging from 32 to 128 bits, allowing circuit board cost to be varied. The MX series also provided dual-display support, something not found in the regular GeForce 256 and GeForce 2.
The prime competitors to the GeForce 2 MX series were ATI's Radeon VE / 7000 and Radeon SDR (which with the other R100's was later renamed as part of the 7200 series). The Radeon VE had the advantage of somewhat better dual-monitor display software, but it did not offer hardware T&L, an emerging 3D rendering feature of the day that was the major attraction of Direct3D 7. Further, the Radeon VE featured only a single rendering pipeline, causing it to produce a substantially lower fillrate than the GeForce 2 MX. The Radeon SDR, equipped with SDR SDRAM instead of DDR SDRAM found in more expensive brethren, was released some time later, and exhibited faster 32-bit 3D rendering than the GeForce 2 MX.[8] However, the Radeon SDR lacked multi-monitor support and debuted at a considerable higher price point than the GeForce 2 MX. 3dfx's Voodoo4 4500 arrived too late, as well as being too expensive, but too slow to compete with the GeForce 2 MX.
Members of the series include GeForce 2 MX, MX400, MX200, and MX100. The GPU was also used as an integrated graphics processor in the nForce chipset line and as a mobile graphics chip for notebooks called GeForce 2 Go.
Successor
The successor to the GeForce 2 (non-MX) line is the GeForce 3. The non-MX GeForce 2 line was reduced in price and saw the addition of the GeForce 2 Ti, in order to offer a mid-range alternative to the high-end GeForce 3 product.
Later, the entire GeForce 2 line was replaced with the GeForce4 MX.
Models
- All models support TwinView Dual-Display Architecture and Second Generation Transform and Lighting (T&L)
- GeForce2 MX models support Digital Vibrance Control (DVC)
Model | Launch | Code name | Fab | Transistors (million) | Die size (mm²) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOperations/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | Fillrate (MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) | TDP (Watts)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
GeForce2 MX IGP + nForce 220/420 | June 4, 2001 | NV1A (IGP) / NV11 (MX) | TSMC 180 nm | 20[10] | 64 | FSB | 175 | 133 | 2:4:2 | 350 | 350 | 700 | 0.700 | 0 (up to 32 shared system RAM) | 2.128–4.256 | DDR | 64, 128 | 3
GeForce2 MX200 | March 3, 2001 | NV11 | TSMC 180 nm | 20[10] | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 | 350 | 700 | 0.700 | 32, 64 | 1.328 | SDR | 64 | 1
GeForce2 MX | June 28, 2000 | NV11 | TSMC 180 nm | 20[10] | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 | 350 | 700 | 0.700 | 32, 64 | 2.656 | SDR | 128 | 4
GeForce2 MX400 | March 3, 2001 | NV11 | TSMC 180 nm | 20[10] | 64 | AGP 4x, PCI | 200 | 166/200 (SDR), 166 (DDR) | 2:4:2 | 400 | 400 | 800 | 0.800 | 32, 64 | 1.328–3.200 (SDR), 2.656 (DDR) | SDR, DDR | 64/128 (SDR), 64 (DDR) | 5
GeForce2 GTS | April 26, 2000 | NV15 | TSMC 180 nm | 25[11] | 88 | AGP 4x | 200 | 166 | 4:8:4 | 800 | 800 | 1,600 | 1.600 | 32, 64 | 5.312 | DDR | 128 | 6
GeForce2 Pro | December 5, 2000 | NV15 | TSMC 180 nm | 25[11] | 88 | AGP 4x | 200 | 200 | 4:8:4 | 800 | 800 | 1,600 | 1.600 | 32, 64 | 6.4 | DDR | 128 | ?
GeForce2 Ti | October 1, 2001 | NV15 | TSMC 150 nm | 25[11] | 88 | AGP 4x | 250 | 200 | 4:8:4 | 1,000 | 1,000 | 2,000 | 2.000 | 32, 64 | 6.4 | DDR | 128 | ?
GeForce2 Ultra | August 14, 2000 | NV15 | TSMC 180 nm | 25[11] | 88 | AGP 4x | 250 | 230 | 4:8:4 | 1,000 | 1,000 | 2,000 | 2.000 | 64 | 7.36 | DDR | 128 | ?
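The fillrate and bandwidth columns above follow directly from the clock and configuration columns: pixel fillrate is core clock times the number of pixel pipelines, texel fillrate doubles that because each pipeline carries two texture units, and memory bandwidth is the effective memory clock times the bus width in bytes. A minimal sketch of this arithmetic (Python, using the GeForce2 GTS row as the example; the function names are illustrative, not from any source):

```python
# Derive fillrate and bandwidth figures for a fixed-function GPU from its specs.
# Example: GeForce2 GTS - 200 MHz core, 4 pixel pipelines with 2 TMUs each,
# 166 MHz DDR memory on a 128-bit bus.

def fillrates(core_mhz, pipelines, tmus_per_pipe):
    """Return (MPixels/s, MTexels/s): clock x pipelines, then x TMUs per pipeline."""
    mpixels = core_mhz * pipelines
    mtexels = mpixels * tmus_per_pipe
    return mpixels, mtexels

def bandwidth_gbs(mem_mhz, bus_bits, ddr=True):
    """Memory bandwidth in GB/s: clock x (2 for DDR) x bus width in bytes."""
    return mem_mhz * (2 if ddr else 1) * (bus_bits // 8) / 1000

mpix, mtex = fillrates(200, 4, 2)
bw = bandwidth_gbs(166, 128, ddr=True)
print(mpix, mtex, bw)  # 800 MPixels/s, 1600 MTexels/s, ~5.312 GB/s
```

The same arithmetic reproduces the MX rows (175 MHz with 2 pipelines gives 350/700, and 166 MHz SDR on a 128-bit bus gives 2.656 GB/s).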
GeForce2 Go mobile GPU series
- Mobile GPUs are either soldered to the mainboard or mounted on a Mobile PCI Express Module (MXM).
- All models are built on a 180 nm manufacturing process
Model | Launch | Code name | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOperations/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | Fillrate (MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
GeForce2 Go 100 | February 6, 2001 | NV11M | AGP 4x | 125 | 332 | 2:0:4:2 | 250 | 250 | 500 | 0 | 8, 16 | 1.328 | DDR | 32
GeForce2 Go | November 11, 2000 | NV11M | AGP 4x | 143 | 166 (SDR), 332 (DDR) | 2:0:4:2 | 286 | 286 | 572 | 0 | 16, 32 | 2.656 | SDR, DDR | 128 (SDR), 64 (DDR)
GeForce2 Go 200 | February 6, 2001 | NV11M | AGP 4x | 143 | 332 | 2:0:4:2 | 286 | 286 | 572 | 0 | 16, 32 | 2.656 | DDR | 64
Discontinued support
Nvidia has ceased driver support for the GeForce 2 series, ending with the GTS, Pro, Ti and Ultra models in 2005 and then with the MX models in 2007.
Final drivers
GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ti and GeForce 2 Ultra:
- Windows 9x & Windows Me: 71.84 released on March 11, 2005; Download; Product Support List Windows 95/98/Me – 71.84.
- Windows 2000 & 32-bit Windows XP: 71.89 released on April 14, 2005; Download; Product Support List Windows XP/2000 – 71.89.
- Linux 32-bit: 71.86.15 released on August 17, 2011; Download
GeForce 2 MX & MX x00 Series:
- Windows 9x & Windows Me: 81.98 released on December 21, 2005; Download; Product Support List Windows 95/98/Me – 81.98.
- Driver version 81.98 was the last driver Nvidia ever released for Windows 9x/Me; no official releases followed for these operating systems.
- Windows 2000, 32-bit Windows XP & Media Center Edition: 93.71 released on November 2, 2006; Download. (Products supported list also on this page)
- For Windows 2000, 32-bit Windows XP & Media Center Edition also available beta driver 93.81 released on November 28, 2006; ForceWare Release 90 Version 93.81 - BETA.
- Linux 32-bit: 96.43.23 released on September 14, 2012; Download
The drivers for Windows 2000/XP may be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.
Windows 95/98/Me Driver Archive
Windows XP/2000 Driver Archive
Competing chipsets
- 3dfx Voodoo 5
- ATI Radeon
- PowerVR Series 3 (Kyro)
References
[edit]- ^ Ross, Alex (April 26, 2000). "NVIDIA GeForce2 GTS Guide". SharkyExtreme. Archived from the original on August 23, 2004.
- ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. p. 2. Retrieved July 2, 2009.
- ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. Retrieved June 14, 2008.
- ^ Witheiler, Matthew (July 17, 2000). "ATI Radeon 64MB DDR". Anandtech. Retrieved June 14, 2008.
- ^ Lal Shimpi, Anand (August 14, 2000). "NVIDIA GeForce 2 Ultra". Anandtech. Retrieved June 14, 2008.
- ^ Lal Shimpi, Anand (April 25, 2000). "ATI Radeon 256 Preview (HyperZ)". Anandtech. p. 5. Retrieved June 14, 2008.
- ^ "Press Release-NVIDIA". www.nvidia.com. Retrieved April 22, 2018.
- ^ FastSite (December 27, 2000). "ATI RADEON 32MB SDR Review". X-bit labs. Archived from the original on July 25, 2008. Retrieved June 14, 2008.
- ^ "3D accelerator database". Vintage 3D. Archived from the original on October 23, 2018. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce2 MX PCI Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA NV15 GPU Specs | TechPowerUp GPU Database". Retrieved August 30, 2024.
External links
- Nvidia: GeForce2 leading-edge technology
- Nvidia: GeForce2 Go
- ForceWare 71.84 drivers, Final Windows 9x/ME driver release for GeForce 2 GTS/Pro/Ti/Ultra
- ForceWare 81.98 drivers, Final Windows 9x/ME driver release for GeForce 2 MX series
- ForceWare 71.89 drivers, Final Windows XP driver release for GeForce 2 GTS/Pro/Ti/Ultra
- ForceWare 94.24 drivers, Final Windows XP driver release for GeForce 2 MX series
- Tom's Hardware VGA Charts (w/ GF2)
- techPowerUp! GPU Database