The GeForce 2 series (NV15) is the second generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in 2000, it is the successor to the GeForce 256.
| | |
|---|---|
| Release date | mid-May 2000[1] |
| Codename | NV11, NV15, NV16 |
| Architecture | Celsius |
| Cards | |
| Entry-level | MX |
| Mid-range | GTS, Pro |
| High-end | Ti, Ultra |
| API support | |
| DirectX | Direct3D 7.0 |
| OpenGL | OpenGL 1.2 (T&L) |
| History | |
| Predecessor | GeForce 256 |
| Successor | GeForce 3 series |
| Support status | Unsupported |
The GeForce 2 family comprised a number of models: GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ultra, GeForce 2 Ti, GeForce 2 Go and the GeForce 2 MX series. In addition, the GeForce 2 architecture is used for the Quadro series on the Quadro 2 Pro, 2 MXR, and 2 EX cards, with special drivers meant to accelerate computer-aided design applications.
Architecture
The GeForce 2 architecture is similar to the previous GeForce 256 line but with various improvements. Compared to the 220 nm GeForce 256, the GeForce 2 is built on a 180 nm manufacturing process, making the silicon denser and allowing for more transistors and a higher clock speed. The most significant change for 3D acceleration is the addition of a second texture mapping unit (TMU) to each of the four pixel pipelines. Some say[who?] the second TMU was present in the original GeForce's NSR (Nvidia Shading Rasterizer) but that dual-texturing was disabled due to a hardware bug; NSR's unique ability to do single-cycle trilinear texture filtering supports this suggestion. The second TMU doubles the texture fillrate per clock compared to the previous generation and is the reasoning behind the GeForce 2 GTS's naming suffix: GigaTexel Shader (GTS). The GeForce 2 also formally introduces the NSR, a primitive type of programmable pixel pipeline somewhat similar to later pixel shaders; this functionality is also present in the GeForce 256 but was unpublicized. Another hardware enhancement is an upgraded video processing pipeline, called HDVP (high definition video processor). HDVP supports motion video playback at HDTV resolutions (MP@HL).[2]
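The fillrate arithmetic behind the "GigaTexel" name can be sketched with a short calculation. The helper function below is illustrative, not an Nvidia formula; the GeForce 2 GTS figures (4 pipelines, 2 TMUs each, 200 MHz) match the models table in this article, and the GeForce 256 clock of 120 MHz is an assumption from its published specifications:

```python
def fillrate(core_clock_mhz, pixel_pipelines, tmus_per_pipeline):
    """Theoretical fillrates: each pipeline emits one pixel per clock,
    and each TMU samples one texel per clock."""
    mpixels = core_clock_mhz * pixel_pipelines
    mtexels = mpixels * tmus_per_pipeline
    return mpixels, mtexels

# GeForce 256: 4 pipelines, 1 TMU each, 120 MHz core
print(fillrate(120, 4, 1))  # (480, 480)

# GeForce 2 GTS: 4 pipelines, 2 TMUs each, 200 MHz core
# -> 1,600 MTexels/s, i.e. the "GigaTexel" in GigaTexel Shader
print(fillrate(200, 4, 2))  # (800, 1600)
```

Doubling the TMUs doubles only the texel rate, not the pixel rate, which is why the GTS advertises 1,600 MTexels/s but 800 MPixels/s.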
In 3D benchmarks and gaming applications, the GeForce 2 GTS outperforms its predecessor by up to 40%.[3] In OpenGL games (such as Quake III), the card outperforms the ATI Radeon DDR and 3dfx Voodoo 5 5500 cards in both 16 bpp and 32 bpp display modes. However, in Direct3D games running at 32 bpp, the Radeon DDR is sometimes able to take the lead.[4]
The GeForce 2 architecture is quite memory-bandwidth constrained.[5] The GPU wastes memory bandwidth and pixel fillrate due to unoptimized z-buffer usage, drawing of hidden surfaces, and a relatively inefficient RAM controller. The main competition for the GeForce 2, the ATI Radeon DDR, has hardware functions (called HyperZ) that address these issues.[6] Because of this inefficiency, the GeForce 2 GPUs could not approach their theoretical performance potential, and the Radeon, even with its significantly less powerful 3D architecture, offered strong competition. The later NV17 revision of the NV11 design, used in the GeForce4 MX, was more efficient.
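A rough sketch shows why bandwidth is the bottleneck: theoretical memory bandwidth is just memory clock × transfers per clock × bytes per transfer. The helper below is illustrative, with clock and bus figures taken from the models table in this article:

```python
def mem_bandwidth_gb_s(mem_clock_mhz, bus_width_bits, transfers_per_clock=2):
    """Theoretical bandwidth in GB/s (DDR performs 2 transfers per clock)."""
    bytes_per_transfer = bus_width_bits // 8
    return mem_clock_mhz * transfers_per_clock * bytes_per_transfer / 1000

# GeForce 2 GTS: 166 MHz DDR on a 128-bit bus
print(mem_bandwidth_gb_s(166, 128))  # 5.312

# GeForce 2 Ultra: 230 MHz DDR on a 128-bit bus
print(mem_bandwidth_gb_s(230, 128))  # 7.36
```

For scale, the GTS writing 32-bit pixels at its 800 MPixels/s peak would alone consume about 3.2 GB/s before any texture reads or z-buffer traffic, so the 5.312 GB/s bus saturates well before the fillrate peak is reached.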
Releases
The first models to arrive after the original GeForce 2 GTS were the GeForce 2 Ultra and GeForce 2 MX, launched on September 7, 2000.[7] On September 29, 2000, Nvidia started shipping graphics cards with 16 and 32 MB of video memory.
Architecturally identical to the GTS, the Ultra simply has higher core and memory clock rates. The Ultra model actually outperforms the first GeForce 3 products in some cases, because initial GeForce 3 cards had significantly lower fillrate. However, the Ultra loses its lead when anti-aliasing is enabled, because of the GeForce 3's new memory bandwidth and fillrate efficiency mechanisms; in addition, the GeForce 3 has a superior next-generation feature set, with programmable vertex and pixel shaders for DirectX 8.0 games.
The GeForce 2 Pro, introduced shortly after the Ultra, was an alternative to the expensive top-line Ultra and is faster than the GTS.
In October 2001, the GeForce 2 Ti was positioned as a cheaper and less advanced alternative to the GeForce 3. Faster than the GTS and Pro but slower than the Ultra, the GeForce 2 Ti performed competitively against the Radeon 7500, although the 7500 had the advantage of dual-display support. This mid-range GeForce 2 release was replaced by the GeForce4 MX series as the budget/performance choice in January 2002.
On their 2001 product web page, Nvidia initially placed the Ultra as a separate offering from the rest of the GeForce 2 lineup (GTS, Pro, Ti). However, by late 2002, with the GeForce 2 considered a discontinued product line, the Ultra was listed alongside the GTS, Pro, and Ti on the GeForce 2 information page.
GeForce 2 MX
Since the previous GeForce 256 line shipped without a budget variant, the RIVA TNT2 series was left to fill the low-end role, albeit with a comparatively obsolete feature set. In order to create a better low-end option, Nvidia created the GeForce 2 MX series, which offered the standard feature set of the GeForce 2 generation in a cut-down configuration. The GeForce 2 MX cards had two of the four 3D pixel pipelines removed and reduced memory bandwidth. The cards utilized either SDR SDRAM or DDR SDRAM with memory bus widths ranging from 32 to 128 bits, allowing circuit board cost to be varied. The MX series also provided dual-display support, something not found in the regular GeForce 256 and GeForce 2.
The prime competitors to the GeForce 2 MX series were ATI's Radeon VE / 7000 and Radeon SDR (which, along with the other R100 cards, was later renamed as part of the 7200 series). The Radeon VE had the advantage of somewhat better dual-monitor display software, but it did not offer hardware T&L, an emerging 3D rendering feature of the day that was the major attraction of Direct3D 7. Further, the Radeon VE featured only a single rendering pipeline, causing it to produce a substantially lower fillrate than the GeForce 2 MX. The Radeon SDR, equipped with SDR SDRAM instead of the DDR SDRAM found in its more expensive brethren, was released some time later and exhibited faster 32-bit 3D rendering than the GeForce 2 MX.[8] However, the Radeon SDR lacked multi-monitor support and debuted at a considerably higher price point than the GeForce 2 MX. 3dfx's Voodoo4 4500 arrived too late and was both too expensive and too slow to compete with the GeForce 2 MX.
Members of the series include the GeForce 2 MX, MX400, MX200, and MX100. The GPU was also used as an integrated graphics processor in the nForce chipset line and as a mobile graphics chip for notebooks, called the GeForce 2 Go.
Successor
The successor to the GeForce 2 (non-MX) line is the GeForce 3. The non-MX GeForce 2 line was reduced in price, and the GeForce 2 Ti was added, in order to offer a mid-range alternative to the high-end GeForce 3 product.
Later, the entire GeForce 2 line was replaced with the GeForce4 MX.
Models
- All models support TwinView Dual-Display Architecture and Second Generation Transform and Lighting (T&L)
- GeForce2 MX models support Digital Vibrance Control (DVC)
| Model | Launch | Code name | Fab | Transistors (million) | Die size (mm²) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOps/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | Fillrate (MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) | TDP (W) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce2 MX IGP + nForce 220/420 | June 4, 2001 | NV1A (IGP) / NV11 (MX) | TSMC 180 nm | 20[10] | 64 | FSB | 175 | 133 | 2:4:2 | 350 | 350 | 700 | 0 | Up to 32 (system RAM) | 2.128 / 4.256 | DDR | 64 / 128 | 3 |
| GeForce2 MX200 | March 3, 2001 | NV11 | TSMC 180 nm | 20 | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 | 350 | 700 | 0 | 32, 64 | 1.328 | SDR | 64 | 1 |
| GeForce2 MX | June 28, 2000 | NV11 | TSMC 180 nm | 20 | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 | 350 | 700 | 0 | 32, 64 | 2.656 | SDR | 128 | 4 |
| GeForce2 MX400 | March 3, 2001 | NV11 | TSMC 180 nm | 20 | 64 | AGP 4x, PCI | 200 | 166 or 200 (SDR); 166 (DDR) | 2:4:2 | 400 | 400 | 800 | 0 | 32, 64 | 1.328 / 3.200 (SDR); 2.656 (DDR) | SDR, DDR | 64 / 128 (SDR); 64 (DDR) | 5 |
| GeForce2 GTS | April 26, 2000 | NV15 | TSMC 180 nm | 25[11] | 88 | AGP 4x | 200 | 166 | 4:8:4 | 800 | 800 | 1,600 | 0 | 32, 64 | 5.312 | DDR | 128 | 6 |
| GeForce2 Pro | December 5, 2000 | NV15 | TSMC 180 nm | 25 | 88 | AGP 4x | 200 | 200 | 4:8:4 | 800 | 800 | 1,600 | 0 | 32, 64 | 6.4 | DDR | 128 | ? |
| GeForce2 Ti | October 1, 2001 | NV15 | TSMC 150 nm | 25 | 88 | AGP 4x | 250 | 200 | 4:8:4 | 1,000 | 1,000 | 2,000 | 0 | 32, 64 | 6.4 | DDR | 128 | ? |
| GeForce2 Ultra | August 14, 2000 | NV15 | TSMC 180 nm | 25 | 88 | AGP 4x | 250 | 230 | 4:8:4 | 1,000 | 1,000 | 2,000 | 0 | 64 | 7.36 | DDR | 128 | ? |
GeForce2 Go mobile GPU series
- Mobile GPUs are either soldered to the mainboard or mounted on a Mobile PCI Express Module (MXM).
- All models are manufactured on a 180 nm process
| Model | Launch | Code name | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOps/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | Fillrate (MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce2 Go 100 | February 6, 2001 | NV11M | AGP 4x | 125 | 332 | 2:0:4:2 | 250 | 250 | 500 | 0 | 8, 16 | 1.328 | DDR | 32 |
| GeForce2 Go | November 11, 2000 | NV11M | AGP 4x | 143 | 166 (SDR); 332 (DDR) | 2:0:4:2 | 286 | 286 | 572 | 0 | 16, 32 | 2.656 | SDR, DDR | 128 (SDR); 64 (DDR) |
| GeForce2 Go 200 | February 6, 2001 | NV11M | AGP 4x | 143 | 332 | 2:0:4:2 | 286 | 286 | 572 | 0 | 16, 32 | 2.656 | DDR | 64 |
Discontinued support
Nvidia has ceased driver support for the GeForce 2 series, ending with the GTS, Pro, Ti, and Ultra models in 2005 and with the MX models in 2007.
Final drivers
GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ti and GeForce 2 Ultra:
- Windows 9x & Windows Me: 71.84, released on March 11, 2005; Download; Product Support List Windows 95/98/Me – 71.84.
- Windows 2000 & 32-bit Windows XP: 71.89, released on April 14, 2005; Download; Product Support List Windows XP/2000 – 71.84.
- Linux 32-bit: 71.86.15, released on August 17, 2011; Download
GeForce 2 MX & MX x00 Series:
- Windows 9x & Windows Me: 81.98, released on December 21, 2005; Download; Product Support List Windows 95/98/Me – 81.98.
- Driver version 81.98 for Windows 9x/Me was the last driver version Nvidia ever released for these operating systems.
- Windows 2000, 32-bit Windows XP & Media Center Edition: 93.71, released on November 2, 2006; Download (products supported list also on this page).
- For Windows 2000, 32-bit Windows XP & Media Center Edition, beta driver 93.81, released on November 28, 2006, is also available; ForceWare Release 90 Version 93.81 – BETA.
- Linux 32-bit: 96.43.23, released on September 14, 2012; Download
The drivers for Windows 2000/XP may be installed on later versions of Windows, such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.
Windows 95/98/Me Driver Archive
Windows XP/2000 Driver Archive
editReferences
- ^ Ross, Alex (April 26, 2000). "NVIDIA GeForce2 GTS Guide". SharkyExtreme. Archived from the original on August 23, 2004.
- ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. p. 2. Retrieved July 2, 2009.
- ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. Retrieved June 14, 2008.
- ^ Witheiler, Matthew (July 17, 2000). "ATI Radeon 64MB DDR". Anandtech. Retrieved June 14, 2008.
- ^ Lal Shimpi, Anand (August 14, 2000). "NVIDIA GeForce 2 Ultra". Anandtech. Retrieved June 14, 2008.
- ^ Lal Shimpi, Anand (April 25, 2000). "ATI Radeon 256 Preview (HyperZ)". Anandtech. p. 5. Retrieved June 14, 2008.
- ^ "Press Release". Nvidia. Retrieved April 22, 2018.
- ^ FastSite (December 27, 2000). "ATI RADEON 32MB SDR Review". X-bit labs. Archived from the original on July 25, 2008. Retrieved June 14, 2008.
- ^ "3D accelerator database". Vintage 3D. Archived from the original on October 23, 2018. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce2 MX PCI Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA NV15 GPU Specs". TechPowerUp GPU Database. Retrieved August 30, 2024.
External links
- Nvidia: GeForce2 leading-edge technology
- Nvidia: GeForce2 Go
- ForceWare 71.84 drivers, Final Windows 9x/ME driver release for GeForce 2 GTS/Pro/Ti/Ultra
- ForceWare 81.98 drivers, Final Windows 9x/ME driver release for GeForce 2 MX series
- ForceWare 71.89 drivers, Final Windows XP driver release for GeForce 2 GTS/Pro/Ti/Ultra
- ForceWare 94.24 drivers, Final Windows XP driver release for GeForce 2 MX series
- Tom's Hardware VGA Charts (w/ GF2)
- techPowerUp! GPU Database