Video cards have come a long way in a relatively short period, with the first units in the early ‘80s only capable of displaying eight colors. Since that time, improvements to manufacturing have shrunk process nodes from 200+ nanometers (nm) in the ‘90s down to just 28 nm today. From pre-VGA to 4K, we cover the most important video cards of the past thirty-five years.
1980s – The Early Days of Consumer Video Cards
The first consumer video cards appeared in the early ‘80s and primarily tackled 2D workloads rather than 3D. Resolutions were also quite low, with SVGA (800 × 600) not introduced until late in the decade.
IBM Monochrome Display Adapter
Resolution: 720 × 350 pixels
Said to be the first video card ever, it displayed 80 columns × 25 lines of text and symbols. Those in turn were contained in a 9 × 14 pixel box, so the display had a “resolution” of 720 × 350 pixels. It also contained a parallel printer port so that users did not have to add another card for printer connectivity.
Intel iSBX 275 Video Graphics Controller
Resolution: 256 × 256
The iSBX 275 featured 32 KB of memory and could display eight colors at 256 × 256 or monochrome at 512 × 512. At the time, the iSBX was a revolutionary card and only cost $1,000.
ATI VGA Wonder
Resolution: 800 × 600
ATI arrived on the video card scene in the mid-‘80s, working primarily with OEM manufacturers like IBM. It wasn’t until the late ‘80s that they produced their first consumer video cards, the EGA and VGA Wonder series. The latter was compatible with IBM PCs and retailed for $449 in 1987.
1990s – Video Cards and 3D Accelerators
In the early ’90s, competition in the 2D accelerator card market increased dramatically with newcomers such as NVIDIA and Matrox competing with old names such as ATI. Many companies left the market during this time as well, largely due to the fierce competition.
NVIDIA NV1
Resolution: 1200 × 600
NVIDIA’s first product was a combination 2D and 3D graphics accelerator that was unfortunately doomed by the release of DirectX 1.0 a few short months later. The NV1 utilized a 3D rendering process that was not supported by DirectX 1.0 and as a result, many games could not run on the card.
ATI Rage 1
Resolution: 1280 × 1024
ATI’s very first 3D accelerator, the Rage 1 was capable of handling 2D tasks as well. Like the NVIDIA NV1, it had some issues with DirectX compatibility. Overall, it was a decent 2D card but a poor 3D performer.
3Dfx Voodoo 1
Resolution: 640 × 480 (3D)
A 3D-only graphics accelerator, the Voodoo 1 required you to have a separate video card for 2D graphics. For 3D tasks, the Voodoo graphics accelerator greatly outperformed its competition and established 3Dfx’s dominance in the 3D accelerator market.
NVIDIA RIVA 128
Resolution: 960 × 720 (3D)
Short for Real-time Interactive Video and Animation, the RIVA 128 was one of the first AGP video cards and introduced many consumers to the NVIDIA name. It supported higher resolutions than its Voodoo 1 competition but poor driver support meant it couldn’t perform to the same standard.
3Dfx Voodoo 2
Resolution: 800 × 600 (3D)
3Dfx’s sophomore 3D accelerator supported a maximum resolution of 800 × 600 and cemented their top spot in the 3D market. As with the original Voodoo, the Voodoo 2 required you to have a separate 2D card. It also had a revolutionary feature called Scan-Line Interleave (SLI), which allowed two Voodoo 2 cards to be installed in a system to boost performance. NVIDIA would later purchase and reuse this technology.
NVIDIA RIVA TNT
Resolution: 960 × 720 (3D)
The RIVA TNT had many things going for it compared to the Voodoo 2: it supported 32-bit true color and 1024 × 1024 pixel textures, had a second pixel pipeline, and used faster video memory. Unfortunately, even though the RIVA TNT compared favorably against the Voodoo 2 hardware-wise, most games at the time supported 3Dfx’s Glide API.
3Dfx Voodoo 3
It wasn’t until the Voodoo 3 that 3Dfx integrated 2D into their flagship video card. Reviewers noted that it was a fast performer but unfortunately lacked important features found in its competitors, such as 32-bit color rendering.
NVIDIA RIVA TNT 2
Resolution: 2048 × 1536
The RIVA TNT 2 finally allowed NVIDIA to compete on a level playing field with 3Dfx. Voodoo 3 cards tended to push faster frame rates, but TNT 2 cards supported 32-bit color and displayed better images. It also helped that both OpenGL and DirectX APIs continued to gain developer support while Glide struggled.
NVIDIA GeForce 256 DDR
Resolution: 2048 × 1536
NVIDIA marketed the GeForce 256 card as the world’s first graphics processing unit (GPU) and it handily beat the 3Dfx competition. While NVIDIA released a single data rate (SDR) version of the GeForce 256 as well, it was the DDR version that wowed buyers.
2000s – 3Dfx Files for Bankruptcy and ATI Takes on NVIDIA for the Next Ten Years
Even more graphics chipset manufacturers left the business in the 2000s, including the biggest name at the time, 3Dfx. From now on, performance-minded users only had two choices in graphics card chipsets: NVIDIA and ATI.
GeForce 2 GTS
NVIDIA’s second-generation GeForce GPU featured a 200 MHz core clock and your choice of 32 MB or 64 MB of DDR memory. It took advantage of the AGP 4x Fast Writes interface and was the fastest card available upon its release. One nitpick, however, was its rather slow (166 MHz) DDR memory.
ATI Radeon DDR
At this time, ATI decided to rebrand their Rage lineup of video cards as Radeon and reenter the performance graphics card market. The Radeon DDR had a core clock of 200 MHz and a GeForce-beating memory clock of 183 MHz. Despite some driver issues, it compared favorably to the GeForce 2 GTS.
3Dfx Voodoo 4 / Voodoo 5
The Voodoo 4 and 5 were part of the same product lineup, with the Voodoo 4 the more budget-oriented option and the Voodoo 5 the high-end performance card. 3Dfx also planned to release a Voodoo 6, but unfortunately declared bankruptcy before that happened. They ended up selling their IP portfolio to NVIDIA.
NVIDIA GeForce 3
NVIDIA continued to hold its performance crown over ATI with the GeForce 3, which came in three versions: GeForce 3, GeForce 3 Ti 200, and GeForce 3 Ti 500. This generation of GeForce cards was the first to be DirectX 8.0 compliant and served as the basis for the Xbox’s NV2A GPU.
ATI Radeon 9700 Pro
With the Radeon 9700 Pro, ATI finally took the performance crown away from NVIDIA. It was a 0.15-micron chip with full DirectX 9 and AGP 8X support. It was also one of the first GPUs to support 64- and 128-bit color depths.
NVIDIA GeForce 6
The GeForce 5, or GeForce FX, series of video cards performed terribly compared to ATI’s Radeon cards, and it wasn’t until the sixth generation that NVIDIA had a competitive product. The GeForce 6 series took advantage of 3Dfx’s SLI technology (rebranded as Scalable Link Interface) and added PCI Express support later in its run.
NVIDIA GeForce 8
Originally produced on a 90 nm manufacturing process, the 8 series cards were top performers from the start. The line was refreshed mid-run with a die shrink to 65 nm and support for PCI Express 2.0. It was also around this time that AMD purchased ATI for $5.4 billion.
AMD Radeon HD 3000
Powered by GPUs based on the Radeon R600 architecture, HD 3000 cards were able to compete with and sometimes beat the NVIDIA competition. The Radeon HD 3870 X2 was particularly noteworthy as one of the first dual-GPU graphics cards: rather than two separate cards working in tandem, the X2 put two GPUs onto a single card.
AMD Radeon HD 4000
The 4000 series of Radeon cards continued to dominate the high-end market with dual-GPU cards such as the Radeon HD 4870 X2. The series also included some of the first cards to utilize GDDR5 memory and a 40 nm manufacturing process.
NVIDIA GeForce GTX TITAN
For the next few years, NVIDIA and AMD would trade wins for the fastest GPU. Then NVIDIA released the GTX TITAN, which cost $1,000 and outperformed everything else. The TITAN was based on the same Kepler architecture as the GeForce 600 series and packed 6 GB of GDDR5 memory. Remarkably, it was a single-GPU card.
2010s – NVIDIA and AMD Continue to Trade Blows
The current video card landscape sees NVIDIA and AMD continuing a game of one-upmanship with solid products from both sides. Since the release of the original TITAN, NVIDIA has for the most part held the top spot in benchmarks, though the AMD Radeon R9 295X2 isn’t far behind performance-wise. Looking to the near future, the most important video cards will likely be powered by NVIDIA’s 16 nm Pascal or AMD’s new 14 nm Greenland architectures.