The history of video processors, part 2: 3Dfx Voodoo

Part 1: 1976 - 1995



3Dfx Voodoo: changing the rules of the game



Released in November 1996, the 3Dfx Voodoo Graphics card was a 3D-only board that required a pass-through VGA cable from a separate 2D card in order to drive the display.



Cards were sold by many companies. Orchid Technologies was the first to market with the Orchid Righteous 3D at $299. The board was notable for its mechanical relays, which "clicked" when the chipset was in use; later revisions, like boards from other vendors, switched to solid-state relays. It was followed by the Diamond Multimedia Monster 3D, Colormaster Voodoo Mania, Canopus Pure3D, Quantum3D, Miro Hiscore, Skywell (Magic3D) and 2theMAX Fantasy FX Power 3D.



Almost overnight, Voodoo Graphics revolutionized PC graphics, rendering many other designs obsolete, including a wide range of 2D-only cards. The graphics market of 1996 still favored S3, which held nearly a 50 percent share, but that was about to change: at the peak of Voodoo's popularity, 3Dfx is estimated to have captured 80-85% of the 3D accelerator market.









Diamond Multimedia Monster 3D (3dfx Voodoo1 4MB PCI)



Around the same time, VideoLogic developed tile-based deferred rendering (TBDR), which eliminated the need for large-scale Z-buffering (the removal of overlapped or hidden pixels from the final render). The frame was divided into rectangular tiles, and for each tile the hardware first determined which geometry was actually visible, discarding everything else; texturing, shading and lighting were then applied only to the surviving pixels, and the finished tile was written to the output. Because polygon rendering began only after the per-tile visibility calculation and polygon clipping were complete (Z-buffering happened only at tile level), only the bare minimum of work was required.
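As a rough illustration of that flow, here is a minimal sketch of the bin-then-resolve idea. The 32-pixel tile size, the data structures and the per-pixel depth test are simplifying assumptions chosen for clarity and do not reflect VideoLogic's actual hardware.

```python
# Minimal sketch of tile-based deferred rendering (TBDR). Heavily simplified:
# triangles are assumed to come with precomputed per-pixel "fragments", and
# shading is a callable. None of this mirrors real PowerVR hardware.

TILE = 32  # assumed tile size in pixels

def bin_triangles(triangles, width, height):
    """Assign each triangle to every tile its bounding box touches."""
    tiles = {}
    for tri in triangles:
        xs = [v[0] for v in tri["verts"]]
        ys = [v[1] for v in tri["verts"]]
        for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
            for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                tiles.setdefault((tx, ty), []).append(tri)
    return tiles

def render_tile(tile_tris, shade):
    """Resolve visibility for one tile first, then shade only surviving pixels."""
    depth, visible = {}, {}
    for tri in tile_tris:
        for (x, y, z) in tri["fragments"]:           # per-pixel depth samples
            if z < depth.get((x, y), float("inf")):  # tile-sized Z test, on chip
                depth[(x, y)] = z
                visible[(x, y)] = tri
    # Texturing/shading runs once per pixel, only for the visible surface.
    return {pix: shade(tri, pix) for pix, tri in visible.items()}
```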



The first two series of chips and cards were manufactured by NEC, while the Series 3 (Kyro) chips were made by ST Micro. The first card was used exclusively in Compaq Presario PCs and was known as Midas 3 (Midas 1 and 2 were prototypes for an arcade-machine-based system). It was followed by the PCX1 and PCX2, sold as OEM products.



The Series 2 chip was originally built for the Sega Dreamcast console, and by the time the Neon 250 desktop card reached retail in November 1999 at $169, its competitors were already far more powerful, notably offering higher resolutions with 32-bit color.



Well before the Neon 250 launched, the Rendition Vérité V1000 had become the first card with a programmable core to render 2D and 3D graphics, using a MIPS-based RISC processor alongside its pixel pipelines. The processor handled triangle setup and managed the load on the pipelines.



The Vérité 1000, developed toward the end of 1995, was one of the boards Microsoft used while developing Direct3D. Unfortunately, the card required a motherboard chipset with direct memory access (DMA) support, because Rendition used DMA to move data across the PCI interface. The V1000 held up well against nearly all other consumer graphics cards until Voodoo Graphics arrived and roughly doubled its 3D performance. The board was relatively inexpensive and offered a good feature set, including anti-aliasing for budget gamers and hardware acceleration for Quake. However, the DMA-based transfer model scared off game developers.



Like 1996, 1997 was another busy year for the consumer graphics industry.



ATI went from strength to strength: the company released the Rage II, followed in March by the 3D Rage Pro. It was the first AGP 2x card and the first product from ATI's 3D engineering group, formed in 1995.









ATI 3D Rage Pro



The 4 MB version of the Pro was nearly equal in performance to Voodoo Graphics, and the 8 MB AGP version overtook the 3Dfx card. With an enlarged 4 KB cache and added edge anti-aliasing, the Rage Pro improved on the Rage II's perspective correction, texturing features and trilinear filtering performance. It also gained a floating-point coprocessor to reduce CPU dependence, as well as hardware DVD acceleration and display support.



Overall, the Rage Pro was an excellent addition to the ATI product line and helped the company earn $47.7 million in profit on sales in excess of $600 million. That success was driven mainly by OEM contracts, integration into consumer and server motherboards, and mobile variants. Card prices (usually sold as Xpert@Work and Xpert@Play) ranged from $170 for the 2 MB version to $200-230 for the 4 MB model and $270-300 for 8 MB, while the 16 MB version cost over $400.



ATI expanded its portfolio by acquiring Tseng Labs' intellectual property for $3 million in December 1997 and hiring 40 of the company's engineers. It was a cheap purchase, because Tseng's failure to integrate a RAMDAC into its cards had caused a sharp decline in sales, from 12.4 million in 1996 to 1.9 million in 1997.



In March 1997, 3DLabs announced a new version of its Permedia ("Pervasive 3D") line of boards, built on Texas Instruments' 350nm process (the earlier Permedia and the workstation-oriented Permedia NT had used a different process). The first board's performance fell short of expectations, and the NT model improved matters somewhat thanks to an additional Delta chip that handled full triangle setup and anti-aliasing, but it cost $300. Permedia 2-based cards began shipping toward the end of the year, but rather than chasing the gaming heavyweights, they were positioned as semi-professional 2D cards with modest 3D capabilities.



A month after ATI and 3DLabs refreshed their product lines, Nvidia responded with the RIVA 128 (Real-time Interactive Video and Animation accelerator), which added Direct3D support via rendering of triangular polygons.



The company continued its relationship with ST Micro, which manufactured the chip on its new 350-nanometer process and also handled the RAMDAC and video conversion. Although early drivers caused problems (particularly serious ones in Unreal), the card performed well in games like Quake 2 and Quake 3 and sat near the top of most benchmarks.









Diamond Viper V330 PCI (Nvidia RIVA 128)



It was the breakthrough product Nvidia had been looking for since 1993. The card was so successful that Nvidia had to find additional suppliers to keep up with demand, striking a deal with TSMC to manufacture the RIVA 128ZX. By the end of 1997, Nvidia's 3D cards held about 24% of the market, second only to 3Dfx Interactive, largely thanks to the RIVA 128/128ZX.



Nvidia's coffers were also replenished by Sega's funding of the NV2 chip as a potential graphics part for the Dreamcast console, even though the contract was ultimately awarded to NEC/VideoLogic.



The competing 3Dfx had also worked with Sega on the project and was all but certain it would supply the console's hardware, but the contract was later cancelled. 3Dfx filed a $155 million lawsuit, claiming Sega had misled it into believing its hardware would be used while gaining access to confidential material covering its graphics know-how. A year later, the companies settled out of court for $10.5 million.









3Dfx-based Sega BlackBelt prototype



The Dreamcast prototype project, codenamed "Black Belt", was just one facet of a busy year for 3Dfx Interactive.



On March 31, 1997, 3Dfx spun off Quantum3D. SGI and Gemini Technology entered into partnership agreements with the company to build high-end graphics systems for enthusiasts and professionals based on the new 3dfx SLI (Scan Line Interleave) technology. SLI used either a daughter card carrying a second chipset, or two or more cards connected by a cable, much as Nvidia's SLI and AMD's CrossFire do today. Once connected, each card (or logical unit, in the case of single-board SLI cards) delivered half of the frame's scan lines to the display, as the sketch below illustrates.
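As a rough sketch of the interleaving idea, the snippet below splits a frame's scan lines between two rendering units. The even/odd assignment and the function names are illustrative assumptions, not 3dfx's actual scheduling logic.

```python
# Sketch of Scan Line Interleave: two render units each produce half of the
# frame's scan lines, which are merged for output. Illustrative only.

def render_frame_sli(height, render_line_a, render_line_b):
    """render_line_a / render_line_b stand in for the two Voodoo chipsets or cards."""
    frame = [None] * height
    for y in range(height):
        # Unit A takes the even lines, unit B the odd lines (an assumed split).
        frame[y] = render_line_a(y) if y % 2 == 0 else render_line_b(y)
    return frame

# Toy usage: a 768-line frame split across two stand-in "units".
frame = render_frame_sli(768, lambda y: f"A:{y}", lambda y: f"B:{y}")
```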



SLI also raised the maximum screen resolution from 800x600 to 1024x768 pixels. The Obsidian Pro 100DB-4440 (two separate cards, each with an Amethyst daughter card) retailed for $2,500, while single-card SLI systems such as the 100SB-4440 and 4440V cost $1,895.



In the summer of 1997, 3Dfx announced its initial public offering (IPO) and released the Voodoo Rush, an attempt to offer a single card with both 2D and 3D capabilities. However, the final product, unable to use the Rampage chip originally intended for it, ended up as a cut-down Voodoo. The card's SST-1 chip handled Glide API games, while a mediocre Alliance chip (or an even worse Macronix chip) handled other 3D games and 2D applications. This produced on-screen artifacts, since the 3dfx chip and memory ran at 50 MHz while the Alliance AT25 ran at 72 MHz.



The situation was made worse by the fact that the Voodoo Rush frame buffer was effectively halved, being split between the 3D and 2D chips, which limited the resolution to 512x384. Refresh rates also left much to be desired, since the Alliance and Macronix chips were limited to RAMDAC frequencies of 175 and 160 MHz respectively.



Shortly after the introduction of the Voodoo Rush, Rendition launched the Vérité V2100 and V2200. The cards still could not match the performance of the original Voodoo and barely competed with the budget Rush. With the company's research and development lagging far behind its competitors and game developers showing little interest, these turned out to be Rendition's last commercial graphics products.



Rendition had other projects in the works, including adding the Fujitsu FXG-1 geometry processor, functionality other vendors were trying to integrate into a single chip, to a two-chip version of the V2100/V2200. The FXG-1 featured on a card with the grand name Hercules Thriller Conspiracy, which, along with the V3300 and 4400E, remained unfinished. In September 1998, Micron acquired the company for $93 million, hoping to combine its embedded DRAM LSI technology with Rendition's graphics expertise.









Rendition Vérité V2200 reference board



As capabilities and performance grew, so did graphics card prices, and many vendors unable to take on the leaders, ATI, Nvidia and 3Dfx, rushed to fill the sub-$200 market.



Matrox released the Mystique (hampered by its lack of OpenGL support) for $120-150, while S3 introduced the ViRGE line, with the base model at $120 and the DX and GX at $150 and $200 respectively. S3 diversified the line to keep sales stable, adding a mobile part with dynamic power management (ViRGE/MX) and the desktop ViRGE/GX2 with TV-out, S-Video and DVD playback support.



Occupying an even lower niche were Cirrus Logic's Laguna 3D series, Trident's 9750/9850 and the SiS 6326, all fighting for gamers' attention. In the Laguna3D's case, a bargain $99 price could not outweigh its reduced performance, poor 3D image quality and stability problems that similarly priced cards like the S3 ViRGE VX did not suffer from.



Cirrus Logic left the graphics industry soon after the Laguna3D's release, but not before shipping a set of budget 16-bit color graphics chips in the $50 category, the most notable of which powered the Diamond SpeedStar series and the Orchid Kelvin 64.



Trident also set its sights on the entry-level segment, releasing the 3DImage 9750 in May, followed shortly by the 9850 with AGP 2x support. The 9750, a PCI or AGP 1x card, had numerous image quality and rendering problems; the 9850 fixed some of those quirks, but texture filtering remained poor.



SiS also built a product for the budget 3D market, releasing the 6326 in June at a typical price of $40-50. The card delivered good image quality and outperformed many other budget cards. Although it never threatened the high-end, the 6326 sold some seven million units in 1998.



A long saga that over time took on elements of myth and urban legend began at the Assembly gaming event in June 1997, when BitBoys told the world about their Pyramid3D graphics card. The much-hyped project was a collaboration between Silicon VLSI Solutions Oy, TriTech and BitBoys.



But Pyramid3D was never released: lengthy debugging and constant revisions delayed the project, and TriTech lost a patent lawsuit over a sound chip, which drove the company into bankruptcy.









A screenshot from a demo showing the realism the Glaze3D card was supposed to achieve.



On May 15, 1998, BitBoys announced their second project, the Glaze3D chip, promising class-leading performance and targeting a release at the end of 1999. As the grand launch approached, BitBoys announced a revised design at SIGGRAPH 99 in October, dropping the RAMBUS memory and memory controller in favor of 9 MB of Infineon embedded DRAM.



But debugging and production problems once again killed the project.



The company built a reputation for missing deadlines and delivering nothing but high-profile promises. Glaze3D was later redesigned under the codename Ax, aiming to catch up with competitors through DirectX 8.1 support. The new chip was supposed to debut in an Avalanche3D card at the end of 2001, while a third iteration of Glaze3D, codenamed Hammer, was already in development and promised DirectX 9 support.



Prototype Glaze3D boards were built with the first manufactured chips, but all work stopped when Infineon halted embedded DRAM production in 2001 due to mounting financial losses. Left without a manufacturing partner for its cards, BitBoys finally abandoned desktop graphics and focused on the mobile graphics market.



BitBoys' exit and AMD's miscalculation: in May 2006, ATI acquired BitBoys for $44 million and announced the opening of a European design center. Soon after, ATI and Nokia entered a long-term strategic partnership. Just a couple of months later, a then-thriving AMD announced the acquisition of ATI at the very steep price of $5.4 billion. The mobile graphics division, which included the BitBoys staff, was renamed Imageon and, in a serious management oversight, was sold to Qualcomm for $65 million in January 2009. Qualcomm went on to produce the graphics chips as Adreno (an anagram of Radeon), an integral component of its hugely popular Snapdragon SoCs.


Intel released its first (and so far last) commercial discrete 3D chip for desktop gaming in February 1998. The i740 traced its origins to a flight simulation project for NASA's Apollo program run by General Electric, a business later sold to Martin Marietta, which merged with Lockheed three years later. Lockheed Martin reorganized this expertise into Real3D, producing professional graphics products, the most notable being the Real3D/100 and Real3D/Pro-1000; the Sega Model 3 arcade board used two Pro-1000 graphics systems.



Lockheed Martin then formed a joint project with Intel and Chips and Technologies, called Project Aurora. In January, a month before the i740 launched, Intel acquired a 20 percent stake in Real3D; it had already bought Chips and Technologies outright in July 1997.



The i740 combined into one chip the functions of the R3D/100's two separate graphics and texture chips, but Intel's implementation of AGP texturing was rather odd: textures were stored in system memory (the rendering buffer could also reside in RAM). Other products typically kept textures in the card's frame buffer, falling back to system RAM only when the frame buffer was full or a texture was too large for local graphics memory.



To minimize latency, Intel's design used an AGP feature called Direct Memory Execute (DiME), which fetched only the textures required for rasterization and left the rest in system RAM. Performance and image quality were acceptable, roughly matching the previous year's high-end cards, and pricing, $119 for the 4 MB model and $149 for 8 MB, reflected Intel's aggressive marketing. The i740 was sold under the Intel brand as well as in the form of the Real3D StarFighter and Diamond Stealth II G460.
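The two texture-placement strategies described above can be contrasted in a small sketch. The names, the 8 MB figure and the policy functions below are illustrative assumptions, not Intel's driver logic.

```python
# Sketch contrasting the usual texture placement of the era with i740-style
# AGP texturing. Sizes, names and policies are illustrative assumptions.

LOCAL_VRAM_BYTES = 8 * 1024 * 1024  # e.g. an 8 MB card

def place_texture_typical(size_bytes, vram_used):
    """Most cards: prefer the local frame buffer, spill to system RAM if needed."""
    if vram_used + size_bytes <= LOCAL_VRAM_BYTES:
        return "local frame buffer"
    return "system RAM over AGP"

def place_texture_i740(size_bytes, vram_used):
    """i740-style: textures always live in system RAM; DiME lets the chip fetch
    only the texels rasterization actually needs, directly over AGP."""
    return "system RAM (AGP DiME)"
```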









Intel740 / i740 AGP Graphics Card



Intel designed an upgraded i752 chip, but a lack of interest from OEMs and the gaming community at large led the company to cancel commercial production. A handful of boards did reach the market, but, like the i740, the design ultimately found its use in integrated graphics chipsets.



Lockheed Martin closed its Real3D division in October 1999 and sold the related intellectual property to Intel; many of the division's employees moved to Intel or ATI.



In February 1998, ATI revised the Rage Pro, which essentially amounted to renaming the card Rage Pro Turbo and optimizing drivers for synthetic benchmarks. There were practically no other changes apart from the price, which climbed to $449, although from the beta2 drivers onward gaming performance gradually improved.



Following this, in August, ATI launched the Rage 128 GL and VR, the company's first products that the former Tseng Labs engineers worked on. However, retail supply remained poor until the new year, which essentially killed ATI's chances of capturing a slice of the gaming market the way it had the OEM market. The cards carried 32 MB of onboard RAM (16 MB or 32 MB in the All-In-Wonder 128 version) and an efficient memory architecture that let them overtake the Nvidia TNT as resolution increased and in 32-bit color. Unfortunately for ATI, most games and most players' hardware were still geared toward 16-bit color. Image quality was roughly on a par with mainstream rivals from S3 and Nvidia, but lagged far behind that of Matrox.



Even so, this was enough for ATI to lead the graphics card vendors of 1998 with 27% of the market, earning net income of CAD 168.4 million on sales of CAD 1.15 billion.



In October of that year, ATI announced the $67 million acquisition of Chromatic Research, whose MPACT media processors had found a home in many PC-TV systems, notably from Compaq and Gateway. The chips offered very good 2D performance, excellent audio and MPEG2 playback, but limited 3D performance and a high $200 price. Ultimately, insurmountable software problems doomed the company to closure after four years of existence.



Two months after the i740 made its modest impact on the graphics market, 3Dfx released the Voodoo 2. Like its predecessor, the card was 3D-only, and despite its impressive performance it was a complex design: the boards carried two texturing units, making them the first graphics cards with multitexturing, and used three chips in total, unlike the single combined 2D/3D chip of competing cards.
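In practice, two texture units meant a polygon could sample two textures in a single pass, for example a base texture modulated by a lightmap. The sketch below illustrates the idea; the modulate combine mode and the helper functions are assumptions chosen for clarity, not a description of Voodoo 2 internals.

```python
# Sketch of single-pass multitexturing: two texture units each sample a texture
# for the same pixel and the results are combined -- here by modulation
# (e.g. diffuse * lightmap), a common mode of the era. Illustrative only.

def sample(texture, u, v):
    """Nearest-neighbour sample from a 2D list of (r, g, b) texels, u/v in [0, 1]."""
    h, w = len(texture), len(texture[0])
    return texture[int(v * (h - 1))][int(u * (w - 1))]

def shade_pixel(diffuse_tex, lightmap_tex, u, v):
    d = sample(diffuse_tex, u, v)   # texture unit 0
    l = sample(lightmap_tex, u, v)  # texture unit 1
    # Combine both samples in one pass instead of two rendering passes.
    return tuple(min(255, (dc * lc) // 255) for dc, lc in zip(d, l))
```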





Demonstration of GLQuake running on a Pentium MMX 225 MHz with a 3Dfx Voodoo 2 card



Quantum3D's Voodoo 2-based products included the Obsidian2 X-24, a single SLI card with the option of attaching a 2D daughter card; the single-slot SLI SB200/200SBi with 24 MB of EDO RAM; and the Mercury Heavy Metal, four 200SBi SLI cards linked via a controller board (AAlchemy) that functioned much like the SLI bridges used in modern multi-GPU systems.



The latter was a professional system aimed at visual simulation, priced at $9,999 and requiring an Intel BX or GX server board with four adjacent PCI slots.



The Voodoo Banshee was announced in June 1998 but only reached retail three months later. The card paired the 2D functionality intended for the still-missing Rampage chipset with a single texture mapping unit (TMU), so although 3dfx could now sell a single 2D/3D chip at significantly lower production cost, the Banshee fell well behind the Voodoo 2 when rendering multitextured polygons.



The revolution 3dfx had sparked with the original Voodoo did not repeat itself this time.



In raw 3D performance the Voodoo 2 had no equal, but competitors were quickly closing the gap. Facing growing competition from ATI and Nvidia, 3dfx set out to boost revenue by marketing and selling boards itself, a job previously handled by a long list of partners. To that end it acquired STB Systems for $141 million on December 15, but the venture proved a huge mistake: the quality and production costs of the Juarez plant STB used could not compare with the Taiwanese foundries used by Nvidia (TSMC) or with ATI's manufacturing partner, UMC.



Many former 3dfx partners promptly struck deals with Nvidia instead.



While 3dfx struggled in the market, Nvidia released the Riva TNT on March 23 (the name stood for TwiN Texel, not the explosive). A second parallel pixel pipeline doubled pixel fill rate and rendering speed, and the card carried a promising (for 1998) 16 MB of SDR memory, whereas the 8-16 MB on Voodoo 2 boards was slower EDO. Although the card was a strong competitor, its own complexity worked against it: the eight-million-transistor chip, built on TSMC's 350-nanometer process, could not sustain Nvidia's intended 125 MHz core/memory clock, so the shipping product was cut back to 90 MHz. That cost a noticeable 28% of the speed, just enough for the Voodoo 2 to barely hold on to the performance crown, largely thanks to Glide.



Even after that clock reduction, the TNT remained an impressive card. Its AGP 2x interface allowed gaming at 1600x1200 with 32-bit color and a 24-bit Z-buffer (depth precision), a big step up from the Voodoo 2's 16-bit color and 16-bit Z-buffer. The TNT took on the Voodoo 2 and Banshee with a broader feature set, better scaling with faster CPUs, excellent AGP texturing and stronger 2D performance. Volume shipments of the card began only in September, however.



But not everything went according to Nvidia's plan, at least not immediately.



On April 9, SGI filed a lawsuit against the company, claiming infringement of its texture mapping patent. The settlement reached in July 1999 gave Nvidia access to SGI's professional graphics portfolio, while SGI shut down its own graphics hardware division and handed its low-level graphics team over to Nvidia. This nearly free transfer of intellectual property is considered one of the main reasons for SGI's rapid slide toward bankruptcy.



The main market players dominated the headlines in the first months of the year, but in June and July attention turned to two of the industry's fading stars.



On June 16, Number Nine released its Revolution IV card.



It could not compete with Nvidia and ATI products in 3D, so the company chose to shore up its position in the 2D performance market.









SGI flat panel complete with Revolution IV-FP



Number Nine had always favored 2D performance over investment in 3D technology, and as a result gaming cards like the Nvidia TNT began to squeeze it in both markets. The company therefore decided to bet on the one weak spot of most gaming cards: high resolutions with 32-bit color.



To do this, Number Nine added a 36-pin OpenLDI connector to the Revolution IV-FP, which connected to the SGI flat panel bundled with the card. The package of a 17.3-inch SGI 1600SW (1600x1024) and the Revolution IV-FP initially retailed for $2,795.



It was the last card of Number Nine's own design; afterwards the company went back to selling S3- and Nvidia-based products. In December 1999 its assets were acquired by S3 and later sold on to the original Number Nine design team, which formed Silicon Spectrum in 2002.



At the 1998 E3 Expo, S3 announced the Savage3D, and unlike the TNT and Voodoo Banshee, the card reached retail soon afterward. The price of the quick launch, however, was immature drivers. OpenGL games suffered most; the situation was serious enough that S3 shipped a mini OpenGL driver specifically for the Quake games.



S3's initial specification claimed a 125 MHz clock, but heat dissipation problems meant the finished product shipped at 90-110 MHz, even though many magazines and review sites had received 125 MHz pre-production samples. A Savage3D Supercharged later arrived at 120 MHz, and Hercules and STB began selling the Terminator BEAST and Nitro 3200 at 120/125 MHz. Even though OpenGL emulation and DirectX performance were held back by driver support, sub-$100 pricing and acceptable gaming and video performance kept the card selling.



Between 1997 and 1998, a growing number of graphics card vendors left the market. Among them were Cirrus Logic, Macronix, Alliance Semiconductor, Dynamic Pictures (sold to 3DLabs), Tseng Labs and Chromatic Research (both bought by ATI), Rendition (sold to Micron), AccelGraphics (bought by Evans & Sutherland) and Chips and Technologies (acquired by Intel).



In 1999, the situation became even more acute.



January saw the release of the SiS 300, a budget graphics card for business machines. By 1999 standards its 3D performance was minimal, and its 2D capabilities fell short of most of SiS's retail competitors, a situation not helped by its single pixel pipeline. Fortunately for SiS, OEMs had nothing to complain about, since the card offered a sufficient feature set for them: a 128-bit memory bus (64-bit in the SiS 305 variant), 32-bit color, DirectX 6.0 support (DX7 for the 305), multitexturing, TV-out and hardware MPEG2 decoding.



In December 2000, the SiS 315 appeared with a 256-bit data bus, DirectX 8 support, full-screen anti-aliasing, a second pixel pipeline, a transform and lighting engine, DVD motion compensation and DVI support. Performance was on a par with the GeForce 2 MX200. The same 315 chip formed the basis of the SiS 650 chipset for Socket 478 (Pentium 4) boards released in September 2001, and of the SiS552 system-on-chip that appeared in 2003.





Budget gaming: Unreal Tournament 2003 running on a SiS 315 card.



Beyond SiS's products, budget-conscious users had a considerable choice of cards. Among them was the Trident Blade 3D (about $65), whose tolerable 3D performance (and mediocre driver support) put it roughly on a par with the Intel i740.



Encouraged, Trident followed up with the Blade 3D Turbo, its clock raised from 110 to 135 MHz, which kept it level with the upgraded Intel i752. Unfortunately for Trident, its integrated graphics development arrangement with VIA ended abruptly after VIA acquired S3 Graphics in April 2000.



Trident's core business relied heavily on high-volume, low-cost chips, primarily in the mobile sector. The Blade 3D Turbo was upgraded into the Blade T16, T64 (143 MHz) and XP (166 MHz), but Trident's 3D technology developed far more slowly than the market as a whole, to the point where even long-delayed budget offerings like the SiS 315 easily outperformed the company's new cards. In June 2003, Trident's graphics division was sold to the SiS subsidiary XGI.



The S3 Savage4 was a step up in performance from the SiS and Trident parts. Announced in February, the card arrived in May at $100-130, depending on the amount of onboard memory (16 or 32 MB). S3's texture compression technology, introduced with the Savage3D, made textures up to 2048x2048 practical even over the card's limited 64-bit memory bus; the quick calculation below shows why.
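A back-of-the-envelope sketch of the storage savings, assuming an S3TC/DXT1-style scheme that stores each 4x4 texel block in 64 bits (4 bits per texel); exact ratios depend on the format used.

```python
# Rough sizes showing why fixed-rate texture compression mattered on a narrow bus.
# Assumes a DXT1-style 4 bits-per-texel scheme; actual ratios vary by format.

def texture_bytes(width, height, bits_per_texel):
    return width * height * bits_per_texel // 8

w = h = 2048
print("uncompressed 16-bit:", texture_bytes(w, h, 16) // 2**20, "MB")  # 8 MB
print("uncompressed 32-bit:", texture_bytes(w, h, 32) // 2**20, "MB")  # 16 MB
print("compressed, 4 bpp:  ", texture_bytes(w, h, 4) // 2**20, "MB")   # 2 MB
```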









Diamond Viper II Z200 (S3 Savage4)



The Savage4 was S3's first card with multitexturing and the first card to support the AGP 4x interface. But even improved drivers and a broad feature set could not hide the fact that it barely reached the performance of the previous generation of 3dfx, Nvidia and ATI products. The cycle repeated at the end of the year with the Savage 2000, which matched the TNT2 and Matrox G400 at resolutions of 1024x768 and below, but told a sadder story at 1280x1024 and 1600x1200.



The first card in the 3dfx Voodoo3 series arrived in March, backed by extensive TV and print advertising, an updated logo (now with a lowercase "d") and bright box art. The long-awaited Rampage chipset had still not materialized, so the boards used a largely unchanged architecture, only slightly revised as the Avenger chipset. The cards were still limited to 16-bit color and 256x256 textures, and lacked hardware transform and lighting (T&L). These shortcomings were becoming critical for game developers, and 3dfx kept disappointing them by breaking its promises about architecture and features.



3dfx blamed an earthquake for its declining revenue, although ATI and Nvidia were not nearly as affected. In another sign of 3dfx's growing problems, the company announced in December that it would finally open the source code of its proprietary Glide API, at a time when DirectX and OpenGL were steadily winning over game developers.



March also brought the Nvidia Riva TNT2, including the first Ultra-branded board with faster core and memory clocks, while Matrox announced the G400 series.



The TNT2 used TSMC's 250nm process and achieved the performance Nvidia had hoped for with the original TNT. The card comprehensively overtook the Voodoo 3, the only exceptions being applications that paired AMD's 3DNow! instruction set with OpenGL. To keep pace with 3dfx and Matrox, the TNT2 also added a DVI output for flat-panel displays.



Meanwhile, the Matrox G400 managed to overtake both the Voodoo 3 and the TNT2 in most respects, though its OpenGL support was still weak. Priced at $199-229, the card offered an excellent price-performance ratio along with strong image quality and features. Its ability to drive two monitors through dual display controllers (Matrox DualHead) began the company's long-running support for multi-monitor configurations, although the secondary display was limited to 1280x1024.



The G400 also introduced Environment Mapped Bump Mapping (EMBM), which improved the realism of texture mapping. The higher-clocked $250 G400 MAX remained the fastest consumer card on the market until early 2000, when GeForce 256 DDR boards such as the Creative Labs 3D Blaster Annihilator Pro reached store shelves.



From that point, Matrox focused on the professional market, returning briefly to gaming in 2002 with the Parhelia. Its triple-monitor support could not outweigh mediocre gaming performance against the new wave of DirectX 9.0-class hardware.





Demo of the Matrox G400's EMBM technology



By the time the smoke from the 3dfx, Nvidia and Matrox launches had cleared, 3DLabs arrived on the scene with the long-awaited Permedia 3 Create!. Announced a few months earlier, the card was aimed at professional users with an interest in gaming, so 3DLabs prioritized 2D, drawing on the professional graphics expertise of Dynamic Pictures, which had designed an excellent line of Oxygen workstation cards.



Unfortunately for 3DLabs, workstation graphics at the time demanded sophisticated polygon handling, usually paid for with slower texture mapping, almost the exact opposite of the requirements for gaming cards, where texturing and visual quality mattered more than complex wireframe geometry.



Too expensive and too slow against the TNT2 and Voodoo 3 in games, and not powerful enough against workstation competitors, the Permedia 3 was 3DLabs' last attempt at a gaming card. After it, 3DLabs focused on the GLINT R3- and R4-based Oxygen cards, priced from $299 (VX-1) to $1,499 (GVX 420), while the Wildcat series (such as the $2,499 Wildcat II-5110) was still based on the Intense3D ParaScale GPUs obtained through the purchase of Intense3D from Intergraph in July 2000. From 2002, after Creative Technology bought the company, 3DLabs began integrating its own P9 and P10 processors into the Wildcat series.



In 2006, the company left the desktop market to concentrate on media-oriented graphics; the division was merged with Creative's SoC operation, later renamed ZiiLabs and sold to Intel in November 2012.



ATI's moves after the Rage 128's debut were incremental. At the end of 1998, the company added AGP 4x support and raised the Rage 128's clocks in a version called the Pro, which also introduced video capture and TV-out options. The Rage 128 Pro's gaming performance roughly matched the Nvidia TNT2 but fell short of the TNT2 Ultra, a gap ATI intended to close with its Project Aurora.









ATI Rage Fury MAXX, combining two Rage 128 Pro chips on one board



When it became clear that ATI would not have a single chip capable of winning the performance race, the plan changed and the project emerged as the Rage Fury MAXX, which carried two Rage 128 Pros on one circuit board. On paper it was impressive: the two chips rendered alternating frames, splitting the game load equally. In practice, however, while the card outperformed the previous generation and the S3 Savage 2000, it could not match the upcoming GeForce 256 DDR, which was only slightly more expensive at $279 versus $249 for the ATI card.
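The approach, now usually called alternate frame rendering, simply hands whole frames to each chip in turn. A minimal sketch of the idea follows; the round-robin dispatcher and function names are illustrative assumptions, not ATI's driver code.

```python
# Sketch of alternate frame rendering: whole frames are dispatched to the two
# chips in turn. The round-robin dispatcher here is purely illustrative.

from itertools import cycle

def run_afr(frames, chips):
    """frames: iterable of scene descriptions; chips: list of render callables."""
    schedule = cycle(range(len(chips)))
    return [chips[next(schedule)](scene) for scene in frames]

# Toy usage with two stand-in "chips":
rendered = run_afr(range(6), [lambda s: f"chip 0 renders frame {s}",
                              lambda s: f"chip 1 renders frame {s}"])
```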



Less than two months after ATI announced the Rage Fury MAXX, Nvidia announced that the GeForce 256 SDR would arrive on October 1, followed in February 2000 by a DDR version, the first card to use that type of RAM. The 23-million-transistor chip, built on TSMC's 220nm process, was the first graphics chip to be marketed as a GPU (Graphics Processing Unit), because it added a transform and lighting engine (TnL, or T&L).



This engine let the graphics chip take on the heavy floating-point work of transforming 3D objects and scenes, and lighting them, on the way to a rendered 2D image. Previously these calculations fell to the CPU, which often became the bottleneck and limited the level of detail that was possible.
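A minimal sketch of the kind of per-vertex math such a stage offloads: a simple rotation and translation into view space plus Lambertian diffuse lighting. This is a textbook illustration under assumed simplifications, not the GeForce 256's fixed-function pipeline.

```python
# Per-vertex transform & lighting as a textbook illustration: rotate/translate a
# vertex into view space, then compute simple Lambertian diffuse lighting.
# Offloading exactly this kind of float-heavy math is what a T&L unit does.

import math

def transform(vertex, angle_y, translate):
    """Rotate around the Y axis and translate -- a stand-in for a 4x4 matrix transform."""
    x, y, z = vertex
    c, s = math.cos(angle_y), math.sin(angle_y)
    xr, zr = c * x + s * z, -s * x + c * z
    tx, ty, tz = translate
    return (xr + tx, y + ty, zr + tz)

def light(normal, light_dir, base_color):
    """Lambertian diffuse: intensity proportional to max(0, N . L)."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * n_dot_l for c in base_color)

v_view = transform((1.0, 0.0, 0.0), math.radians(30), (0.0, 0.0, -5.0))
color = light((0.0, 1.0, 0.0), (0.0, 1.0, 0.0), (1.0, 0.2, 0.2))
```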





Nvidia Grass Demo (GeForce 256)



The GeForce 256's status as the first card with hardware T&L has long been a subject of debate, because T&L also featured in many other architectures, whether at the prototype stage (Rendition Vérité V4400, BitBoys Pyramid3D, 3dfx Rampage), in an intermediate form (3DLabs GLINT, Matrox G400 WARP), or as a separate chip on the board (Hercules Thriller Conspiracy).



None of those designs made it to retail, however. Moreover, because Nvidia was the first to use a four-pipeline architecture, the chip's raw performance gave it a lead over its competitors, and together with the T&L engine this allowed the company to promote the GeForce 256 as a professional workstation card as well.



A month after the desktop card's release, Nvidia announced its first line of professional Quadro workstation cards (the SGI VPro V3 and VR3 models), based on the GeForce 256. The cards used SGI graphics technology that Nvidia had obtained through the cross-licensing agreement signed in July 1999.



Nvidia's annual profit of $41 million on revenue of $374.5 million exceeded its 1998 figures (a $4.1 million profit on $158.2 million in revenue) and was a huge step up from 1997's revenue of $13.3 million. Nvidia's finances were further boosted by Microsoft's $200 million advance payment for the NV2A (the Xbox graphics core) and by $400 million raised through a bond placement on the secondary market.



Even so, those numbers paled beside ATI's $1.2 billion in revenue and $160 million in earnings, built on a 32 percent share of the graphics market. But the company was on the verge of losing a significant portion of its OEM business to Intel's 815 series of integrated graphics chipsets.



This article is the second in a four-part series. In the third part we approach the modern era, when the industry turned toward major consolidation that left room for only two players, and the era of GeForce versus Radeon began.


