
Past generations of good NVIDIA graphics cards - by Wester547


Wester547
07-06-03, 09:45 PM
You've owned them all: the NVIDIA RIVA TNT, the RIVA TNT2, the GeForce 256, the GeForce 2, the GeForce 3, the GeForce 4 MX and Ti, and the GeForce FX, and they have all proven to be amazing 3D graphics accelerators. Since no one else really wants to bother, I'm going to create a timeline of the released graphics cards, why they were awesome, and their special features, from the TNT to the GeForce FX, and every version, too. Here it is:

1998 - The NVIDIA RIVA TNT:

The TNT was just plain damned too good to be true. It featured 32-bit rendering, 16MB of video memory, and 2D and 3D on the same chip, and it was as fast as 3dfx's dominating Voodoo 2! The TNT also featured full DirectX 6 compatibility, and since 3dfx was out of business, the NVIDIA RIVA TNT truly was the king of the 3D graphics accelerators, and was the first card to fire a true shot in that long, ugly 3D graphics accelerator battle.

1998-1999 - NVIDIA RIVA TNT2:

The NVIDIA TNT2 was a much-improved version of the TNT, featuring much faster performance, 32MB of video memory, 2D and 3D on the same chip, speed on par with the 3dfx Voodoo 3, and true 32-bit rendering much improved over the TNT. It featured full DirectX 6 compatibility, as the RIVA TNT did, and proved to be the next new king of 3D accelerators, but not for too long.

1999 - NVIDIA GeForce 256:

The NVIDIA GeForce 256 was a revolutionary chip indeed, presenting the arrival of amazing hardware T&L, which unfortunately wasn't taken advantage of by games, which continued to be written for three-year-old computer hardware. But developer.nvidia.com features some amazing demos specifically designed for the TNT, GeForce 256, and above, so you can download some developer demos there that truly show why the GeForce 256 was said to be incredible; it just was never proven. It also featured DDR-SDRAM, up to 32MB of it, with 22 million transistors and 200MHz DDR-SDRAM, plus a fast core speed. And it also featured almost full DirectX 7 compatibility. Whoa.

2000 - NVIDIA's GeForce 2:

3dfx was trying to hold a candle to it with the Voodoo 5's uninspiring performance, but failed and was gunned down quite easily by the GeForce 2, NVIDIA's graphics chip for the year 2000. NVIDIA finally finished off 3dfx with the superb GeForce 2 Ultra: packed with 32-64MB+ of ultra-fast 230MHz DDR-SDRAM, the GeForce 2 Ultra rendered 3dfx's Voodoo5 6000 board (which should have been out some time in 2001 but was canceled) insignificant and irrelevant, and put the final nail in 3dfx's coffin. The GeForce 2 also featured almost full DirectX 7 compatibility, as the GeForce 256 did. But there was another chip that actually could hold a candle to the GeForce 2, featuring total DirectX 7 compatibility plus more, and that would later begin a new battle: the ATI RADEON graphics processor.

2001 - NVIDIA's GeForce 3:

The NVIDIA GeForce 3 was a true revolution in 3D gaming, featuring 57 million transistors, 76 gigaflops, a 200MHz core, and 64MB of 230MHz DDR-SDRAM (460MHz effective) - ultimately fast! - with up to 36 pixel shader operations per pass and 128 vertex shader instructions! The GeForce 3 introduced many things: the arrival of the gasket-blowing vertex and pixel shaders, much-improved multisample antialiasing, and new methods for 3D and 2D processing, using only 10% of the CPU's capacity (the old GeForce 2 used 60%) :eek: and delivering many times faster performance with amazing new graphics, all in real time.

The vertex shaders included an amazing physics and animation engine, with the power to create true facial animation and much more, in real time, all powered by the GeForce 3. The pixel shaders featured bump mapping technology that creates photo-realistic graphics - the kind of quality that would otherwise take a rendering farm like Pixar Animation Studios' to display - without adding a vast amount of polys, in real time at about 24 to 60 frames per second, and also included realistic real-time environment-mapped reflections and much more, with improved performance! The GeForce 3 was also capable of amazing lighting effects powered by the nfiniteFX engine and a soft real-time 3D texture-buffered shadow engine, creating specular highlights, realistic soft stencil shadows, and a true real-time dynamic lighting engine that behaves EXACTLY as expected in the real world, portraying many technologies found in id Software's currently-in-development game, DOOM 3.

The GeForce 3 first came out on the Mac in March 2001, at an astonishingly expensive price of about $600, and I think it was $400-$500 when it came out for the PC during May or June 2001. The GeForce 3 was unreal, featuring full DirectX 8 compatibility, and a version of the chip also shipped in Microsoft's Xbox gaming console. But later in 2001, another chip beat the GeForce 3, though it remained unproven: the Radeon 8500....

2002 - GeForce 4 Ti and MX:

The GeForce 4 MX was similar to the GeForce 2 MX, just with improved performance and image quality. The GeForce 4 Ti was at least twice as fast as the GeForce 3, featuring improved nfiniteFX II technology, with all the same light and shadow technology only better, and pixel shader and vertex shader instructions designed for DirectX 8.1. The GeForce 4 Ti was the king of the whole 3D graphics kingdom, and featured new, improved 4X antialiasing technology and more - until the Radeon 9700 Pro rendered it obsolete with full DirectX 9 compatibility, twice the performance, and much more. Thus, NVIDIA was down, ruined, but not beaten. No, not beaten. NVIDIA would strike back, and ATI had yet to see that.

2003 - NVIDIA GeForce FX:


The Radeon 9700 was dominating it all, and the newly released NVIDIA GeForce FX 5800 Ultra wasn't that good after all. Even though it featured a 500MHz core and 500MHz DDR2 memory, with full DirectX 9 compatibility, the Radeon 9700 and 9800 still kicked its ass, since they featured 256-bit memory. The GeForce FX also wasn't found in any stores, whoa. The GeForce FX 5800 also took up the PCI slot next to your AGP slot, and had an extremely loud fan. Man. It seemed all over for NVIDIA - until NVIDIA decided to wipe off the dust and get back into the game with the all-new NVIDIA GeForce FX 5900 Ultra, featuring 256MB of DDR-SDRAM running at 850MHz (effective) - god fast - with a 450MHz core, full DirectX 9 compatibility, a much quieter fan, 256-bit memory, a 0.13-micron process, and more. It was faster than the 9800 and 9700 Pro, but still took the PCI slot next to your AGP slot, and wasn't much faster than most DirectX 9 cards in current-generation games, although it proved superior in DOOM 3 alpha benchmarks. Heh, heh, heh. Now the GeForce FX remains the king, but ATI has yet to release its R420-core graphics chip. And by the way, if you find yourself a GeForce FX 5800, you'd do yourself well to avoid it. Even the 256MB version of the 9800 Pro couldn't outperform the FX 5900 Ultra. The bad part is that the 5900 Ultra is priced at an expensive $500, as the 256MB 9800 Pro is. Then came the FX 5950 Ultra, a higher-clocked version of the FX 5900 Ultra, and ATI's 9800 XT, a higher-clocked version of the 9800 Pro, came to compete with NVIDIA's latest GFX as well; they are both very evenly matched. Expect the NV40 vs. R420 scenario in 2004! Anyway, the war is far from over. Though NVIDIA is back on track, and in its dominating position!



Interesting, huh? The question still remains: will another company enter the arena and dominate both ATI and NVIDIA, or will it end up the other way? Or differently? Besides that, I hope you people find this useful, but you probably know all this already :)

extreme_dB
07-07-03, 01:40 AM
And that concludes today's lesson in Revisionist History 101. :)

Edit: Changed my original comment. :D

-=DVS=-
07-07-03, 03:08 AM
Your history is not very accurate. NVIDIA is not in a dominating position; tied at best with ATI :rolleyes:
The GFFX 5900 is $500, while the same-speed Radeon 9800 Pro can be bought for $350, and you don't need to get the 256MB version of it :p Hehe, isn't it a historian's pride not to be a fan and to state only facts?

And the golden age has suddenly stopped. Sad, really; we had so many competitors in the beginning :eek: But as the popular saying goes, there can be only ONE king. Sooner or later there will be one CPU maker and one GPU producer ;)

StealthHawk
07-07-03, 03:13 AM
Originally posted by extreme_dB
And that concludes today's lesson in Revisionist History 101. :)

Well, some things were certainly revisionist, some things optimistic, and some things correct ;)

The most problematic statement was this:

1998 - The NVIDIA RIVA TNT:

The TNT was just plain damned too good to be true. It featured 32-bit rendering, 16MB of video memory, and 2D and 3D on the same chip, and it was as fast as 3dfx's dominating Voodoo 2! The TNT also featured full DirectX 6 compatibility, and since 3dfx was out of business, the NVIDIA RIVA TNT truly was the king of the 3D graphics accelerators, and was the first card to fire a true shot in that long, ugly 3D graphics accelerator battle.

Let's not forget that 3dfx went out of business a few years after the release of the TNT, the Voodoo2 was out some 9 months before the TNT, the Voodoo2 also had SLI setups available, and most games were developed and optimized for Voodoo architecture and/or Glide. The TNT was also touted as a Voodoo2 killer, which by the author's own statement was not entirely true ;)

Other omissions are the Riva128, which got nvidia back into the consumer 3d market, and the horrible NV1, which failed years before.

extreme_dB
07-07-03, 04:01 AM
The Riva128 was absolutely plagued by image quality problems.

The launch of the TNT was tainted by the clockspeed fiasco and was slower than expected (previews showed 125MHz, but it was released at only 90MHz). The Voodoo2 was still the premier gaming card, and formed the ultimate 2D/3D setup when coupled with the Matrox Millennium.

The TNT2 was Nvidia's first truly great card, but it had solid competition from the G400Max (best image quality, though scarcely available/late), the Rage128 (fast 32-bit), and the Voodoo3 (best 16-bit performance/quality in current games, and the Glide advantage). My first Nvidia card was the TNT2 Ultra from Guillemot (whatever happened to them?).

Geforce 256 - The first version with SDR was a huge disappointment.

Geforce2 - amazing performance, but it sacrificed image quality, and had crappy 2D quality.

Geforce3 - fantastic new technology, but suffered severe driver problems for several months. By now Nvidia's dominance was cemented.

Geforce4 - Their best products ever in a given time period.

GeforceFX - this lineup features products ranging from slightly disappointing to the biggest disaster in their history. :)

saturnotaku
07-07-03, 07:30 AM
Originally posted by extreme_dB
My first Nvidia card was the TNT2 Ultra from Guillemot (whatever happened to them?).



Guillemot is Hercules - same folks that make those wicked R3xx vid cards. :D

Skuzzy
07-07-03, 07:38 AM
Just FYI: It is inaccurate to say any of the FX line "fully" supports DX9. They are DX9 compatible, but they do not support the full DX9 featureset.
ATI's 9700 and 9800 Pro cards support more of the DX9 features than anything NVidia has released to date.

Just a nit.
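
For the curious, "compatible" versus "compliant" is something developers check in code, not on the box. Here's a minimal sketch of the runtime check, using the standard Direct3D 9 caps query; nothing here is specific to any one card, and the SM2.0 threshold is just the usual bar people mean by "DX9-class" hardware:

#include <d3d9.h>   // link with d3d9.lib
#include <cstdio>

int main() {
    // Create the D3D9 object and query the HAL device's capability bits
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // A card can run fine under the DX9 runtime and still fail this test;
    // "DX9-class" hardware is generally expected to expose shader model 2.0
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) &&
        caps.VertexShaderVersion >= D3DVS_VERSION(2, 0))
        printf("Shader model 2.0 exposed - DX9-class hardware\n");
    else
        printf("DX9 runtime present, but not full SM2.0 hardware\n");

    d3d->Release();
    return 0;
}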

StealthHawk
07-07-03, 08:00 AM
A few comments.

Originally posted by extreme_dB
The Riva128 was absolutely plagued by image quality problems.

Yeah, I owned one of these. The IQ wasn't great and there were screen tearing problems, but it was fast and cheap :)

The launch of the TNT was tainted by the clockspeed fiasco and was slower than expected (previews showed 125MHz, but it was released at only 90MHz). The Voodoo2 was still the premier gaming card, and formed the ultimate 2D/3D setup when coupled with the Matrox Millennium.

Can't argue.

The TNT2 was Nvidia's first truly great card, but it had solid competition from the G400Max (best image quality, though scarcely available/late), the Rage128 (fast 32-bit), and the Voodoo3 (best 16-bit performance/quality in current games, and the Glide advantage). My first Nvidia card was the TNT2 Ultra from Guillemot (whatever happened to them?).

Yeah, nvidia's first truly great card was definitely the TNT2. I should point out that the TNT2 had better 32-bit performance than the Rage128.

Geforce 256 - The first version with SDR was a huge disappointment.

Good thing the DDR version was out 3 months later ;)

Geforce2 - amazing performance, but it sacrificed image quality, and had crappy 2D quality.

gf2 didn't have any worse image quality or 2d quality than any of nvidia's previous cards.

Geforce3 - fantastic new technology, but suffered severe driver problems for several months. By now Nvidia's dominance was cemented.

What driver problems? I owned a gf3 from the very beginning and experienced no driver problems.


Basically, what I have seen and heard leads me to believe that every single product nvidia churned out was better in many regards than their previous products, the gfFX lineup excepted.

extreme_dB
07-07-03, 09:42 AM
For the TNT2 comments, I was describing the strengths of the competition. The TNT2 was the perfect balance of all those aspects.

The GF2 had noticeably worse 2D/3D IQ than other cards at the time (Radeon, Voodoo 5500). I liked my Guillemot TNT2's quality much better than my Creative GF2's, so I didn't mention 2D as a flaw for the TNT2. :)

I remember the original GF3 having lots of IQ and stability issues for a while when it was launched. Old reviews should mention that (sorry, don't feel like searching right now). But I never owned one so I'm only parroting what I read.

And that's history from my own viewpoint. :)

reever2
07-07-03, 11:11 AM
Originally posted by StealthHawk
gf2 didn't have any worse image quality or 2d quality than any of nvidia's previous cards.


I beg to differ. My gf2MX had horrible 2d quality and speed compared to my voodoo2, TNT, and even my onboard video. It was even worse when it came to ASCII things and DOS windows; I could actually see the background being drawn 4 lines at a time! Not to mention the 2-3 second pause when moving my selection in DOS menus.

StealthHawk
07-07-03, 07:54 PM
Originally posted by reever2
I beg to differ. My gf2MX had horrible 2d quality and speed compared to my voodoo2, TNT, and even my onboard video. It was even worse when it came to ASCII things and DOS windows; I could actually see the background being drawn 4 lines at a time! Not to mention the 2-3 second pause when moving my selection in DOS menus.

Voodoo2 has absolutely nothing to do with 2d; it was a 3d add-in card.

Also, just because your particular gf2mx had poor 2d quality does not mean that ALL cards had that problem. The 2d quality is the direct consequence of the filters used by the card manufacturer; if a few chose to use cheap filters, then 2d quality suffers. I have not heard that ALL gf2 manufacturers used junk filters. Who knows, maybe they did. I would expect more gf2mx's to have junk filters because they are budget cards, so manufacturers skimping on costs makes more sense there. Regardless, my point is that it is not nvidia's fault - 2d quality was entirely up to the board manufacturer.

StealthHawk
07-07-03, 07:59 PM
Originally posted by extreme_dB
I remember the original GF3 having lots of IQ and stability issues for a while when it was launched. Old reviews should mention that (sorry, don't feel like searching right now). But I never owned one so I'm only parroting what I read.

I only remember Anand saying that in his gf3 preview. Like I said, I don't remember hearing anyone complaining about IQ or stability in forums or in reviews, and I didn't experience any problems either.

extreme_dB
07-07-03, 08:06 PM
Was nvnews around when the GF3 was introduced? I can't remember when I first knew about this site.

StealthHawk
07-07-03, 08:25 PM
Originally posted by extreme_dB
Was nvnews around when the GF3 was introduced? I can't remember when I first knew about this site.

nV News has been around since at least 1999, although I didn't know of the site back then ;)

Wester547
07-07-03, 09:53 PM
OK, so maybe I'm a bit inaccurate. But at least I pointed out the major points of NVIDIA's 3D graphics accelerators, not every single detail. And is the Radeon 9500 Pro also fully DX9 compatible? (Just wondering, because one of my machines has the card installed: very inspiring performance, very good image quality, the only problem is ATI's driver issues.) And I have a variety of systems, which are:

1st one:

Windows 98, First Edition
Intel Pentium II, 350MHz
288MB of PC100 SDRAM
NVIDIA GeForce 2 MX 400, 64MB VRAM
Sonic Impact A3D audio
One 8GB HD, one 16GB HD
DirectX 8.1b (have probs with 9 on this machine)
56k modem, soon broadband
ETC

2nd one:

Windows 98, Second Edition
Intel Pentium III, 1GHz
512MB of PC700 RDRAM
NVIDIA GeForce 2 GTS/Pro, 64MB DDR VRAM
Sound Blaster Live! Value audio
One 40GB HD, one 35GB HD
DirectX 8.1b (afraid of trying DX9)
56k modem, soon broadband
ETC

3rd one:

Windows XP, Home Edition
Intel Pentium 4, 1.8GHz
512MB of PC800 RDRAM
NVIDIA GeForce 3 Ti 500, 64MB DDR VRAM
Sound Blaster Live! 5.1 3D sound audio
Two 40GB HDs
DirectX 9
56k modem, soon broadband
ETC

4th one:

Windows XP, Professional Edition
Intel Pentium 4, 2.66GHz, 533MHz bus
1GB of PC1066 RDRAM
ATI Radeon 9500 Pro, 128MB DDR VRAM (upgraded from an NVIDIA GeForce 4 Ti 4200, 64MB DDR VRAM)
Sound Blaster Audigy Gamer audio
Two 120GB HDs
DirectX 9.0a
56k modem, soon broadband
ETC


Not too bad, eh???

Geforce4ti4200
07-07-03, 11:50 PM
riva128 = very fast, but 16-bit only and dithers quite badly. almost as fast as a tnt!

tnt thru tnt2u = great cards. 16-bit looked like 32-bit, at least for me, that's how good they were.

geforce sdr = slower than a tnt2u in many games, couldn't OC, overheated, crap.

Skuzzy
07-08-03, 07:22 AM
My bad - the 9500 and 9600 offer the same level of DX9 features as the 9700 and 9800 products.

StealthHawk
07-08-03, 07:54 AM
Originally posted by Wester547
And is the radeon 9500 pro also fully dx9 compatible?

No card out is fully DX9 compliant. All the R3xx cards (9500, 9600, 9700, 9800) are more DX9 "compatible" than any of nvidia's cards. It should be pointed out that while nvidia hardware may support many DX9 features, a lot of them aren't enabled in the drivers yet.

jAkUp
07-08-03, 11:02 AM
did diamond multimedia ever make an nvidia card??? they made a few 3dfx cards, and they were my favorite manufacturer.

Hanners
07-08-03, 11:13 AM
Originally posted by jAkUp
did diamond multimedia ever make an nvidia card??? they made a few 3dfx cards, and they were my favorite manufacturer.

Yes, they did make TNT-based cards for a while if I remember correctly.

StealthHawk
07-08-03, 07:51 PM
Originally posted by jAkUp
did diamond multimedia ever make an nvidia card??? they made a few 3dfx cards, and they were my favorite manufacturer.

They made Riva128, TNT, and TNT2 cards. I don't know if they made any nvidia cards after that; they started switching to S3 cards, and eventually were bought out by S3.

Also, I don't remember who made the Edge3d card (NV1). It might have been Diamond, but I really don't remember.

CaptNKILL
07-08-03, 08:44 PM
One small correction:
"1999-NVIDIA GeForce 256:

.....It also featured DDR-SDRAM, and up to 32MB, with up to 22 million transistors, and a 200mhz DDR-SDRAM core, with a fast core speed...."

The GeForce 256 could have up to 64MB of RAM. I remember going to EB and drooling over the 64MB GeForce 256 boxes that said DDR in big huge letters.

Heh, that actually MEANT something back then. Now there is 64-bit DDR, which is the same speed as 128-bit SDR... but they still say "DDR" on the box. Why do they always try to cheat the customers?
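
Here's a quick back-of-the-envelope sketch of why those two configurations come out identical; the 166MHz clock is just an illustrative number, not from any particular card:

#include <cstdio>

// Peak memory bandwidth in MB/s:
// (bus width in bits / 8 bits per byte) * memory clock in MHz * transfers per clock
static double bandwidth(int bus_bits, double clock_mhz, int transfers_per_clock) {
    return (bus_bits / 8.0) * clock_mhz * transfers_per_clock;
}

int main() {
    // Hypothetical 166MHz memory clock for both configurations
    printf("64-bit DDR:  %.0f MB/s\n", bandwidth(64, 166.0, 2));  // 2656 MB/s
    printf("128-bit SDR: %.0f MB/s\n", bandwidth(128, 166.0, 1)); // 2656 MB/s
    return 0;
}

Same bandwidth either way, but "DDR" looks better on the box.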

Wester547
07-08-03, 11:21 PM
How do I change the text (e.g. "RIVA 128", "GEFORCE") below my name in the forums?

The Baron
07-08-03, 11:22 PM
You post enough, you can change it. ;)