View Full Version : GeForce FX: Will it finally live up to the hype?


Falkentyne
12-18-02, 09:58 AM
The hype machine never dies....

Some of you may remember the TNT hype machine, when the card was billed as a Voodoo2 SLI killer. The card actually turned out to be about as fast as a single V2 (more in some instances, less in others, but never even close to SLI). The TNT2 actually had the ORIGINAL specs the TNT was supposed to have; that's why the TNT was labeled an SLI killer before its release, and it would have been one if it had shipped with those specs.

And then there was the GF3.
Labeled as the next generation of graphics technology....
Some Nvidia rep in an interview, and maybe a major game developer, said that the GF3 would have an impact on the graphics industry as significant as anything since the introduction of the original 3dfx Voodoo1 card. <-----that's a rather big claim.

Then the GF3 came out. Not only was it sometimes slower than a GF2U with the first iteration of drivers, but it did nothing for current games until later drivers improved its bandwidth savings. And games that used the GF3's unique features (vertex/pixel shaders) did not even come out until close to the GF4 launch. (Aquanox is an exception, but that was not what you would call a "game"; it was a great benchmarking tool, though.)

But where was the GF3 impact so significant that it matched the Voodoo's?

Let's see...
The Voodoo brought a proprietary and unique API (Glide) that was EASY to code for and fast, also supported D3D, finally became a *STANDARD*, let you keep a second card that was good at 2D, and finally allowed the first generation of accelerated games to be played at 30+ FPS, mipmapped, bilinear filtered (some previous 3D cards did not even support bilinear), and perspective corrected. And um....I think multitexturing wasn't available until the Voodoo2 and its two TMUs....

I'm not a video card guru so I don't remember all the features, though.

Quake2 was the first game that really brought the Voodoo1 to its knees, and Unreal was the real nail in the coffin. Sweeney had to make a lot of compromises just to get Unreal running adequately on a Voodoo2, which was the best card out at the time :(

Well, without getting off subject,
back to GF3...

So, what "era" did the GF3 usher in? What significant impact did it have?
2048x2048 (or was it 4096x4096) textures? That was done on the GF256
More speed? Every Nvidia card has had a speed boost over the previous one, given the proper drivers.
T&L? Done on GF256

The only truly new thing was the LMA, which evolved even further in the GF4 and helped ease the bandwidth burden, but I don't consider that a truly wondrous innovation...it's not like the framerate doubled because of it :)

And of course, the pixel and vertex shaders, which added even more detail to the polys. But for something so innovative, it took a long time for games to come out that actually used these features. So how was this as "big" as the first mainstream accelerator? (And no, the *cough* S3 ViRGE *cough*, that famous "graphics IMPROVER"/decelerator, doesn't count........)

Most of what the average gamer could notice in the GF3/GF4 was the SPEED BOOST with real hardware antialiasing (although a GF3 was STILL too slow to be playable at 1024x768@32 w/4x in some recent games; even Quake3 was not as fast as you might want on some levels), combined with anisotropic filtering. The faster speed made more of a difference with these cards, IMO, than any handful of games with vertex and pixel shaders. But that is exactly what you expect from new hardware... it isn't a "3dfx acceleration" level of jump; speed goes up with every new generation of hardware, step by step. Just compare the R9700 to the 8500.

I'm still trying to figure out where or what the "Voodoo" industry effect of the GF3 is. If anything, the GeForce FX/R350 will be MORE of an innovation over the previous generation than the GF3 was over the GF2U that preceded it.

Now, someone could say that the GF3 took the first pioneering steps toward seriously improving graphics in games, and that this has been taken to the next level with the R9700 and GF FX. But where was the card's large, immediate effect?

Anyone agree?

That's like saying the Pentium Pro made a huge impact on the computing world because an enhanced PPro core was used in the P2 and the even stronger P3...(when the PPro was notoriously bad at running 16-bit apps)....

SurfMonkey
12-18-02, 10:03 AM
At the end of the day, which series of cards gripped the public's imagination (and wallets) and is still one of the strongest brand names in the gfx card business? Does it really need to live up to the hype? Apart from among the true enthusiasts and hardcore gamers, I don't think so.

Falkentyne
12-18-02, 10:18 AM
What does that have to do with the main point of what I said?

This isn't about brand name recognition!

How does the impact of the GeForce 3's innovations compare to the first MAINSTREAM graphics accelerator that was truly an accelerator, the card that helped make the market a real market?

My point was about hyping something beyond what it truly is.
The GF3 in no way lived up to the hype....and it was hyped up like MAD, long before its release. I never said it wasn't a good card. It was just hyped up to be more than what it really was.

Now, if the cycle had gone directly from a GF2U to a Ti 4600, THEN sure, the impact would have been legendary: going from totally unusable FSAA to 1280x1024 4x FSAA (or at least 1024x768 4x FSAA plus some aniso in everything except UT2003 and maybe BF1942), truly bowel-loosening framerates, plus the vertex and pixel shaders :)

Please don't accuse me of being a "3dfx troll" (which, by the tone of your post, almost sounds like what you're doing :)) before you carefully read what I'm saying. I wasn't praising 3dfx; I was questioning the Nvidia hype machine, not the GF3 itself.

Oh, and for the record, the PowerVR was a SUPERIOR chip to anything 3dfx had out...it just had terrible support and was hard to code for, but was way ahead of its time.

Call me Critical Bill, but I'm critical of hype that doesn't live up to its claims.

Bigus Dickus
12-18-02, 10:50 AM
I'd say that the R300 lived up to the hype. It offers the kind of speed increase that hasn't been seen since the Voodoo2 SLI days.

Had GFFX been out first, it would have lived up to the hype. Now, it's just a "me too" part, as good as it might be.

SurfMonkey
12-18-02, 10:50 AM
I'd go with the PVR statement, I had an Apocalypse 3DX for a while :D

I just think it's hard to judge where the technical advances end and the pure PR bull begins. We are still waiting for games tech to catch up with the hardware, mainly because it takes time to implement the changes. We'll probably just be starting to play DX8.1 games by the time DX9 cards are mainstream.

Of course, that may change; it is now easier than ever to keep up with changes in tech, and tools like Cg will make implementing this stuff tons easier.
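For anyone who hasn't played with it yet, here's roughly what a minimal Cg vertex program looks like. This is just a sketch along the lines of NVIDIA's tutorial material; the identifiers are illustrative, not from any shipping title.

struct VertOut {
    float4 position : POSITION;
    float4 color    : COLOR;
};

// Transform one vertex to clip space and pass its color through.
VertOut main(float4 position : POSITION,
             float4 color    : COLOR,
             uniform float4x4 modelViewProj)   // illustrative parameter name
{
    VertOut OUT;
    OUT.position = mul(modelViewProj, position);  // one line of C-like code...
    OUT.color    = color;                         // ...instead of hand-scheduled vs.1.1 asm
    return OUT;
}

Compare that to writing and scheduling DX8 vertex shader assembly by hand, and you can see why development should get easier.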

And has any card ever lived up to the hype? People will say the R300 now, but if it had come out after the FX, would it have been as amazing? Probably not.

With the FX we'll still have to wait and see. I think for gamers it will be better than the R300-350 at some things and poorer at others. It may scale better in the long run.

For the offline rendering crowd it's going to be a godsend, and that's a pretty profitable market for nV to get into.

At the end of the day you're always going to get hyped, and there's not much use complaining about it. Just enjoy the pretty pictures, revel in the cheap prices and good games, and praise the hype god; in the end, only the gullible and the PR people actually believe any of it... ;)

Falkentyne
12-18-02, 11:15 AM
Yeah, the R300 is truly an amazing piece of silicon. I wouldn't have thought it was that great until I saw that I get MOUSE LAG on my Ti 4600 in UT2003 @1024x768 on the DM-Inferno map. (I never get any mouse lag on that other so-called most intensive map, DM-Serpentine, although that is definitely the most CPU-intensive map, with all the models and stuff flying around and the narrow quarters, especially with more than 6 people...)

The R9700 probably at least doubles the framerate of a Ti 4600 on DM-Inferno...

The reason I haven't bitten is that (1) I only upgrade every other generation (unless I see a really good deal or a price typo and can sell this card for a bunch of money), and (2) I'm VERY skeptical about ATI's driver quality....and not being able to use FSAA in 16-bit modes is terrible for old 640x480-era D3D titles...

If it weren't for the drivers (and, of course, the bugs in the actual hardware; Revision 3 (1.3?) is required to fix the "stuttering" problems), I might have considered grabbing one.

Chalnoth
12-18-02, 12:36 PM
Originally posted by Falkentyne
I'm still trying to figure out where or what the "Voodoo" industry effect of the GF3 is. If anything, the GeForce FX/R350 will be MORE of an innovation over the previous generation than the GF3 was over the GF2U that preceded it.

Now, someone could say that the GF3 took the first pioneering steps toward seriously improving graphics in games, and that this has been taken to the next level with the R9700 and GF FX. But where was the card's large, immediate effect?

The primary effect that the Voodoo had in the video game world was that it paved the way for truly 3D-accelerated titles. By showing the general public, for the first time, that 3D acceleration could really be worlds better than software rendering, the original Voodoo ushered in a new generation of video games.

And the GeForce3? Well, it would have done the same thing, if nVidia had released a low-end NV2x card by now. Unfortunately, they haven't, so game developers have been hesitant to really code for the card. Since this looks to be changing with the NV3x cards, it certainly appears that it will be the NV3x series that ushers in a new generation of 3D games.

Still, the GeForce3 really did pave the way. It was the first card released with programmable pixel and vertex shaders. It really did begin a paradigm shift in the industry, shaping a new idea of how to render realtime 3D graphics, an idea that has continued to evolve.
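To make "programmable" concrete, here is a per-pixel lighting sketch in Cg syntax (Cg comes up earlier in this thread). Real GF3 pixel shaders were ps.1.1 and far more limited, so take this as an illustration of where the programmable model leads, not of what a GF3 literally runs; the names are illustrative.

// Sketch only: per-pixel diffuse (N dot L) lighting.
// The fixed-function pipeline computed lighting per vertex and
// interpolated it; a programmable pixel pipeline evaluates it per pixel.
float4 main(float3 normal : TEXCOORD0,
            uniform float3 lightDir,       // illustrative parameter names
            uniform float4 diffuseColor) : COLOR
{
    float ndotl = max(dot(normalize(normal), normalize(lightDir)), 0.0);
    return diffuseColor * ndotl;
}

Lighting stops being a mode you switch on and becomes a small program you write, and that is the paradigm shift.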

Bigus Dickus
12-18-02, 01:46 PM
Originally posted by Chalnoth
And the GeForce3? Well, it would have done the same thing, if nVidia had released a low-end NV2x card by now.

ATi did that for them. ;)

sancheuz
12-18-02, 02:09 PM
We don't want the card to provide the performance immediately; we want long-term performance, the assurance that games 2 or 3 years from now will still run quite smoothly on the GFFX. The GeForce FX will live up to that hype. As soon as developers start using Cg and more of the public buys good video cards, I assure you, the games we'll see on the PC will blow everything else away, including consoles.

Chalnoth
12-18-02, 07:23 PM
Originally posted by Bigus Dickus
ATi did that for them. ;)

Not until very recently, and not nearly soon enough to make any difference, given that nVidia is planning to make the NV31 and NV34 available by April (one of which should be a low-cost chip...rumored to be the NV34).

Steppy
12-18-02, 08:30 PM
Originally posted by Chalnoth
Not until very recently, and not nearly soon enough to make any difference, given that nVidia is planning to make the NV31 and NV34 available by April (one of which should be a low-cost chip...rumored to be the NV34).

Either way, DX8 came out in fall 2000. The GF3 came out in spring 2001. Not until early 2002 (with the Ti series and the 8500 dropping so low in price) was there a budget DX8 card, and even later if you go by retail pricing. Flash forward to DX9: BEFORE the API is even released, there are already budget DX9 cards on the market. DX9 SHOULD have nearly a two-year developmental lead over what DX8 had (mostly thanks to the BS GF4MX line still being DX7; that was a move even hardcore NV fans should have been peeved about, because it kept DX8 out of the mainstream longer than it needed to be). We'll be seeing DX9 titles in about the same amount of time as it took for budget DX8 cards just to be RELEASED.

Bigus Dickus
12-18-02, 09:14 PM
Originally posted by Chalnoth
Not until very recently, and not nearly soon enough to make any difference, given that nVidia is planning to make the NV31 and NV34 available by April (one of which should be a low-cost chip...rumored to be the NV34).

Yes, what ATi does never makes any difference. It doesn't matter if they release a budget DX8 card a year earlier, or a high-performance DX9 card half a year earlier, or a value DX9 card a few months earlier... it's only what nVidia does that actually makes any difference at all. After all, they are the only company anyone cares about, and buys products from.

StealthHawk
12-18-02, 09:34 PM
Originally posted by Bigus Dickus
Yes, what ATi does never makes any difference. It doesn't matter if they release a budget DX8 card a year earlier, or a high-performance DX9 card half a year earlier, or a value DX9 card a few months earlier... it's only what nVidia does that actually makes any difference at all. After all, they are the only company anyone cares about, and buys products from.

nvidia has more market share. what doesn't make sense about that? and as has been stated many times over, even by devout ATI defenders, devs have been coding for nvidia cards as their primary audience. therefore, even though you were being sarcastic, you're absolutely right when taken literally. it doesn't matter what ATI does until ATI replaces nvidia as the dominant brand that games are programmed for.

Bigus Dickus
12-18-02, 10:49 PM
Originally posted by StealthHawk
nvidia has more market share. what doesn't make sense about that? and as has been stated many times over, even by devout ATI defenders, devs have been coding for nvidia cards as their primary audience. therefore, even though you were being sarcastic, you're absolutely right when taken literally. it doesn't matter what ATI does until ATI replaces nvidia as the dominant brand that games are programmed for.

No, I don't think it's absolutely right when taken literally. This is a prime example of people trying to make an issue black and white when in reality it's a shade of grey. Yes, nVidia has more market share. Yes, developers code more for nVidia products. Is it a 95/5 market split? No. Is it 100% of development on nVidia platforms? No.

Is what nVidia does the only thing that matters, as Chalnoth implied previously and as you imply here, even though their market share lead is not nearly as substantial as some like to make it seem? Of course not. ATi might be behind in market share, but they are a strong competitor, and there are plenty of consumers out there buying the "irrelevant" ATi products.

That was my point. I thought the sarcasm would get that point through. Guess not.

StealthHawk
12-19-02, 01:09 AM
i agree to a point that ATI is not insignificant. but of course what ATI does alone isn't going to matter much. just as what nvidia decides to do alone isn't going to matter.

is having a high-end DX9 card out 3 months before DX9 itself going to make DX9 games come any sooner? no, not really. by the same token, is ATI having a low-end DX8 card going to usher in DX8 games for all? hell no, not while most people still own GF2 MXs and aren't upgrading yet.

AGP64
12-19-02, 01:31 AM
Originally posted by StealthHawk
i agree to a point that ATI is not insignificant. but of course what ATI does alone isn't going to matter much. just as what nvidia decides to do alone isn't going to matter.

is having a high-end DX9 card out 3 months before DX9 itself going to make DX9 games come any sooner? no, not really. by the same token, is ATI having a low-end DX8 card going to usher in DX8 games for all? hell no, not while most people still own GF2 MXs and aren't upgrading yet.


mmm. dunno

If I open a flyer from a computer store or large electronics chain (in NL), I see a lot more new computers shipping with the R9000 card instead of the GF4MX series. Clearly it will take some time, but I can already see a big difference compared with 4 to 6 months ago.

6 months ago, most computer retail stores sold only NVidia-based cards. Now I see roughly a 50/50 split. This will only increase between now and April (NV31 launch) and June (NV31 wide availability).

Also, ATi having almost no competition this Xmas will have an impact.

just my 0.02

Falkentyne
12-19-02, 03:29 AM
Originally posted by StealthHawk
is having a high-end DX9 card out 3 months before DX9 itself going to make DX9 games come any sooner? no, not really. by the same token, is ATI having a low-end DX8 card going to usher in DX8 games for all? hell no, not while most people still own GF2 MXs and aren't upgrading yet.

No, but having a high-end DX9 card will improve UT2003 framerates by as much as 75% on video-intensive maps like DM-Inferno and Antalus, compared to a Ti 4600.

I didn't realize just how serious this was until I started getting mouse lag at 1024x768@32 on my Ti 4600, with NO aniso and no FSAA, on DM-Inferno! I hadn't gotten mouse lag since trying to play games at 1600x1200 with FSAA, or since using my GF2 GTS at 1024x768 on a game that was recent as of the beginning of this year (2002), or on the occasional new title at 1024x768 with 4x FSAA.

If you are a diehard UT2k3 player and don't care too much about older stuff, there's no reason why you should NOT have an R9700, provided it's revision 1.3. It's THAT good in UT (though I'm waiting for the NV30 or NV35).

Mouse lag is a sign of a highly stressed video card: if you were heavily CPU limited but not bandwidth limited, your framerate could drop into the teens and you wouldn't get any mouse lag whatsoever. But then why don't I get mouse lag on my 3dfx cards?

netape
12-19-02, 07:03 AM
Originally posted by AGP64
April (NV31 launch) and June (NV31 wide availability).

Where did you get that June wide availability? IMO, they "will" have a budget card widely available before June; at least I would if I were them. Budget cards = money. What's the use of the GFFX for Nvidia if they can't sell any? :confused: (High-end cards are 9% of the market.) Just my 2 cents...

EDIT: typos and all...

AGP64
12-19-02, 08:42 AM
Originally posted by netape
Where did you get that June wide availability? IMO, they "will" have a budget card widely available before June; at least I would if I were them. Budget cards = money. What's the use of the GFFX for Nvidia if they can't sell any? :confused: (High-end cards are 9% of the market.) Just my 2 cents...

EDIT: typos and all...

I assume June based on two points:

1) I live in Europe, which always sees the cards later than the US :mad:
2) Earlier Nvidia and Ati product launches have shown two to three months between launch and wide availability

netape
12-19-02, 09:18 AM
Originally posted by AGP64
2) Earlier Nvidia and Ati product launches have shown two to three months between launch and wide availability

But now Nvidia is playing catch-up. They need to hurry up with that NV31 (*cough* Radeon 9500 (pro) *cough*) or else they will lose a lot of customers (including me); we don't want to wait forever... ;) :D

AGP64
12-19-02, 09:40 AM
Originally posted by netape
But now Nvidia is playing catch-up. They need to hurry up with that NV31 (*cough* Radeon 9500 (pro) *cough*) or else they will lose a lot of customers (including me); we don't want to wait forever... ;) :D

Then why wait any longer, if you can get your hands on an R9500 Pro / R9700 non-pro now! ;)

I am a happy camper instead of an (un)happy waiter :D

netape
12-19-02, 09:50 AM
Originally posted by AGP64
I am a happy camper instead of an (un)happy waiter :D

But they will get cheaper when the GFFXs are on the shelves :D And on the other hand, you can see how the GFFXs perform :D I'm a happy waiter :D

Pixel Pop
12-20-02, 04:58 AM
Falkentyne says "The hype machine never dies...."

I reckon you should go on a long holiday somewhere with no billboards, TVs, magazines, newspapers, internet access, or movie theatres; nothing that could tell you about the great features of the GeForce Super FX and how many fps it can do in Doom 6.

See if you care about the hype machine then.

You're like a child that cries because its ice cream isn't sweet enough.

:)

vvolkman
12-20-02, 08:01 AM
Originally posted by Pixel Pop
Falkentyne says "The hype machine never dies...."

See if you care about the hype machine then.

You're like a child that cries because its ice cream isn't sweet enough.

:)

We have a strange mix of people here: game developers, fanboys, CAD administrators, game junkies, and so on. So you can't necessarily ascribe a single motive to any one person on the forum(s).

What little I know about HCI (Human-Computer Interaction) studies tells me that people only viscerally notice an upgrade when latency is halved (i.e., speed is doubled). Below that you get the "ho-hum" reaction, the not-quite-sure-how-much-better-it-is feeling. I have to laugh when I read comments from people who are excited because a driver upgrade got them 100 more points on a 3DMark score of around 10,000: a 1% improvement. If you have to read the FPS meter to see if it's "better", it ain't really that much better.

Getting back to the thread: unless there is a 2X increase over whatever the average legacy card out there is (and what that average is, I'm sure, is subject to a lot of debate), this will be another Yawn of a New Era. Or maybe we'll all be watching Final Fantasy rendered in realtime on our screens....

Happy Holo-days.

Unique_user_id
12-21-02, 12:49 AM
from what i understand, a 5-10% increase in speed/frames per second isn't noticeable unless you are a hardcore gamer.
so, like the previous poster said, if it's not 2x, then yawn, or something like that.


still waiting for per-pixel bump-mapped ASCII acceleration.