View Full Version : GeForce FX - disappointing performance

LuminousFriend
11-19-02, 09:41 PM
This is disappointing.



http://www.tech-report.com/etc/2002q4/geforce-fx/index.x?pg=1

Smokey
11-19-02, 11:01 PM
What exactly are you basing this on? Benchmarks? Beta drivers? The fact is that we don't know the performance of this card yet, do we? I myself will not be so quick to judge, and if you're looking for games that will take advantage of it, just go look at the launch stream on Nvidia's website. One game that stood out was EverQuest 2. Even though I'm not going to be playing that game, the player and creature models are incredible, even looking as good as Nvidia's own demos. Sony Online also said that even with the models as good as they are, they expect 30-50 models on screen at a time without slowdowns. The other games that I expect to be up there with the best looking will be Doom III, which will have options that take advantage of the GF-FX, as JC has already stated, and Deus Ex 2, which will also be full of shadows, bump mapping and the like, and may also come with some extra options for the GF-FX. Yes, this card will be fast with today's games; one benchmark was Q3 at 2048 running at 171 FPS. I don't know why people won't be happy with this card even if it's only slightly faster than the 9700? The 9700 was a big step over the GF4, and if this card is 2.5-3 times faster than the GF4, I think that is quite amazing, don't you?

LuminousFriend
11-19-02, 11:27 PM
Originally posted by Smokey
I don't know why people won't be happy with this card even if it's only slightly faster than the 9700?


By the time it's released, ATI will have something better to offer (the R350 or a souped-up version of the 9700).

Nvidia is one generation behind and it will stay behind, at least for the foreseeable future.

Things can turn around quickly in this industry, though. Just look at ATI now.

volt
11-19-02, 11:30 PM
Originally posted by LuminousFriend
By the time it's released, ATI will have something better to offer (the R350 or a souped-up version of the 9700).

Nvidia is one generation behind and it will stay behind, at least for the foreseeable future.

Things can turn around quickly in this industry, though. Just look at ATI now.

OMG, that is so funny :rolleyes:

Megatron
11-19-02, 11:30 PM
Hahaha...slapped a "dustbuster" on the card..lol...

LuminousFriend
11-19-02, 11:35 PM
Originally posted by volt
OMG, that is so funny


If I am so obviously wrong, it should be easy for you to prove it.


I am waiting.

AngelGraves13
11-19-02, 11:36 PM
What is it with people thinking their graphics cards are gonna be trash in a few months? A GeForce 4 will last you another 2 years, so anything better than this will last you maybe 3 to 4 if you really want to push it.

netviper13
11-19-02, 11:51 PM
Originally posted by LuminousFriend
If I am so obviously wrong, it should be easy for you to prove it.


I am waiting.

Hmm...

.13 micron + fairly small die size = room to expand

128bit bus = easy expansion room

and other architectural advancements. .15 micron is at its limits, at least without making the core massive (even more massive than the 9700's); that's why nVidia is going to pull out ahead. They have already made that switch, and now they can cruise along with evolutionary speed improvements and such.

That's the way the industry works: innovation followed by evolution for 3-4 years, then innovation again followed by more evolution. NV30 is the current innovation.

Kruno
11-19-02, 11:52 PM
A GeForce 3 should last you for a while yet.
Nvidia is currently behind in features, IMO. I would pick up an R300 rather than an NV30 any day (if both were offered for free).
The R300 has pixel shaders that improve porno movies, which IMO is more important than extra speed I will never use, considering all the games I play already run at maximum frame rate with vsync on.

AngelGraves13
11-19-02, 11:54 PM
(shush) we have kids here.......keep your porn to yourself. :D What movie is it by the way??

Kruno
11-19-02, 11:56 PM
John Carmack, hot and spicy videos. ;) :D

AngelGraves13
11-19-02, 11:57 PM
oh yeah....I can imagine you listening to his nerdy voice and getting turned on.......lol......

netviper13
11-19-02, 11:59 PM
oh baby, oh baby, lol. :D

Phyre
11-20-02, 12:08 AM
A lot of people are perplexed by the heatsink taking up a PCI slot. Correct me if I'm wrong, but don't the AGP slot and PCI slot 1 share an IRQ on most motherboards? If so, I have no problem giving PCI slot 1 to a cooling system.

Phyre

StealthHawk
11-20-02, 12:08 AM
Originally posted by LuminousFriend
By the time it's released, ATI will have something better to offer (the R350 or a souped-up version of the 9700).

Nvidia is one generation behind and it will stay behind, at least for the foreseeable future.

Things can turn around quickly in this industry, though. Just look at ATI now.

if things can turn around so quickly, as you put it, what makes you think nvidia will stay behind for such a long time?

you're using the most absurd logic i've ever heard. first of all, you don't know when the R350 will be released. second of all, you don't know when the NV35 will be released. there's no guarantee that the NV35 will reign for any amount of time, and there's no way to know the R350 will reign when it's released either.

as for the next generation, with the R400 and NV40 it will be a whole new ball game, with no way to foresee who will come out on top.

StealthHawk
11-20-02, 12:09 AM
Originally posted by 99 to Life
what is it with people thinking their graphics cards are gonna be trash in a few months.....a Geforce 4 will last you another 2 years. So anything better than this will last you maybe 3 to 4 if you really want to push it.

if you don't use AF or FSAA maybe it will last you another 2 years. Doom 3 won't be very playable with 4x FSAA and 8x AF.

Chalnoth
11-20-02, 12:12 AM
Or IF the R350 will be released at all.

ATI hasn't yet released a true refresh part. Based on past history, ATI will probably release little more than a variant of the R300 with, say, DDR2 on a 128-bit bus instead.

Bigus Dickus
11-20-02, 12:19 AM
Originally posted by Smokey
What exactly are you basing this on? Benchmarks? Beta drivers?

Features.

NV30 was hyped to have a 256-bit bus by a lot of people.
8x2 by almost everyone.

128-bit and 8x1 is more in line with what I suspected, but it's hardly exciting to be right.

So what else is disappointing?

Lack of 16X AF. :rolleyes:
OGMS for 8X AA. :rolleyes:
Probably lack of displacement mapping. :rolleyes:

Sugar-coat it all you want, but the NV30 isn't a revolutionary product by any stretch of the imagination. Those who think that a process change = revolutionary or innovative are delusional. Did you think the Via C3 was a revolutionary processor, or just the same POS on a shrunken die? Right.

Process is just a means to an end. ATi chose a different means, but achieved the same ends half a year sooner. That is innovative. And you'd better believe that they are working on .13u pretty hard right now. Personally, I don't expect to see a .13u ATi part until late summer or early fall next year, but that will be in plenty of time to compete with the NV35. It looks like a DDR-II equipped R300 will be plenty competitive with the NV30 for now.

Bigus Dickus
11-20-02, 12:23 AM
Originally posted by Chalnoth
Or IF the R350 will be released.

ATI hasn't yet released a true refresh part. Based upon past history, ATI will probably release little more than a variant of the R300 with, say, DDR2 on a 128-bit bus instead.

Why a 128-bit bus? That makes no sense. Why would they take a step back in bandwidth for a refresh? Wouldn't that eliminate the 9500 Pro line? The core is already designed with a 256-bit memory controller... I suppose they would just keep producing the same core and for some reason disable half the memory controller in their flagship product?

Just keeping you honest there. ;) I know you were only saying that was a possibility. And I'm only saying it isn't a logical one. :)

Other than that, I agree, a true refresh (core improvement) this spring is unlikely. I'm undecided though whether we'll see an R350 in the summer/early fall, or the R400 in late summer/fall. It kind of hangs on the progress of DX10 at this point, and on whether ATi is determined to make the R400 a DX10 part or just a "super" DX9 part.

Chalnoth
11-20-02, 12:25 AM
And why would the R300 use a 256-bit bus in conjunction with DDR2? Such a thing would almost certainly require a significantly faster core speed as well to receive any benefit.

That is, the extra memory bandwidth is only useful if the memory controller(s) can actually make use of that much bandwidth.

So, if the R300 has a total of 512 bits in its memory controllers (which seems very likely, given the specs), it can't make use of memory clocked higher than the core speed. At least, it can't unless it's going to be on a 128-bit bus.
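Chalnoth's ceiling argument is easy to check with a little arithmetic. Here is a minimal Python sketch, assuming (as he does) memory controllers that consume 512 bits per core clock; the clock figures are illustrative round numbers, not official specs:

```python
def bus_bandwidth_gbps(bus_bits, mem_clock_mhz, ddr=True):
    """Raw memory bandwidth in GB/s (DDR transfers twice per clock)."""
    bits_per_clock = bus_bits * (2 if ddr else 1)
    return bits_per_clock / 8 * mem_clock_mhz * 1e6 / 1e9

def usable_bandwidth_gbps(bus_bits, mem_clock_mhz, core_clock_mhz,
                          controller_bits_per_core_clock=512):
    """Bandwidth the core can actually consume, capped by its controllers."""
    raw = bus_bandwidth_gbps(bus_bits, mem_clock_mhz)
    cap = controller_bits_per_core_clock / 8 * core_clock_mhz * 1e6 / 1e9
    return min(raw, cap)

# 325 MHz core, 256-bit DDR bus: memory at the core clock already
# saturates the controllers...
print(usable_bandwidth_gbps(256, 325, 325))  # ~20.8 GB/s
# ...so 500 MHz DDR-II on the same 256-bit bus buys nothing:
print(usable_bandwidth_gbps(256, 500, 325))  # still ~20.8 GB/s
# On a 128-bit bus, faster memory does help (raw stays below the cap):
print(usable_bandwidth_gbps(128, 500, 325))  # ~16.0 GB/s
```

Which is the point: on a 256-bit bus, DDR-II's higher clocks are wasted unless the core clock rises with them.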

Smokey
11-20-02, 12:27 AM
Originally posted by LuminousFriend
By the time it's released, ATI will have something better to offer (the R350 or a souped-up version of the 9700).

Nvidia is one generation behind and it will stay behind, at least for the foreseeable future.

Things can turn around quickly in this industry, though. Just look at ATI now.

Let's not forget that when ATI brought out the 8500 it wasn't faster than the GF3; Nvidia released the Ti500 just to be sure. Also, wasn't the GF3 the first card to use .15 micron? Nvidia aren't behind at all, they just didn't release a speed-up of the GF4. The GF3 came out in March/April? Launched in February? ATI will launch the R350 in the first half of next year, then you'll wait at least another month for it to hit shops. If they are moving to .13 micron, they will need to redesign the whole chip. The only thing I see coming from ATI is a speed-up of the 9700, with DDR-II? And even though ATI have shown a 9700 running DDR-II, I did read that it doesn't run at full spec because it can't. Maybe it can, maybe it can't? (http://www.beyond3d.com/index.php#news3173)

I would expect to see ATI's next card around the same time as the 9700 came out, summertime!

Bigus Dickus
11-20-02, 12:29 AM
Originally posted by netviper13
Hmm...

.13 micron + fairly small die size = room to expand

External power (and a hefty draw at that) + Naquada-powered hoover cooling = not much room for the NV30 core to expand (speed-wise).

128bit bus = easy expansion room

128-bit bus = pin grid designed for a 128-bit interface = major design change to go to a 256-bit bus.

R300 + DDR-II ready memory controller + current DDR memory = easy expansion room. :)

I know, I know; you are really saying that with the .13u process down, nVidia's future cores have plenty of room to grow, and that they can add a 256-bit memory bus to future cores.

I think it's absurd though to think that ATi won't make the move to .13u in time to compete with the next nVidia core (late next year). Meanwhile, the NV30 "Ultra" is probably all we'll see for the better part of this year from nVidia, while ATi will probably offer a refresh of the R300 at least (if only in the form of DDR-II memory and higher core speeds).

Bigus Dickus
11-20-02, 12:32 AM
Originally posted by Smokey
Lets not forget that when ATI brought out the 8500 it wasnt faster than the GF3

It was faster than the GF3 on average, IIRC, at least when it finally became available.

Nvidia arent behind at all, they just didnt release a speed up of the GF4.

They did - NV28, which is right in line with their past three fall "refreshes."

StealthHawk
11-20-02, 12:35 AM
Originally posted by Bigus Dickus
They did - NV28, which is right in line with their past three fall "refreshes."

NV28 wasn't a "speed up." there was no gf4Ultra model that increased performance beyond a gf4Ti4600. we haven't even seen nvidia launch a gf4Ti4600 8x yet.

Bigus Dickus
11-20-02, 12:38 AM
Originally posted by Chalnoth
And why would the R300 use a 256-bit bus in conjunction with DDR2? Such a thing would almost certainly require a significantly faster core speed as well to receive any benefit.

That is, the extra memory bandwidth is only useful if the memory controller(s) can actually make use of that much bandwidth.

So, if the R300 has a total of 512 bits in its memory controllers (which seems very likely, given the specs), it can't make use of memory clocked higher than the core speed. At least, it can't unless it's going to be on a 128-bit bus.

I wouldn't expect ATi to release a part with 325 MHz core and 500MHz DDR-II memory on a 256 bit bus. I might expect something more like 400/400 or 425/425, depending on what they can squeeze out of the core and how big of a heatsink they are willing to use. A 30% speed increase across the board would not be trivial.

But a 400 MHz core with 1 GHz DDR-II on a 128-bit bus makes no sense. Less bandwidth and a higher core speed than the current 9700? The obvious question is "why?"
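The bandwidth arithmetic behind that objection, as a quick Python check (using the commonly cited 310 MHz DDR memory clock of the shipping 9700 Pro; treat the figures as illustrative):

```python
def ddr_bandwidth_gbps(bus_bits, mem_clock_mhz):
    """Raw DDR bandwidth in GB/s: two transfers per memory clock."""
    return bus_bits * 2 / 8 * mem_clock_mhz * 1e6 / 1e9

current_9700_pro = ddr_bandwidth_gbps(256, 310)      # ~19.8 GB/s
hypothetical_refresh = ddr_bandwidth_gbps(128, 500)  # 128-bit "1 GHz" DDR-II: 16.0 GB/s

# The hypothetical part would ship with *less* bandwidth than the card
# it replaces, despite the exotic memory:
print(current_9700_pro, hypothetical_refresh)
```

So a 128-bit DDR-II refresh would need well over 620 MHz (1.24 GHz effective) memory just to match the existing 9700 Pro.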