
View Full Version : NV34: Clarification on Intellisample support


Uttar
04-09-03, 08:57 AM
Sounds like the NV34's Intellisample support isn't *so* bad.
It supports everything in Intellisample except Color & Z compression. This would imply it supports gamma correction, fast Z/Color clear, adaptive AF, and everything in LMA II (except Z compression, of course).

Just thought you'd like to know that little clarification :) I haven't seen any review confirm the gamma correction part.
IMO, it's quite good news: according to rumors (and this part isn't official), the NV34 is slated not to be replaced soon, unlike the NV30/NV31 - so its quality is quite important, really.
For $79, the 5200 Regular seems like quite a good deal... :)

And no, I wasn't paid by nVidia to say that, it's really my opinion.


Uttar

Unit01
04-09-03, 09:28 AM
I'm not so sure the 5200 is such a good deal :/
Will the price really be $79 - as in what we pay as buyers, not the reseller price?

Myrmecophagavir
04-09-03, 09:57 AM
I heard it wouldn't support the xS antialiasing modes? Not that those would be usable anyway. But is that correct?

Uttar
04-09-03, 10:16 AM
Originally posted by Myrmecophagavir
I heard it wouldn't support the xS antialiasing modes? Not that those would be usable anyway. But is that correct?

AFAIK, the FX 5200 does support the xS modes - but you're correct they probably aren't usable anyway.

As for the $79 price...
http://www.pricewatch.com/1/37/5113-1.htm

There's already a 64MB 5200 Regular for $81 - and it's barely available yet. I'd guess prices are going to drop a little more.
The $79 price is for the bare minimum, though: 250/200 (core/memory), 64MB.


Uttar

MuFu
04-09-03, 10:42 AM
Originally posted by Uttar
Just thought you'd like to know that little precision :) I haven't seen any review confirm the Gamma Correction part.

Have you seen a review that confirms it in NV31/NV30?

IMO, it's quite good news: according to rumors (and this part isn't official), the NV34 is slated not to be replaced soon...

I very much doubt they'll replace it soon. It's easily the strongest product in the range (5200 Ultra aside) from nV's point of view, IMO.

MuFu.

Uttar
04-09-03, 12:00 PM
Originally posted by MuFu
Have you seen a review that confirms it in NV31/NV30?

I gotta admit I've never seen IQ comparisons from a reviewer that prove it - many reviews did note that nVidia claims it, though.
Remember it's "gamma correction" - not "gamma-corrected antialiasing" - it's only for PS stuff.
I'm not even sure you'd see much of a difference with current PS programs - could be wrong on this one, though.
Would be funny if that's what the IQ problems in 3DMark 2003 are, though :D But that's nearly impossible...
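
(To be clear about the distinction - this is just my own rough illustration, not anything from nVidia's docs: plain "gamma correction" would mean applying the usual display transfer function to the shader output, roughly output = input^(1/2.2) assuming the typical gamma of 2.2, whereas gamma-corrected AA would mean blending the AA samples in linear space before that ramp gets applied.)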

I very much doubt they'll replace it soon. It's easily the strongest product in the range (5200 Ultra aside) from nV's point of view, IMO.

MuFu.

Agreed. That's exactly why I said that it is slated not to be replaced soon ;)


Uttar

MuFu
04-09-03, 12:39 PM
Yeah... I know what you said, buddy. :D

Was simply agreeing.

MuFu.

Dazz
04-09-03, 05:45 PM
From what I have seen, the FX 5200 runs slower than the GeForce4 MX :eek:

Martrox
04-10-03, 06:53 AM
Well.... here's a semi-comparison of the lower-clocked 5200 vs the ATI 9200.... doesn't look so good

http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/9200_geforcefx/001.htm

stevem
04-13-03, 05:49 AM
According to Nvidia docs, the 5200 does not support Intellisample...

Uttar
04-13-03, 06:28 AM
Originally posted by stevem
According to Nvidia docs, the 5200 does not support Intellisample...

nVidia's docs aren't very precise, you know.
This info comes from Brian Burke.


Uttar

stevem
04-13-03, 06:44 AM
Patently so, in many ways. However, "Intellisample technology is only available with the GeForce FX 5800 and 5600 models." is fairly unambiguous. Of course, it doesn't mean it's accurate & may have reflected prior marketspeak considerations...

BTW, that $79 board will be 64MB with a 64-bit DDR interface.

Dazz
04-13-03, 07:01 AM
Yeah, it's a pretty bad deal, that's for sure - $79 for 3.2GB/sec of memory bandwidth, on a DX9 card that will struggle with DX8.1 games lol
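
(That figure checks out from the specs mentioned earlier, assuming the 64-bit DDR interface and the 200MHz memory clock: 200MHz DDR is 400M transfers/sec, and 400M transfers/sec x 8 bytes per transfer on a 64-bit bus = 3.2GB/sec.)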

Uttar
04-13-03, 07:33 AM
Okay, so the $79 FX5200 isn't so good. But it ain't THAT bad, either. As long as you don't enable any AA, it'll run fine. Of course, once you actually do, then it's another matter...
And damn, let's look at the bright side: that thing has so much shading power *relative* to its bandwidth that it might actually make sense to use DX9 shaders on it :P

But yeah, the $79 FX5200 is really more of a "get DX9 real cheap" type of product - it's not about performance, it's about k3w1 "support" of a p0w3rfU1 API...
Read: It's the type of product OEMs love.

As for the $99 FX5200, that's already much more serious. I'd also bet that you could easily overclock the core to something like 300MHz - heck, if you could do it with a Ti4200, why not with an FX 5200? Well, yeah, there's cooling... I guess it'll depend on the cooling solution a little.

Okay, so basically, stop complaining. The $79 product is good for OEMs - I'd sure prefer to have that over a GF2 MX200 or a TNT2 M64 - and the $99 one will look real good compared to the 9200 in Doom 3. No, really, I'm ready to bet it will.

And for the last time, I wasn't paid by nVidia to say this. I'm just annoyed by everyone here saying those cards are bad when they really aren't. The $79 one is "not so good" while the $99 one is "pretty good", IMO.


Uttar

StealthHawk
04-13-03, 07:46 AM
Originally posted by stevem
Patently so, in many ways. However, "Intellisample technology is only available with the GeForce FX 5800 and 5600 models." is fairly unambiguous. Of course, it doesn't mean it's accurate & may have reflected prior marketspeak considerations...

BTW, that $79 board will be 64MB with a 64-bit DDR interface.

it seems pretty clear to me that Intellisample is just nvidia's marketing term for an all-encompassing set of features such as the FSAA engine, gamma-corrected shaders, Z compression, color compression, and adaptive AF.

as such, you either have Intellisample or you don't.

all that Uttar is trying to clarify is that yes, the gfFX5200 does not have full Intellisample support, but it does have some support.

Kruno
04-13-03, 08:47 AM
all that Uttar is trying to clarify is that yes, the gfFX5200 does not have full Intellisample support, but it does have some support.


I thought this was common knowledge? :eek2:

stevem
04-13-03, 08:57 AM
Originally posted by Uttar
Okay, so basically, stop complaining. The $79 product is good for OEMs - I'd sure prefer to have that over a GF2 MX200 or a TNT2 M64 - and the $99 one will look real good compared to the 9200 in Doom 3. No, really, I'm ready to bet it will.

Perhaps you're confusing me with someone else? (Had to re-register after a posting absence.) I was just pointing out the confusing official Nvidia line. Perhaps a few winkies for sarcasm were in order...? I have argued that the 5200 at its price points is a useful product. Unfortunately, I don't see the TNT2 M64, MX2 & MX4 being deprecated soon enough, although perhaps when LP 5200 64MB PCI SKUs are released... ;)

Doom 3, eh? At what level of performance vs IQ? Or are you arguing free IQ given the low performance anyway - double-sided stencil aside...

Interesting concept - clarity vs marketspeak, as per the Soundstorm compliance program...

stevem
04-13-03, 09:08 AM
Originally posted by K.I.L.E.R
I thought this was common knowledge? :eek2:
:)

BTW, are you still lazy...?

StealthHawk
04-13-03, 09:25 PM
Originally posted by K.I.L.E.R
I thought this was common knowledge? :eek2:

by reading some of the responses to this thread it obviously isn't.

Kruno
04-13-03, 11:28 PM
Originally posted by stevem
:)

BTW, are you still lazy...?

Sorry to go a little OT: http://www.beyond3d.com/forum/viewtopic.php?t=5217

Does that answer your question? :)

Back OT:

The NV34 would make use of most performance-enhancing DX9 and GL 2.0 features. ;)

stevem
04-15-03, 05:29 AM
Especially via MS DX9 HLSL & ARB2.0 spec. :)