
No Z compression on NV34?



Uttar
03-06-03, 02:58 PM
EDIT: I'm going to investigate this further. I guess we'll see soon... I really wouldn't want to bash nVidia for something which just ain't quite true!

From nV News's preview:

The value oriented GeForce FX 5200 Ultra (325MHz/650MHz effective DDR) and GeForce FX 5200 (250MHz/400MHz effective DDR) are manufactured at 0.15-micron and contain support for DirectX 9 including vertex and pixel shader 2.0+. Both parts are lacking hardware assisted color and z compression.

Now, this is kind of a shock to me. No Z compression?! This is really, really bad...
Don't expect to use FSAA on that card, that's for sure.

No Color Compression was expected, but this wasn't!

I'd still like to have a few more details on this, however.
What Z-related memory bandwidth saving techniques are in the NV34?
And is Early Z still in it?
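
For context on why those two features matter: a rough sketch (purely illustrative, not how NV3x hardware is actually wired) of an early-Z style fragment loop, in C++. A fragment rejected by the depth test never costs a texture fetch or a color write, and the depth reads and writes themselves are exactly the traffic that Z compression shrinks:

// Toy fragment loop illustrating early-Z rejection (illustrative only;
// real hardware works per tile with compressed Z, not per pixel in C++).
#include <cstdint>
#include <functional>
#include <vector>

struct Fragment { int x, y; float z; };

void rasterize(const std::vector<Fragment>& frags,
               std::vector<float>& zbuf,
               std::vector<uint32_t>& colorbuf,
               int width,
               const std::function<uint32_t(const Fragment&)>& shade)
{
    for (const Fragment& f : frags) {
        float& storedZ = zbuf[f.y * width + f.x];   // Z read -- the traffic Z compression reduces
        if (f.z >= storedZ)
            continue;                               // early reject: no shading, no color write
        storedZ = f.z;                              // Z write
        colorbuf[f.y * width + f.x] = shade(f);     // expensive work only for visible fragments
    }
}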


Uttar

sebazve
03-06-03, 03:21 PM
The more I hear, the more I realize that THESE CARDS ARE GONNA SUCK :lol2:

Dazz
03-06-03, 04:09 PM
Ooooh, can I have 2 fps please ;) Yeah well, you get what you pay for, lmao. I must say the FX 5200 is going to be a useless card; sure, it is DX9 compatible, but it won't be playable in DX9 games :rolleyes:

StealthHawk
03-06-03, 05:13 PM
lol, hasn't Z compression been a feature of every card since the GF3 except the GF4 MX?

Paul
03-06-03, 05:42 PM
nVidia have to get a $99 NV30-based card out to retail. That involves big sacrifices, this being just one. I'd rather they did this, and kept DX9 functionality, rather than go the GF4MX route.

Captain Beige
03-06-03, 07:14 PM
Originally posted by Paul
nVidia have to get a $99 NV30-based card out to retail. That involves big sacrifices, this being just one. I'd rather they did this, and kept DX9 functionality, rather than go the GF4MX route.

um, but they aren't keeping DX9 functionality, just offering bare-minimum "support" so they can stick it on the box and not be as embarrassed by their cards, given the number of ATI DX9 cards available.

muzz
03-06-03, 07:19 PM
I'll tell ya one thing: that PR BS from NV about having top-to-bottom DX9 is just shiznit.....
Only about 50 people worldwide even have a 5800, and the lower variants are further behind....
GF4MX? Gimme a break.....
Not real, fellas.......
Next PR BS please.

Hanners
03-07-03, 05:25 AM
Originally posted by Paul
nVidia have to get a $99 NV30-based card out to retail. That involves big sacrifices, this being just one. I'd rather they did this, and kept DX9 functionality, rather than go the GF4MX route.

Obviously something has to be sacrificed, but cutting out bandwidth-saving measures in a modern graphics card is suicidal!

Look what the lack of decent bandwidth-saving technology did for the Parhelia....

kyleb
03-07-03, 05:32 AM
Originally posted by Paul
nVidia have to get a $99 NV30-based card out to retail. That involves big sacrifices, this being just one. I'd rather they did this, and kept DX9 functionality, rather than go the GF4MX route.

Honestly, I would much rather see GeForce4 Ti variants aimed at the $99 market. You can find them cheaper than that now, and if they just went a bit cheaper on the memory and cooling I am sure they could pull off a $99 retail price. Granted, that is just my opinion, but given the choice I think it would be a better option for nearly anyone.

Uttar
03-07-03, 06:40 AM
Originally posted by StealthHawk
lol, hasn't Z compression been a feature of every card since the GF3 except the GF4 MX?

Wrong.
Z compression is in the GF3, GF4 Ti *and* GF4 MX.

So, in terms of bandwidth-saving features, the NV34 is *less efficient* than a GF4 MX!
That's why this is so bad...
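
Some back-of-envelope numbers (illustrative assumptions only, not measured or vendor figures) on how much traffic is at stake: at 1024x768 with a 32-bit depth/stencil buffer, an average overdraw of 3, and 60 fps, raw Z reads and writes alone come to over 1 GB/s, a sizable slice of the ~6.4 GB/s a 128-bit, 400MHz-effective-DDR card has in total. The roughly 4:1 lossless Z compression nVidia advertises for LMA/LMA2 cuts exactly that traffic:

// Back-of-envelope Z-traffic estimate (illustrative assumptions only).
#include <cstdio>

int main() {
    const double pixels      = 1024.0 * 768.0;
    const double bytes_per_z = 4;   // 24-bit Z + 8-bit stencil
    const double overdraw    = 3;   // fragments per pixel per frame (assumed)
    const double fps         = 60;
    const double accesses    = 2;   // one Z read + one Z write per fragment (worst case)

    const double traffic = pixels * bytes_per_z * overdraw * fps * accesses; // bytes per second
    std::printf("Uncompressed Z traffic:  %.2f GB/s\n", traffic / 1e9);       // ~1.13 GB/s
    std::printf("With ~4:1 Z compression: %.2f GB/s\n", traffic / 4.0 / 1e9); // ~0.28 GB/s
    return 0;
}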


Uttar

Hellbinder
03-07-03, 04:00 PM
The real problem is that the NV34 simply is NOT a true DX9 part. I mean seriously, folks. It does many of its *so-called* DX9 functions through software emulation.

You wait till the reviews hit and see for yourself. That is, if nVidia even allows them to use any DX9-based benchmarks, like ShaderMark, RightMark and 3DMark03.

Vertex shader support = software

is just ONE of the areas where it does not meet HARDWARE ACCELERATED DX9 compliance. This is nothing but a repeat of the GF4MX all over again. Only this is worse: no color compression, no Z compression, etc. etc.

It's a sham. Just call it a DX8+ card or something. But don't call it a DX9 card and claim you are the first to market with "top-to-bottom" DX9 products..

The BS PR thing is really getting irritating. Especially when so many people are going to buy these cards thinking they are getting true DX9 support, and will be able to play DX9 games..

surfhurleydude
03-07-03, 04:10 PM
Oh well. It's still better than nothing, though. They should be faster than the GF4 MX... At least it's a step forward.

SurfMonkey
03-07-03, 04:12 PM
No hardware actually has to support any feature of DirectX directly in hardware to be compliant. As long as the feature set is exposed through the drivers, the product is DX compliant.

Besides, vertex shading is the lowest-impact part of the feature set to emulate; if it were pixel shaders, then you'd be talking about a whole different ball game.

And if you look at the area the NV34 is targeted at, the $79 to $100 region, then you'd realise that most of the machines it will be installed in would be severely CPU-limited with anything more powerful. We're still talking Celeron-based rigs here!!
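
To make that concrete, a minimal Direct3D 9-era sketch (illustrative only; how the NV34 driver actually reports its caps is an assumption here) of how an application can fall back to software vertex processing when the reported vertex shader version comes up short. The shaders still run; the per-vertex work just lands on the CPU:

// Minimal sketch: choose hardware vs software vertex processing in Direct3D 9.
// Illustrative only; error handling trimmed, window setup assumed elsewhere.
#include <d3d9.h>

IDirect3DDevice9* CreateDeviceWithFallback(IDirect3D9* d3d, HWND hwnd,
                                           D3DPRESENT_PARAMETERS& pp)
{
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // If the driver doesn't report VS 2.0 in hardware, let the D3D runtime
    // emulate vertex processing on the CPU instead.
    DWORD vp = (caps.VertexShaderVersion >= D3DVS_VERSION(2, 0))
                   ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                   : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, vp, &pp, &device);
    return device;  // NULL on failure
}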

surfhurleydude
03-07-03, 04:14 PM
SurfMonkey, good point... I was kind of scared of bringing that up myself, but now that it is up in the air....

Most machines that will have the NV34 installed in them will

1) Have MUCH CPU power to spare.
2) Not know what AA is.

So it really doesn't matter too much. It should be a fine card for the sector that it's being targeted at.

sebazve
03-07-03, 04:54 PM
Originally posted by SurfMonkey
No hardware actually has to support any feature of DirectX directly in hardware to be compliant. As long as the feature set is exposed through the drivers, the product is DX compliant.

Besides, vertex shading is the lowest-impact part of the feature set to emulate; if it were pixel shaders, then you'd be talking about a whole different ball game.

And if you look at the area the NV34 is targeted at, the $79 to $100 region, then you'd realise that most of the machines it will be installed in would be severely CPU-limited with anything more powerful. We're still talking Celeron-based rigs here!!

And they are gonna be even more limited, since the CPU has to do things the GPU should be doing. These cards suck and need powerful CPUs for emulation... cough Xabre cough... In low-end machines games suck big time because, instead of offloading the CPU (which is already not so fast), the card makes it emulate stuff instead of just handling physics, AI, etc.

So in the end these cards need a really powerful CPU to give respectable fps, lol. Yeah, who buys these cards and has a high-end CPU... :angel2:

surfhurleydude
03-07-03, 04:56 PM
So in the end these cards need a really powerful CPU to give respectable fps, lol. Yeah, who buys these cards and has a high-end CPU...

Uhmmm... most likely 90% of all those that buy computers. Hell, my second computer, a Sony VAIO, is only 6 months old and came with a 2.2 GHz P4 and a TNT2.

sebazve
03-07-03, 05:01 PM
Originally posted by surfhurleydude
SurfMonkey, good point... I was kind of scared of bringing that up myself, but now that it is up in the air....

Most machines that will have the NV34 installed in them will

1) Have MUCH CPU power to spare.
2) Not know what AA is.

So it really doesn't matter too much. It should be a fine card for the sector that it's being targeted at.

IMO you're wrong; people who buy this kind of card are on a tight budget, so they usually don't have fast CPUs...
E.g. all my friends who have a GF4 MX have CPUs around 800-1100MHz (some even lower), which is now low end.

surfhurleydude
03-07-03, 05:03 PM
IMO you're wrong; people who buy this kind of card are on a tight budget, so they usually don't have fast CPUs...
E.g. all my friends who have a GF4 MX have CPUs around 800-1100MHz (some even lower), which is now low end.

What you are failing to understand is that these are the kind of cards that ARE BEING PUT INTO COMPUTERS WHEN YOU BUY THEM.

OEMs eat up these cards.

You go into Best Buy and see a few aisles of pre-built computers. They can't run without a graphics chip... the NV34 will be put in there.

And when the new nForce chipset comes out, the NV34 will be part of that integrated solution.

You can't buy computers these days with 800-1000MHz CPUs. nVidia makes most of its money from selling its cards in new computers, the ones at Best Buy I pointed out earlier.

Cotita
03-07-03, 05:59 PM
Besides nV News' report, I've seen no other source stating that the NV34 won't have Z compression, so maybe there's hope.

According to Tom's Hardware's impressions, it should be faster than the GeForce4 MX and close to the Ti 4200; I expect it to be on par with the GeForce3, which is very good for a budget card.

There are already rumors that the NV31 will turn out to be a stripped-down version of the NV30 and that it can probably be modified to become an NV30, much like the Radeon 9500-to-9700 mod.

This might be true if we take into account the low yields the NV30 has. The fact that lots of them can't run at 500MHz makes them likely candidates to run at 350MHz. I mean, why throw away hundreds of chips if you can just make them run slower and disable some pipelines?

Lezmaka
03-07-03, 06:55 PM
Originally posted by Cotita
Besides nV News' report, I've seen no other source stating that the NV34 won't have Z compression, so maybe there's hope.

According to Tom's Hardware's impressions, it should be faster than the GeForce4 MX and close to the Ti 4200; I expect it to be on par with the GeForce3, which is very good for a budget card.

There are already rumors that the NV31 will turn out to be a stripped-down version of the NV30 and that it can probably be modified to become an NV30, much like the Radeon 9500-to-9700 mod.

This might be true if we take into account the low yields the NV30 has. The fact that lots of them can't run at 500MHz makes them likely candidates to run at 350MHz. I mean, why throw away hundreds of chips if you can just make them run slower and disable some pipelines?

From what I've read, NV31 is about the same as a Ti 4200 without AA/AF (maybe a little slower) but is much faster when using them.

NV31 is a stripped-down version of NV30, but it is a different chip. You won't be able to mod an NV31 into an NV30. The chips are physically different, unlike the Radeon 9500/9700; things have actually been removed. The reason people were able to mod 9500s into 9700s is that they were all the same R300 chip. The only difference between the 9500 and 9700 is the position of a component on the chip packaging.

For me, as long as they outperform the chips they are meant to replace overall, then it's good. As long as the NV31 can outperform the Ti 4200 overall, and the NV34 can outperform the GF4 MX440, there really shouldn't be anything to complain about.

StealthHawk
03-07-03, 07:15 PM
Originally posted by surfhurleydude
Oh well. It's still better than nothing, though. They should be faster than the GF4 MX... At least it's a step forward.

will it really be faster if it has NO Z compression?

You go into Best Buy and see a few aisles of pre-built computers. They can't run without a graphics chip... the NV34 will be put in there.

you're forgetting about integrated graphics.

StealthHawk
03-07-03, 07:21 PM
Originally posted by Uttar
Wrong.
Z compression is in the GF3, GF4 Ti *and* GF4 MX.

So, in terms of bandwidth-saving features, the NV34 is *less efficient* than a GF4 MX!
That's why this is so bad...


Uttar

First of all, let me state that I can't believe nVidia would ditch the LMA2 that already exists in so many cards. I can believe they won't include color compression in the NV34, though.

Secondly, yep, I was wrong about the GF4 MX not having Z compression :o Everyone always harps on the GF4 MX so much that I forgot it really has LMA2 optimizations :(

surfhurleydude
03-07-03, 07:29 PM
you're forgetting about integrated graphics.

No, I mentioned the nForce 3 (which is the best selling integrated graphics solution by a very large margin).

SurfMonkey
03-07-03, 07:39 PM
Originally posted by sebazve
And they are gonna be even more limited, since the CPU has to do things the GPU should be doing. These cards suck and need powerful CPUs for emulation... cough Xabre cough... In low-end machines games suck big time because, instead of offloading the CPU (which is already not so fast), the card makes it emulate stuff instead of just handling physics, AI, etc.

So in the end these cards need a really powerful CPU to give respectable fps, lol. Yeah, who buys these cards and has a high-end CPU... :angel2:

But the point is that a $100-$400 card would be wasted anyway. You would be CPU-limited before your gfx card ever got a chance to spread its wings. Also, if you upgrade your processor you are going to see a bigger gain in performance than you would if you upgraded your gfx card.

And for all its faults, it is still bringing DX9 to the mainstream before ATi, and that can only be a good thing. The quicker the mainstream becomes DX9 compliant, the sooner we get DX9-level games.

On a side note, I've been checking this month's mags that have GFFX reviews and not one of them has dissed it!! They've all said that if your budget is tight go for the 9700, but if you want the best, get a GFFX. Somehow I find that a little disconcerting ;)

How far does ATi have to go to win the PR battle? Especially considering that the majority of upgraders will probably use magazine reviews as the basis for their choice.

Neova
03-07-03, 07:45 PM
I saw the announcement yesterday at GDC. I should have asked the nVidia reps about color and Z compression when I was there. I heard the mantra from the nVidia reps, "DirectX 9 for $79", and many there (developers and hardware guys) mentioned the card will be good enough for apps with AA turned off.

As said by many folks already, these cards are for the OEM market and cheaper low-end machines. So what if they're not fully DX9 hardware compliant? Getting the feature set out to the mainstream, even if somewhat crippled, will expand the user base for DX9 and motivate developers to migrate their engines to support DX9 features, which is a win for everyone, especially the hardcore crowd here with mid-range to high-end cards.

How many developers do you know who code strictly to one rendering path, with no fallback to software either via the API or their own code path?

I say wait and see how the FX 5200 Ultra competes with the GF4 MX in today's games to get a real-world estimate. We can talk paper benchmarks all day, but I think the core and memory speeds alone will give it a nice boost in apps. Look at how playable games are on a GF4 MX versus a GF3 Ti 200 at resolutions at or under 1024x768 with NO AA and medium FX settings on a fast processor (see how Dell equips its systems), which is where most mainstream users play anyway.