nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   Benchmarking And Overclocking (http://www.nvnews.net/vbulletin/forumdisplay.php?f=24)
-   -   Let's see the 3D Mark 2003 IQ issues differently, shall we? (http://www.nvnews.net/vbulletin/showthread.php?t=9302)

Uttar 03-30-03 05:34 AM

Let's see the 3D Mark 2003 IQ issues differently, shall we?
 
Hey everyone,

I've gotten quite annoyed by all of the noise surrounding the GeForce FX overbrightening issues in GT4: Mother Nature.

Because of this, people want to compare the GeForce FX 5x00 on the 43.03 drivers against the Radeon 9x00.
Well, let me tell you one thing: that doesn't make sense.

I'm not a lame nVidia fanboy who is going to tell you that Futuremark didn't program the thing correctly, that those IQ issues are unimportant, or stuff like that. No, that's even worse.

What I'm annoyed by, however, is that people fail to realize that with the 43.45 drivers, there are no apparent IQ issues in GT1, GT2 and GT3 - only in GT4.
If there are issues in those games, I stand corrected. But so far, no one has shown me any.

Okay, so what's my point? My point is that any serious programmer, or at least any programmer who's an nVidia registered developer (and there are a *lot* of them), would program GT2 & GT3 in a way more similar to what nVidia did with its driver hacks.
But the same programmer wouldn't implement what nVidia did for GT4.

So what?

Well, the 3DMark 2003 score is calculated by multiplying each game test's result by a weighting factor, then adding the results.
And those weights are *public*.

So why couldn't we calculate the GeForce FX score manually, if we truly wanted to be fair?

Considering the 3DMark 2003 results from http://www.darkcrow.co.kr/Review/Rev...?board_idx=146 and the method explained at http://www.beyond3d.com/articles/3dm....php?p=4#score

We've got, using 43.03 for GT4 and 43.45 for GT1, GT2 & GT3:

1306.7 +
1287.6 +
1361.19 +
595.98
= 4551

That's better than the 43.03 score of 3288, ain't it? But it's obviously worse than the 43.45 score of 5379.
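
For anyone who wants to double-check the arithmetic, here's a quick Python sketch of the recombination (just an illustration: the four numbers are the per-test contributions quoted above, i.e. each game test's framerate already multiplied by its published weight from the Beyond3D article, so the composite score is simply their sum):

Code:

# Recombine a 3DMark 2003 score from per-test contributions taken from
# two different driver sets. Each contribution is already "fps x weight",
# so the composite score is just the sum of the four values.
contributions = {
    "GT1 (43.45)": 1306.70,
    "GT2 (43.45)": 1287.60,
    "GT3 (43.45)": 1361.19,
    "GT4 (43.03)": 595.98,   # take GT4 from 43.03, where the IQ looks correct
}

mixed_score = sum(contributions.values())
print(f"Recombined score: {mixed_score:.0f}")   # ~4551
print("Pure 43.03 score:  3288")
print("Pure 43.45 score:  5379")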

Oh, sure, nVidia is still "cheating" for GT1, GT2 and GT3 under that method. But what proof do you have that they aren't going to cheat similarly in real games, too? If it doesn't result in IQ issues, I personally don't care.

In the hope you people don't consider me a fanboy after this,


Uttar

Onde Pik 03-30-03 06:26 AM

My 3D knowledge is much more limited than most of you guys', but the "accepted truth" is that nVidia forces FP16 in 3DMark03, right?

So aren't we seeing an example of FP16 being enough to correctly display some scenes but insufficient for the last one?

Well, if nVidia starts implementing this in games, I don't think I would care as long as it was impossible to see or feel the difference. That is, I wouldn't care if it weren't for the fact that DX9 requires at least 24-bit precision. (It does, right?)

Uttar 03-30-03 07:01 AM

The "accepted truth" is that nVidia at least forces FP16 in 3DMark03.
They probably do even more than that, including precompiling the shaders to make them more efficient than what their drivers could do with a normal shader, and maybe they even use INT12 in some areas ( although that's really not sure at all, and it's in fact quite unlikely )

DX9 requires at least FP24 for most operations. But programmers can flag a specific operation as only needing FP16 - that's why I say any serious programmer probably wouldn't do it the way Futuremark did it.

IMO, the best thing nVidia could do is manually set FP16 for the operations where they think FP16 would be sufficient. It might not give as much of a performance boost as forcing FP16 everywhere, but it would certainly be better than using FP32 everywhere!

And no one could complain about lower image quality either, if nVidia did it right.
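
Purely to illustrate the precision gap we're talking about (this is host-side NumPy, not shader code, and FP24 has no NumPy equivalent, so FP32 stands in for "full precision" here):

Code:

import numpy as np

# Machine epsilon: the spacing of representable values just above 1.0.
print("FP32 eps:", np.finfo(np.float32).eps)   # ~1.19e-07
print("FP16 eps:", np.finfo(np.float16).eps)   # ~9.77e-04

# A small colour/lighting-sized adjustment survives at FP32 but is rounded
# away completely at FP16 -- the kind of loss that is invisible in some
# shader maths and shows up as banding or clamping in others.
delta = 0.0004
print(np.float32(1.0) + np.float32(delta))     # 1.0004
print(np.float16(1.0) + np.float16(delta))     # 1.0 (the delta is lost)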


Uttar

StealthHawk 03-30-03 08:11 AM

Quote:

Originally posted by Uttar
And no one could complain about lower image quality too, if nVidia did it right.
That's the point, isn't it? They already got caught lowering IQ in GT4 with those special 42.xx drivers, which have the same problem as the new 43.45 drivers. The IQ should have been fixed; nVidia has had ample time.

Optimizing is fine, but optimizing and lowering IQ is not :nono:

ChrisRay 03-30-03 08:21 AM

Well, I do know that Game Tests 1, 2 and 3 do not require FP24 precision.

digitalwanderer 03-30-03 08:27 AM

One word:
 
WHQL.


It doesn't really matter who thinks what about 3dm2k3; I think it's just an indicator of their problems with WHQL certification.

Are they really gonna try and only put out "official but uncertified" drivers from now until NV35? :confused:

ChrisRay 03-30-03 08:53 AM

Digital, Microsoft doesn't care about Futuremark. They will get certified drivers, Futuremark be damned, because nVidia is not forcing 16-bit precision unless it's specifically requested.

Uttar 03-30-03 08:58 AM

Quote:

Originally posted by ChrisRay
Digital, Microsoft doesn't care about Futuremark. They will get certified drivers, Futuremark be damned, because nVidia is not forcing 16-bit precision unless it's specifically requested.

I sadly believe you're right...
That's why I'm proposing this approach.


Uttar

digitalwanderer 03-30-03 10:44 AM

Quote:

Originally posted by ChrisRay
Digital, Microsoft doesn't care about Futuremark. They will get certified drivers, Futuremark be damned, because nVidia is not forcing 16-bit precision unless it's specifically requested.

So nVidia is just going to use 16-bit precision unless the application/game SPECIFICALLY calls for 24FP? :confused:

:thinker:

:mad:

<HRRRRM!>

Do I EVEN want to ask if you think that all DX9 apps will call for 24FP by default or do you think they'll query the driver for a preference? :rolleyes:


(Damn it, I just didn't want them to be able to wiggle their way out of this that easily!)

ChrisRay 03-30-03 12:58 PM

Quote:

Originally posted by digitalwanderer
So nVidia is just going to use 16-bit precision unless the application/game SPECIFICALLY calls for 24FP? :confused:

:thinker:

:mad:

<HRRRRM!>

Do I EVEN want to ask if you think that all DX9 apps will call for 24FP by default or do you think they'll query the driver for a preference? :rolleyes:


(Damn it, I just didn't want them to be able to wiggle their way out of this that easily!)


To be honest, I think it will depend upon the app. It might be similar to vsync: "Application preference", always force 16-bit, or always force 32-bit.

I really don't know.

I don't really know what to think of the scenario right now. Do I believe 16-bit precision is enough for the games of today and possibly tomorrow? Yeah, I think 16-bit will be plenty.


Do I think this is good PR for nVidia? No, I do not; nVidia is damned if they do and damned if they don't. I'm thankful that I'm in a situation where I can watch and see without being traumatically affected by what's to come.

In the end, though, I think most vendors will opt to use 16-bit on both ATI and nVidia cards. I have two reasons to believe this: performance (with negligible IQ gains from higher precision), and nVidia's influence in the gaming industry right now.
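
To make the vsync comparison concrete, here's a purely hypothetical sketch of what such a driver override setting might look like (the names are invented for illustration and don't correspond to any real control panel):

Code:

from enum import Enum

class ShaderPrecisionOverride(Enum):
    # Hypothetical driver setting, analogous to the vsync override.
    APPLICATION_PREFERENCE = "use whatever precision the shader asks for"
    FORCE_FP16 = "always run pixel shaders at partial precision"
    FORCE_FP32 = "always run pixel shaders at full precision"

setting = ShaderPrecisionOverride.APPLICATION_PREFERENCE
print(setting.name, "->", setting.value)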


Quote:

I sadly believe you're right...
That's why I'm proposing this approach
Not quite sure I follow here? ;)

What exactly are you proposing?

Uttar 03-30-03 01:27 PM

Quote:

Originally posted by ChrisRay
Not quite sure I follow here? ;)

What exactly are you proposing?

Manually recombining the 3DMark 2003 score to get something fairer than what we're going to get using only one driver set.
That's what the original post explains :P


Uttar

Captain Beige 03-30-03 01:35 PM

Quote:

Originally posted by ChrisRay
To be honest, I think it will depend upon the app. It might be similar to vsync: "Application preference", always force 16-bit, or always force 32-bit.

I really don't know.

I don't really know what to think of the scenario right now. Do I believe 16-bit precision is enough for the games of today and possibly tomorrow? Yeah, I think 16-bit will be plenty.


Do I think this is good PR for nVidia? No, I do not; nVidia is damned if they do and damned if they don't. I'm thankful that I'm in a situation where I can watch and see without being traumatically affected by what's to come.

In the end, though, I think most vendors will opt to use 16-bit on both ATI and nVidia cards. I have two reasons to believe this: performance (with negligible IQ gains from higher precision), and nVidia's influence in the gaming industry right now.




Not quite sure I follow here? ;)

What exactly are you proposing?

ATI cards don't support FP16, only FP24, since FP24 is part of the DX9 specification but FP16 is not, and FP16 is therefore useless for a true DX9 card. This is not like vsync; vsync is an option, not part of a standard. nVidia cards using FP16 unless specifically asked for FP32 is ridiculous.

It would be like a company claiming to have an equal opportunities policy but discriminating against people unless you specifically told them not to be prejudiced against every possible kind of lifestyle; if you accidentally left anyone out, they'd bully them until they accepted lower pay, and then they'd say it was okay because you never said you wanted those people treated fairly, all while boasting about how great they are at cutting costs.

