Old 03-30-03, 06:34 AM   #1
Uttar

Let's see the 3D Mark 2003 IQ issues differently, shall we?

Hey everyone,

I've gotten quite annoyed by all of the noise surrounding the GeForce FX overbright issues in GT4: Mother Nature.

Because of this, people want to compare the GeForce FX 5x00 on the 43.03 drivers against the Radeon 9x00.
Let me tell you one thing: that doesn't make sense.

I'm not some lame nVidia fanboy who is going to tell you that Futuremark didn't program the thing correctly, or that those IQ issues are unimportant, or anything like that. No, that would be even worse.

What I'm annoyed by, however, is that people fail to realize that with the 43.45 drivers, there are no apparent IQ issues in GT1, GT2 and GT3 - only in GT4.
If there are issues in those games, I stand corrected. But so far, no one has shown me any.

Okay, so what's my point? My point is that any serious programmer, or at least any programmer who's an nVidia registered developer (and there are a *lot* of them), would have programmed GT2 & GT3 in a way much closer to what nVidia is now doing through driver hacks.
But the same programmer wouldn't implement what nVidia did for GT4.

So what?

Well, the 3DMark 2003 score is calculated by multiplying each game test's result by a fixed weight, then adding the contributions up.
And those weights are *public*.

So why couldn't we calculate the GeForce FX score manually, if we truly wanted to be fair?

Taking the 3DMark 2003 results from http://www.darkcrow.co.kr/Review/Rev...?board_idx=146 and the scoring method explained at http://www.beyond3d.com/articles/3dm....php?p=4#score,

we get, using 43.03 for GT4 and 43.45 for GT1, GT2 & GT3:

1306.7 +
1287.6 +
1361.19 +
595.98
= 4551

That's better than the 43.03 score of 3288, ain't it? But it's obviously worse than the 43.45 score of 5379.
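
For anyone who wants to redo the arithmetic, here's a rough Python sketch of that "mix and match" scoring. The weights are the ones I remember from the Beyond3D write-up (7.3 / 37 / 47.1 / 38.7), so double-check them against the link above, and the frame rates are just back-calculated from the four contributions listed above - treat them as illustrative, not official.

[CODE]
# Recombine a 3DMark03 total from per-test frame rates and the public weights.
# Weights assumed from the Beyond3D scoring write-up; fps values back-calculated
# from the contributions quoted above (not measured by me).
WEIGHTS = {"GT1": 7.3, "GT2": 37.0, "GT3": 47.1, "GT4": 38.7}

def combined_score(fps_by_test):
    """Weighted sum of game-test frame rates, the way 3DMark03 builds its total."""
    return sum(WEIGHTS[test] * fps for test, fps in fps_by_test.items())

# GT1-GT3 taken from the 43.45 run, GT4 from the 43.03 run:
mixed = combined_score({"GT1": 179.0, "GT2": 34.8, "GT3": 28.9, "GT4": 15.4})
print(round(mixed))  # ~4551 with these inputs
[/CODE]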

Oh, sure, nVidia is still "cheating" in GT1, GT2 and GT3 even with this approach. But what proof do you have that they aren't going to cheat similarly in real games, too? If it doesn't result in IQ issues, I personally don't care.

In the hope you people don't consider me a fanboy after this,


Uttar
Old 03-30-03, 07:26 AM   #2
Onde Pik

My 3D knowledge is much more limited than most of you guys', but the "accepted truth" is that nVidia forces FP16 in 3DMark03, right?

So aren't we seeing an example of FP16 being enough to correctly display some scenes but insufficient for the last one?

Well, if nVidia starts implementing this in games, I don't think I would care as long as it was impossible to see or feel the difference. That is, I wouldn't care if it weren't for the fact that DX9 requires at least 24-bit precision. (It does, right?)
Old 03-30-03, 08:01 AM   #3
Uttar

The "accepted truth" is that nVidia at least forces FP16 in 3DMark03.
They probably do even more than that, including precompiling the shaders to make them more efficient than what their drivers could do with a normal shader, and maybe they even use INT12 in some areas ( although that's really not sure at all, and it's in fact quite unlikely )

DX9 requires at least FP24 for most operations. But programmers can flag a specific operation as only needing FP16 - that's why I say any serious programmer probably wouldn't have done it the way Futuremark did.

IMO, the best thing nVidia could do is manually set FP16 for the operations where they think FP16 is sufficient. It might not give as much of a performance boost as forcing FP16 everywhere, but it would certainly be better than using FP32 everywhere!

And no one could complain about lower image quality either, if nVidia did it right.
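
To make that concrete, here's a tiny numeric sketch. It has nothing to do with the actual 3DMark shaders; it just uses numpy's float16 as a stand-in for the hardware's half precision to show why a blanket FP16 substitution can be invisible in one shader and visible in another.

[CODE]
# Illustrative only: numpy float16 standing in for FP16 shader precision.
import numpy as np

# A single short calculation, e.g. a diffuse-lighting style multiply:
# FP16 and FP32 agree to well within one 8-bit colour step (1/255).
d16 = np.float16(0.8) * np.float16(0.61)
d32 = np.float32(0.8) * np.float32(0.61)
print(float(d16), float(d32))   # ~0.4878 vs ~0.4880

# A long chain of dependent operations, as in a heavy procedural shader:
# FP16's 10-bit mantissa lets error pile up, which can show on screen as
# banding or blown-out "overbright" pixels.
acc16, acc32 = np.float16(0.0), np.float32(0.0)
for _ in range(2000):
    acc16 = np.float16(acc16 + np.float16(0.001))
    acc32 = acc32 + np.float32(0.001)
print(float(acc16), float(acc32))  # FP16 ends up well short of 2.0; FP32 stays at ~2.0
[/CODE]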


Uttar
Old 03-30-03, 09:11 AM   #4
StealthHawk

Quote:
Originally posted by Uttar
And no one could complain about lower image quality either, if nVidia did it right.
That's the point, isn't it? They already got caught lowering IQ in GT4 with those special 42.xx drivers, which have the same problem as the new 43.45 drivers. The IQ should have been fixed by now; nVidia has had ample time.

Optimizing is fine, but optimizing that lowers IQ is not.
Old 03-30-03, 09:21 AM   #5
ChrisRay

Well, I do know that Game Tests 1, 2 and 3 do not require FP24 precision.
Old 03-30-03, 09:27 AM   #6
digitalwanderer

One word:

WHQL.


It doesn't really matter who thinks what about 3dm2k3; I think it's just an indicator of their problems with WHQL certification.

Are they really gonna try and only put out "official but uncertified" drivers from now until NV35?
Old 03-30-03, 09:53 AM   #7
ChrisRay

Digital, Microsoft doesn't care about Futuremark. They will get certified drivers, Futuremark be damned,

because nVidia is not forcing 16-bit precision unless it's specifically requested.
Old 03-30-03, 09:58 AM   #8
Uttar

Quote:
Originally posted by ChrisRay
Digital, Microsoft doesn't care about Futuremark. They will get certified drivers, Futuremark be damned,

because nVidia is not forcing 16-bit precision unless it's specifically requested.
I sadly believe you're right...
That's why I'm proposing this approach.


Uttar

Old 03-30-03, 11:44 AM   #9
digitalwanderer

Quote:
Originally posted by ChrisRay
Digital, Microsoft doesn't care about Futuremark. They will get certified drivers, Futuremark be damned,

because nVidia is not forcing 16-bit precision unless it's specifically requested.
So nVidia is just going to use 16-bit precision unless the application/game SPECIFICALLY calls for FP24?

<HRRRRM!>

Do I EVEN want to ask whether you think all DX9 apps will call for FP24 by default, or do you think they'll query the driver for a preference?


(Damn it, I just didn't want them to be able to wiggle their way out of this that easily!)
Old 03-30-03, 01:58 PM   #10
ChrisRay

Quote:
Originally posted by digitalwanderer
So nVidia is just going to use 16-bit precision unless the application/game SPECIFICALLY calls for FP24?

<HRRRRM!>

Do I EVEN want to ask whether you think all DX9 apps will call for FP24 by default, or do you think they'll query the driver for a preference?

(Damn it, I just didn't want them to be able to wiggle their way out of this that easily!)

To be honest, I think it will depend upon the app. It might be similar to vsync: "application preference", or always force 16-bit, or always force 32-bit.

I really don't know.

I don't really know what to think of the scenario right now. Do I believe 16-bit precision is enough for the games of today and possibly tomorrow? Yeah, I think 16-bit will be plenty.

Do I think this is good PR for nVidia? No, I do not; nVidia is damned if they do and damned if they don't. I'm thankful that I'm in a situation where I can watch and see without being traumatically affected by what's to come.

In the end, though, I think most developers will opt to use 16-bit on both ATI and nVidia cards. I have two reasons to believe this: the performance gain and the negligible IQ increase from higher precision, and nVidia's influence in the games industry right now.


Quote:
I sadly believe you're right...
That's why I'm proposing this approach
Not quite sure I follow here?

What exactly are you proposing?
Old 03-30-03, 02:27 PM   #11
Uttar

Quote:
Originally posted by ChrisRay
Not quite sure I follow here?

What exactly are you proposing?
Manually recombining the 3DMark 2003 score to get something fairer than what we're gonna get using only one driver set.
That's what the original post explains :P


Uttar
Old 03-30-03, 02:35 PM   #12
Captain Beige

Quote:
Originally posted by ChrisRay
To be honest, I think it will depend upon the app. It might be similar to vsync: "application preference", or always force 16-bit, or always force 32-bit.

I really don't know.

I don't really know what to think of the scenario right now. Do I believe 16-bit precision is enough for the games of today and possibly tomorrow? Yeah, I think 16-bit will be plenty.

Do I think this is good PR for nVidia? No, I do not; nVidia is damned if they do and damned if they don't. I'm thankful that I'm in a situation where I can watch and see without being traumatically affected by what's to come.

In the end, though, I think most developers will opt to use 16-bit on both ATI and nVidia cards. I have two reasons to believe this: the performance gain and the negligible IQ increase from higher precision, and nVidia's influence in the games industry right now.

Not quite sure I follow here?

What exactly are you proposing?
ATI cards don't support FP16, only FP24, since FP24 is part of the DX9 specification and FP16 is not, which makes FP16 useless for a true DX9 card. This is not like vsync: vsync is an option, not part of a standard. nVidia cards using FP16 unless specifically asked for FP32 is ridiculous.

It would be like a company claiming to have an equal opportunities policy but discriminating against people unless you specifically told them not to be prejudiced against every possible kind of lifestyle, and if you accidentally left anyone out they'd bully them until they accepted lower pay - and then saying it was okay because you didn't say you wanted them treated fairly, while boasting about how great they are at cutting costs.