05-18-03, 03:58 PM   #61
jimmyjames123

Quote:
insults are never positive, but consider that you are disregarding logical statements for hypotheses that are more difficult to comprehend than things that should be clearly visible
There is no justification for a "Harry Potter" comment other than someone trying to make himself feel better.

Anyway, to say it again: these Detonator FX 44.03 drivers do seem to increase both image quality and performance for FX cards almost across the board. So there is no good reason for any FX owner to not use them.
05-18-03, 04:00 PM   #62
jimmyjames123

Quote:
it is misleading and unethical.
You are entitled to your opinion, no doubt. However, considering that image quality and performance were improved in 3dmark03 (and seemingly most other games), if I were an FX owner I wouldn't be complaining.
05-18-03, 04:03 PM   #63
jjjayb

If I were an NV35 owner I wouldn't be complaining (except about 3dmark), but if I were an NV30 owner I would.
05-18-03, 04:03 PM   #64
jimmyjames123

Quote:
They did not remove non-WHQL scores until AFTER NVIDIA got caught lowering precision to achieve a higher score.
True, but NVIDIA didn't have a WHQL-certified driver for a long, long time even before this, correct? They had non-WHQL drivers that seemed to work properly before coming out with the 3dmark-optimized Detonators.

Also, true, the NV30 is a bit loud in some setups. The performance with the new drivers looks really good though, especially for people who play Splinter Cell.

05-18-03, 04:04 PM   #65
jjjayb

Quote:
True, but NVIDIA didn't have a WHQL-certified driver for a long, long time even before this, correct? They had non-WHQL drivers that seemed to work properly before coming out with the 3dmark-optimized Detonators.
Seemed to.
05-18-03, 04:06 PM   #66
jimmyjames123

I mean, some of the older non-WHQL Detonators didn't have any image quality corruption, right?
05-18-03, 04:25 PM   #67
digitalwanderer
Hey, you're right!

Quote:
Originally posted by Ady
Get over it already. I know that you're not going to change my mind no matter what is said. It's a cheat, plain and simple.

It's also rather obvious that...

1. Ati fans will think it's a cheat. Even if they don't really understand or read up on it.
2. Nvidia fans will think it's NOT a cheat. Even if they don't really understand or read up on it.
3. People that don't care either way will still judge this as a cheat.
Thanks Ady, you really cheered me up about this whole mess by putting it in its proper perspective.

I ain't gonna change any nVidia fanboys' minds here, and for the ATi fanboys it's preaching to the choir...but that all-important "3. People that don't care either way will still judge this as a cheat" is the one that I spaced.

Here, let me quote it again...it fills me all up with happy for some odd reason:

Quote:
3. People that don't care either way will still judge this as a cheat.
The fanboys are going to be loyal to nVidia no matter what, but the more open minded "middle-of-the-road-the-brand-ain't-as-important-as-performance" people will know the truth.

Thanks, I feel better.


Quote:
Originally posted by Sazar
just for the record

if john does say something... I would regard it with great consideration based on his reputation and posts @ other forums...

insults are never positive, but consider that you are disregarding logical statements for hypotheses that are more difficult to comprehend than things that should be clearly visible... ie... no clipping planes == no-no considering this particular benchmark... and the way timedemos work...
Seconded.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
05-18-03, 04:31 PM   #68
Nv40
Re: My thoughts on "Optimization"

Quote:
Originally posted by jimmyjames123
I have been reading about this 3dmark03 "Rendering Issue".

For one, I think the journalism at Extremetech and Beyond3d was a bit odd. Never do they mention the following relevant points:

1) Image quality was NOT compromised with the new FX 44.03 driver. In fact, most people have noted improvements in Image Quality, even in 3dmark03.

2) All images in 3dmark03 that we actually SEE are rendered correctly.

These points might sound obvious to some, but to someone casually reading or to someone who isn't very familiar with graphics card terminology, these points wouldn't be so obvious.

Now, ask yourself this question:

If we can get smoother and faster performance, without any loss of image quality, isn't that a good thing?

If I had an NVIDIA FX card, I would be happy that NVIDIA could "optimize" for 3dmark03 without any loss of image quality and without affecting any images that we actually see on screen.

It is no secret that both NVIDIA and ATI "optimize" their drivers for a benchmark like 3dmark03. If they can do it without compromising image quality, all the better (IMO).

Some other points that are worth noting:

1) NVIDIA does not have authorized access to the developer's build of 3dmark, while ATI (and Extremetech, Beyond3d, etc) does; that build lets them roam around anywhere in a scene, even off-center of the actual image displayed.

2) Futuremark strangely only allows WHQL-certified drivers for published online results for their 3dmark program, but they allow overclocked graphics cards and CPUs (and note how Futuremark buries the Detonator FX driver underneath the others in their homepage announcements, not even mentioning that it is WHQL-certified for GeForce FX cards).

3) [H]OCP talked with ATI privately about the Quake/Quack driver cheat issue for more than a month before writing their article. ATI repeatedly denied cheating, and [H]OCP released their article after proving ATI wrong (showing that image quality was compromised in Quake 3 to enhance performance).

4) The NVIDIA FX Detonator driver increased both performance and IQ noticeably for a variety of benchmarks and games on the FX graphics cards, not simply 3dmark03.

5) The Doom 3 benchmarks used in the latest tests were chosen by id and were not seen by NVIDIA in advance.

All in all, it boils down to what one considers an "optimization" vs. a "cheat". If a graphics card company can make performance smoother and faster without compromising image quality and without affecting what we actually see, then I would say this is a good thing (especially considering that NVIDIA and ATI both optimize for benchmarks in the first place).

Exactly... well said.

I would like to see that [NVNEWS] post (jimmyjames123's summary of the current situation) on the front page of this site. It is very important that every gamer out there knows those facts and can judge for themselves whether what NVIDIA has done is OK, because I frequently see posts here and there from people who don't have a clue how this "cheat" or "optimization" is any different from what ATI and NVIDIA do in games to get the best performance without a decrease in IQ.


05-18-03, 04:35 PM   #69
Clockwork

Nv40, it's pretty sad when fellow nVidia enthusiasts think you're an nVidiot.

I don't think anyone here takes you seriously.
__________________
| AMD Athlon 64 3500+ | 1GB Corsair XMS Extreme Memory PC3200 DDR | GIGABYTE GA-K8NSNXP-939 nForce3 Ultra | WD 120GB SATA | BFG 6800GT OC w/ Zalman VF700cu | Pioneer DVR-A07XLA 8x DVD+-R/RW | Aopen 1640Pro-A 16x DVD | Cooler Master Cavalier 1 CAV-T01-WWA case | Ultra X-Connect 500watt psu | Windows XP Professional w/ SP2 |Samsung 193p+ 19" LCD

05-18-03, 04:43 PM   #70
Nv40

Quote:
Originally posted by John Reynolds
Quote:
The visuals are not what count, the final score is what counts.
That, my friend, is one of the worst bifurcations from logic I have ever seen argued.

Exactly, one of the most questionable pieces of logic I have ever heard. See how you answer yourself...

05-18-03, 04:44 PM   #71
Filburt

The point of the benchmark is to see how a card handles ALL of the data of the scene, not simply the part that you see. You mistake the point of the benchmark entirely. The visuals are simply eye candy to give you a subjective experience whilst the benchmark attempts to calculate an objective score. Cutting out the majority of the data passed to the card in effect makes the obtained value irrelevant, since it merely reflects how well the card can render a portion of the data rather than the entire scene.

Apparently you don't understand that the intent of the benchmark is to plug through all of the data and let the card's routines sort out how to handle it and then render it. The reason it's always on the same "rail" is to give a control factor in the test. Inserting static clip planes is NOT the same as other methods of hidden surface removal because it happens BEFORE the HSR comes into play, effectively reducing the actual workload the card sees. That isn't optimizing the card, that's modifying the benchmark itself. Thus, the benchmark the FX5900 runs and the benchmark the R9800 runs are two wildly different benchmarks, and given that the R9800 is running the benchmark as intended, one can conclude that scores obtained by the FX5900 are essentially meaningless except when comparing among other FX5900s. Furthermore, these "optimizations" gain the user higher scores only in benchmarks, because nVidia can employ static clip planes only in deterministic benchmarks. Thus the scores one receives in a benchmark with the FX5900 are incongruous with the actual performance one will achieve in gameplay.
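
To make the distinction concrete, here is a minimal hypothetical sketch (my own illustration only, not anything from nVidia's actual driver) of what a static clip plane does: it is just a fixed plane equation, and any geometry on the wrong side of it is thrown away before the card's own hidden surface removal ever sees it.

[code]
// Hypothetical sketch of a static clip plane (illustration only, NOT
// actual driver code). A plane is ax + by + cz + d = 0; any object whose
// bounding sphere lies entirely on the negative side is discarded before
// the card's normal hidden surface removal ever runs.
#include <cstdio>

struct Vec3  { float x, y, z; };
struct Plane { float a, b, c, d; };

// Signed distance from a point to the plane (negative = behind the plane).
float signedDistance(const Plane& p, const Vec3& v) {
    return p.a * v.x + p.b * v.y + p.c * v.z + p.d;
}

// Because the benchmark's camera never leaves its rail, a driver could
// hard-code planes like this and skip geometry the test meant to measure.
bool culledByStaticPlane(const Plane& p, const Vec3& center, float radius) {
    return signedDistance(p, center) < -radius;  // sphere fully behind plane
}

int main() {
    const Plane staticClip{0.0f, 0.0f, 1.0f, -5.0f};  // the plane z = 5
    const Vec3  object{0.0f, 0.0f, 1.0f};             // object at z = 1
    if (culledByStaticPlane(staticClip, object, 1.0f))
        std::printf("object skipped before HSR: workload silently reduced\n");
    return 0;
}
[/code]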

So...*ahem*...that isn't cheating?
05-18-03, 04:49 PM   #72
ChrisRay

Quote:
Originally posted by jjjayb
If I were an NV35 owner I wouldn't be complaining (except about 3dmark), but if I were an NV30 owner I would.

Why? They improved AF quality tenfold and lost little to no performance. There is way more to these drivers than just 3dmark.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members.