Old 09-13-03, 11:00 AM   #13
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by Sazar
iq with 9800pro/cat 3.7



iq with 5600u/det 45.33



iq with 5600u/det 51.75



if the images are too large mods please delete tags and leave img links...

cheers

I honestly don't see a huge difference here, not one that would distract me during gameplay anyway. Any way someone could flash these together? That would certainly make seeing the differences easier.


*edit* OK, I do see a slight difference in ground texture between the ATI card and the Nvidia card, but I'm not seeing a huge difference between the two Det versions in that scene.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members

Last edited by ChrisRay; 09-13-03 at 11:04 AM.
ChrisRay is offline   Reply With Quote
Old 09-13-03, 11:02 AM   #14
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Cool Heh-heh-heh.

Quote:
Zardon at DriverHeaven:
Now, I'm sure most of you have read Gabe's recent comments regarding the Detonator 51.75s, and Nvidia's official response, but I really do have to say, having seen this first hand, it confirms to both myself and Veridian that the new Detonators are far from a high-quality IQ set. A lot of negative publicity is currently surrounding Nvidia, and here at DriverHeaven we like to remain as impartial and open-minded as we possibly can, but after reading all the recent articles, such as here, coming from good sources, and experiencing this ourselves first hand, I can no longer recommend an Nvidia card to anyone. I'll be speaking with Nvidia about this over the coming days, and if I can make anything public I will.
(I won't say it, honest. )
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote
Old 09-13-03, 11:05 AM   #15
Bopple
Registered User
 
Join Date: Mar 2003
Posts: 208
Default

Quote:
Originally posted by ChrisRay
I honestly don't see a huge difference here, not one that would distract me during gameplay anyway. Any way someone could flash these together? That would certainly make seeing the differences easier.
No huge difference, I agree. But what is the point of "Highest Quality" then?
Every company may adopt the same strategy: just subtle cheating.
__________________
Handsome fighter never loses battle.
Bopple is offline   Reply With Quote
Old 09-13-03, 11:06 AM   #16
Sazar
Sayonara !!!
 
Sazar's Avatar
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default

Quote:
Originally posted by ChrisRay
I honestly don't see a huge difference here, not one that would distract me during gameplay anyway. Any way someone could flash these together? That would certainly make seeing the differences easier.
Well, I am sure you notice the difference between the ATI and Nvidia cards' rendering... just look at the foreground...

Further, from the 4x-series to the 5x-series Dets, notice the blurrier background, and parts of the foreground as well...

There is something going on for sure... it's like the 3DMark03 sky issue people had, with the Dets not rendering properly...

DH has better pictures, and both of the testers have stated clearly that the IQ difference is noticeable...

Also of note: the IQ of the Det 51.75 drivers may not be representative, if what is claimed is true about Nvidia's drivers detecting screen grabs and kicking up the quality in the shot compared to in-game viewing...
Sazar is offline   Reply With Quote
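The side-by-side comparisons being debated here can be automated rather than eyeballed. As a minimal sketch (pure Python, with tiny made-up pixel values standing in for real screenshots, which in practice you would load with an image library), the per-pixel absolute difference between two captures of the same frame makes subtle IQ changes easy to quantify:

```python
def diff_stats(img_a, img_b):
    """Compare two equal-sized grayscale images (nested lists of 0-255
    ints) and return (peak, mean) absolute per-pixel difference."""
    assert len(img_a) == len(img_b) and len(img_a[0]) == len(img_b[0])
    peak, total, count = 0, 0, 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            d = abs(pa - pb)
            peak = max(peak, d)
            total += d
            count += 1
    return peak, total / count

# Tiny synthetic example: the second "capture" is slightly darker in spots.
shot_a = [[100, 120], [140, 160]]
shot_b = [[95, 120], [140, 150]]
print(diff_stats(shot_a, shot_b))  # -> (10, 3.75)
```

A peak of 0 would mean the two drivers rendered the frame identically; a consistently non-zero mean across many frames points at a real rendering change rather than capture noise.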
Old 09-13-03, 11:09 AM   #17
Joe DeFuria
Registered User
 
Join Date: Oct 2002
Posts: 236
Default Re: Heh-heh-heh.

Quote:
Originally posted by digitalwanderer
(I won't say it, honest. )
Actually, you can't say how big the difference is until you see it in motion. Things like highlights due to dynamic range (which appears to be what's going on here... not too dissimilar to Half-Life 2's high-dynamic-range effects) can have a drastic impact on quality in motion.

Both 3DGPU and DriverHeaven commented on the obvious difference in image quality.
Joe DeFuria is offline   Reply With Quote
Old 09-13-03, 11:10 AM   #18
Hanners
Elite Bastard
 
Hanners's Avatar
 
Join Date: Jan 2003
Posts: 984
Default

Oh no, we're not going to have the 'If you can't see it, then it's okay' argument again are we?

This is a benchmark!!! Lowering image quality for framerate gains in a benchmark is a big, big no-no.

Makes you wonder what happened to these, doesn't it:



__________________
Owner / Editor-in-Chief - Elite Bastards
Hanners is offline   Reply With Quote
Old 09-13-03, 11:15 AM   #19
Sazar
Sayonara !!!
 
Sazar's Avatar
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default Re: Re: Heh-heh-heh.

Quote:
Originally posted by Joe DeFuria
Actually, you can't say how big the difference is until you see it in motion. Things like highlights due to dynamic range (which appears to be what's going on here... not too dissimilar to Half-Life 2's high-dynamic-range effects) can have a drastic impact on quality in motion.

Both 3DGPU and DriverHeaven commented on the obvious difference in image quality.
I wish they would show video captures of the differences...

Maybe then it would be easier to see... though it would hammer their server, I suppose.
Sazar is offline   Reply With Quote
Old 09-13-03, 11:18 AM   #20
reever2
Registered User
 
reever2's Avatar
 
Join Date: Apr 2003
Posts: 489
Default

Remember 3DMark? Darker sky, darker water, darker textures? They must be doing the exact same thing with Aquamark that they did back then.
reever2 is offline   Reply With Quote
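The "darker sky, darker water" claim is also easy to check numerically: the average luma over the same region should drop if one driver renders it darker. A small sketch of that check, using hypothetical RGB samples and the standard Rec. 601 luma weights (the second set of samples is deliberately constructed to be 10% darker):

```python
def mean_luma(pixels):
    """Average Rec. 601 luma of a flat list of (r, g, b) tuples."""
    return sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels) / len(pixels)

# Hypothetical sky samples from two captures of the same frame;
# the second set is exactly 10% darker across the board.
sky_ref = [(100, 140, 200), (110, 150, 210)]
sky_test = [(90, 126, 180), (99, 135, 189)]

drop = 1 - mean_luma(sky_test) / mean_luma(sky_ref)
print(f"luma drop: {drop:.1%}")  # -> luma drop: 10.0%
```

Run over the same pixel region in screenshots from each driver set, a consistent drop like this would corroborate the darker-texture reports, while a drop near 0% would clear them.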

Old 09-13-03, 11:18 AM   #21
particleman
Registered User
 
Join Date: Aug 2002
Posts: 370
Default

Quote:
Originally posted by ChrisRay
I honestly don't see a huge difference here, not one that would distract me during gameplay anyway. Any way someone could flash these together? That would certainly make seeing the differences easier.


*edit* OK, I do see a slight difference in ground texture between the ATI card and the Nvidia card, but I'm not seeing a huge difference between the two Det versions in that scene.
I couldn't see much of a difference initially either. But then I put it in a slide show and saw the differences in lighting, and the small differences in the textures. The lighting is the biggest thing IMO, since that would probably be noticed even more during gameplay than in static screenshots. It definitely looks like the GFFX is skimping on some lighting effects. The people at 3DGPU have seen it in motion, and if they claim they can see a big difference, that means a lot to me, since 3DGPU was/is an nVidia fansite, while DriverHeaven has always had closer connections to ATi. I have no issue with nVidia trying to find the best combo of IQ and performance, but they need to give users the option to disable it if they don't like the IQ loss, and they definitely shouldn't be doing it in a benchmark. This goes against the very rules nVidia outlined as cheating a while ago.
particleman is offline   Reply With Quote
Old 09-13-03, 11:25 AM   #22
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by particleman
I couldn't see much of a difference initially either. But then I put it in a slide show and saw the differences in lighting, and the small differences in the textures. The lighting is the biggest thing IMO, since that would probably be noticed even more during gameplay than in static screenshots. It definitely looks like the GFFX is skimping on some lighting effects. The people at 3DGPU have seen it in motion, and if they claim they can see a big difference, that means a lot to me, since 3DGPU was/is an nVidia fansite, while DriverHeaven has always had closer connections to ATi. I have no issue with nVidia trying to find the best combo of IQ and performance, but they need to give users the option to disable it if they don't like the IQ loss, and they definitely shouldn't be doing it in a benchmark. This goes against the very rules nVidia outlined as cheating a while ago.
Hey, I agree. I'm not even saying "if you can't see it, it doesn't make a difference". I'm just looking for the differences.

I really need to see it in motion. Perhaps my eyes are just not what they used to be: I see very little difference between the Det versions, and a rather drastic difference between the Nvidia and ATI shots.

I wish I had a GeForce FX card to test it with; then I could tell you.
ChrisRay is offline   Reply With Quote
Old 09-13-03, 11:26 AM   #23
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Exclamation Wait a second....

Candyman, CANDYMAN!!!!

(The Dig runs across the forum and grabs up Candyman in a HUGE warm fuzzy hug, spinning him around repeatedly in his exuberance!)

By the gods and all that I hold holy it has been a while and it is DAMNED good to see ya friend! How is life treating you?
digitalwanderer is offline   Reply With Quote
Old 09-13-03, 11:31 AM   #24
Hanners
Elite Bastard
 
Hanners's Avatar
 
Join Date: Jan 2003
Posts: 984
Default

Quote:
Originally posted by ChrisRay
I wish I had a GeForce FX card to test it with; then I could tell you.
I have the card, I just need the benchmark!
Hanners is offline   Reply With Quote
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.