Old 07-24-03, 06:26 PM   #61
Ruined
Registered User
 
Ruined's Avatar
 
Join Date: Jul 2003
Posts: 2,447
Default Re: No, there can be only one.

Quote:
Originally posted by digitalwanderer
You are in error; I don't get what you're trying to prove anymore, other than "nVidia and [T] aren't as bad as all you guys say" when we all know damned well they are.
If I'm in error, I'm not sure you understand my argument, which basically consists of:

* If an optimization is not noticeable during gameplay, then it is welcome, even if studying screenshots reveals minor IQ differences.

* HardOCP is a reputable pro site that was asked to run a controlled A/B test of IQ between two cards; it did so and found little difference in screenshots and no difference during actual gameplay. It is also the most comprehensive controlled A/B comparison between these two cards and this game that I know of on the net. Since a pro site that was looking for differences could find little to none, I am choosing not to worry about the optimizations for my card, and will worry more about winning the actual game.

* I prefer Nvidia making the performance/IQ/compatibility settings at an application-specific level, to an extent, with my Nvidia hardware, as opposed to manually changing my settings and experimenting to see what gives the best results for each particular game. It makes gaming quicker and less of a hassle.

How any of these points can be declared right or wrong is beyond me.
__________________
We're all in it together.

Intel Core 2 Quad Q6700 2.66GHz CPU | Intel G965WH mobo | 8GB (4x2GB) DDR2-667mhz CAS5 RAM (1066MHz FSB) | BFG GeForce 285 GTX OC 1GB | Dell E228WFP 22" DVI-HDCP LCD Monitor | 1TB Western Digital RE3 SATA2 Main Drive | 500GBx2 Western Digital RE3 SATA2 Scratch Drives in RAID0 | Western Digital RE3 1TB SATA2 Media Drive | External 2TB Western Digital MyBook Backup Drive | Adaptec eSATA 3.0gbps PCI-E interface | Sandisk External 12-in-1 Flash Card Reader | LG GGC-H20L HD DVD/BD reader, DVD writer | LG GGW-H20L HD DVD/BD reader, DVD/BD writer | Microsoft E4000 Ergonomic Keyboard | Logitech Trackman Wheel | Antec P182 ATX Case | Thermaltake ToughPower XT 850w modular PSU | KRK RP-8 Rokit Studio Monitors | Windows Vista Ultimate x64

Last edited by Ruined; 07-24-03 at 06:39 PM.
Old 07-24-03, 06:28 PM   #62
reever2
Registered User
 
reever2's Avatar
 
Join Date: Apr 2003
Posts: 489
Default Re: Re: No, there can be only one.

Quote:
Originally posted by Ruined

How either of these points can be declared right or wrong is beyond me.
So that means nobody can question them, and they're right by default because you and [H] say they are?
Old 07-24-03, 06:33 PM   #63
Ruined
Registered User
 
Ruined's Avatar
 
Join Date: Jul 2003
Posts: 2,447
Default Re: Re: Re: No, there can be only one.

Quote:
Originally posted by reever2
So that means nobody can question them and are right by default because you and [H] say they are?
Note that I said 'right or wrong', not just 'wrong'.
Old 07-24-03, 06:34 PM   #64
walkndude
Guest
 
Posts: n/a
Default

That's obviously not what he's saying, reever...

Sheesh, it's just the opposite.
Old 07-24-03, 06:41 PM   #65
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default Re: Re: No, there can be only one.

Quote:
Originally posted by Ruined
* If an optimization is not noticeable during gameplay, then it is welcome, even if studying stills reveals IQ flaws.
WRONG! Well, not really...but totally off-topic. This is about comparing two cards fairly, which nVidia bypasses no matter how little change there is. If there is ANY change to the output, it invalidates the test...'specially when it would be pretty easy to fairly compare the two pieces of hardware, no matter how complicated Seig Kyle makes it sound.

Enable trilinear on the NV35 and re-bench it, simple.
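To make the terms concrete, here is a minimal C sketch of the difference under discussion: full trilinear blends two adjacent mip levels across the whole LOD transition, while a reduced, "brilinear"-style filter only blends inside a narrow band and falls back to plain bilinear elsewhere. This is an illustration under assumed behavior, not NVIDIA's actual driver logic; the 0.25 band width and the function names are made up for the example.

[code]
/* Sketch only (not NVIDIA driver code): full trilinear versus a
 * hypothetical reduced-band ("brilinear"-style) mip blend. */
#include <stdio.h>
#include <math.h>

/* Full trilinear: blend weight between mip N and N+1 is simply the
 * fractional part of the LOD. */
static double trilinear_weight(double lod)
{
    return lod - floor(lod);
}

/* Reduced filter: hold the weight at 0 or 1 outside a window centred
 * on the transition; band = 1.0 degenerates to full trilinear and
 * band = 0.0 to pure bilinear with a hard mip switch. */
static double reduced_weight(double lod, double band)
{
    double f  = lod - floor(lod);
    double lo = 0.5 - band / 2.0;
    double hi = 0.5 + band / 2.0;
    if (f <= lo) return 0.0;
    if (f >= hi) return 1.0;
    return (f - lo) / band;
}

int main(void)
{
    /* Sweep one LOD transition and print both blend weights. */
    for (double lod = 0.0; lod <= 1.0; lod += 0.125)
        printf("lod %.3f  trilinear %.3f  reduced %.3f\n",
               lod, trilinear_weight(lod), reduced_weight(lod, 0.25));
    return 0;
}
[/code]

The narrower the band, the more pixels get the cheaper bilinear path, which is why this buys framerate at the cost of a more visible mip transition.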

Quote:
* HardOCP is a reputable pro site that was asked to run a controlled A/B test of IQ between two cards; it did so and found little difference in screenshots and no difference during actual gameplay. It is also the most comprehensive controlled A/B comparison between these two cards that I know of on the net. Since a pro site that was looking for differences could find little to none, I am choosing not to worry about the optimizations for my card, and will worry more about winning the actual game.
WRONG! They found little difference during gameplay, and Kyle's & Bent's words don't really have much weight left in 'em when they say things like, "Trust us, you can't tell the difference 'tween almost-trilinear and trilinear so it's okey-dokey to compare them."

Quote:
* I prefer Nvidia making the performance/IQ/compatibility settings at an application-specific level, to an extent, with my Nvidia hardware, as opposed to manually changing my settings and experimenting to see what gives the best results for each particular game. It makes gaming quicker and less of a hassle.
WRONG! Ok, your preference is your business...but it's wrong that they don't leave control of it in the hands of the user when they're SAYING they are and their drivers are overriding the user's preference.

Quote:
* How any of these points can be declared right or wrong is beyond me.
Easy, all wrong.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
Old 07-24-03, 06:44 PM   #66
walkndude
Guest
 
Posts: n/a
Default

Do you realize why you're wasting your time yet, Ruined?
Old 07-24-03, 06:52 PM   #67
Ruined
Registered User
 
Ruined's Avatar
 
Join Date: Jul 2003
Posts: 2,447
Default

Quote:
Originally posted by walkndude
Do you realize why you're wasting your time yet, Ruined?


Not trying to convince the world of anything here, but that doesn't mean I won't post my viewpoints and argue them.
Old 07-24-03, 07:02 PM   #68
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by Ruined


Not trying to convince the world of anything here, but that doesn't mean I won't post my viewpoints and argue them.
You keep setting them up, I'll keep knocking 'em home.

I got time, and it's too bloody hot in me house to really game yet. (I gotta wait an hour for the cool breeze to kick in...I HATE summer!)

Old 07-24-03, 07:09 PM   #69
aapo
Registered User
 
aapo's Avatar
 
Join Date: May 2003
Location: Finland
Posts: 273
Default Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by digitalwanderer
No, this issue is danced around and FUDed up by aapo there.
I beg your pardon, what exactly am I FUDding there?

Do you claim that:
a) the size of the non-partial-precision DX9 pixel shader FP data type is not 32 bits,
b) it's stupid to assume that nVidia and ATi aren't going to implement combined pixel and vertex shaders at FP32 precision in future chipsets,
c) there is no 16-bit partial-precision data type in DX9 pixel shaders,
d) pixel shader calculations with different precisions aren't equally fast on NV3X when only one temp register per fragment program is used, or
e) something else?



FFS, why do you think nVidia put FP32 support in their chip, then? They obviously wanted to minimize the architectural differences between this generation and the next, and designed an FP pipeline that could already deal with 32-bit data in both PS and VS. Unfortunately, FP PS is dog slow on all NV3X, but that's beside the point when discussing architecture and features.



32-bit floats with FP24 precision are not here to stay. Only a fanATic would claim otherwise. Heck, 24 is not even an even number after >> 3 (that's three bytes).
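To put rough numbers on the precisions being argued about, here is a back-of-the-envelope C sketch that rounds a value to the 10-, 16-, and 23-bit mantissas of FP16, FP24, and FP32. It deliberately ignores the formats' differing exponent ranges and is only an illustration, not D3D9 reference-rasterizer code.

[code]
/* Illustration only: how much precision a 10-, 16-, or 23-bit
 * mantissa (FP16, FP24, FP32) actually keeps. Exponent-range
 * differences between the formats are ignored. */
#include <stdio.h>
#include <math.h>

/* Round x to a binary float with 'mant_bits' explicit mantissa bits
 * (plus the implicit leading 1). */
static double round_mantissa(double x, int mant_bits)
{
    int e;
    if (x == 0.0) return 0.0;
    double m = frexp(x, &e);                  /* x = m * 2^e, 0.5 <= m < 1 */
    double scale = ldexp(1.0, mant_bits + 1); /* keep mant_bits+1 significant bits */
    return ldexp(round(m * scale) / scale, e);
}

int main(void)
{
    const double x      = 1.0 / 3.0;          /* never exact in binary */
    const int    bits[] = { 10, 16, 23 };     /* FP16, FP24, FP32 */
    const char  *name[] = { "FP16", "FP24", "FP32" };
    for (int i = 0; i < 3; i++) {
        double q = round_mantissa(x, bits[i]);
        printf("%s: %.10f  (relative error %.2e)\n",
               name[i], q, fabs(q - x) / x);
    }
    return 0;
}
[/code]

Each wider mantissa keeps roughly two more decimal digits than the last, which is the whole gap between "fast" FP16 and "full" FP32 in this debate.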

__________________
no sig.
Old 07-24-03, 07:23 PM   #70
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by aapo
I beg your pardon, what exactly am I FUDding there?

Do you claim that:
a) the size of the non-partial-precision DX9 pixel shader FP data type is not 32 bits,
b) it's stupid to assume that nVidia and ATi aren't going to implement combined pixel and vertex shaders at FP32 precision in future chipsets,
c) there is no 16-bit partial-precision data type in DX9 pixel shaders,
d) pixel shader calculations with different precisions aren't equally fast on NV3X when only one temp register per fragment program is used, or
e) something else?



FFS, why do you think nVidia put FP32 support in their chip, then? They obviously wanted to minimize the architectural differences between this generation and the next, and designed an FP pipeline that could already deal with 32-bit data in both PS and VS. Unfortunately, FP PS is dog slow on all NV3X, but that's beside the point when discussing architecture and features.



32-bit floats with FP24 precision are not here to stay. Only a fanATic would claim otherwise. Heck, 24 is not even an even number after >> 3 (that's three bytes).

Hey, that's all well and good and I ain't arguing with you over it...my point is that it's a bit irrelevant and moot to the discussion at hand.

And the fact still remains that the minimum precision required for DX9 is FP24, which nVidia chose not to implement in favor of their FP16/FP32 arrangement. They took a gamble that they could change the DX9 specs in their favor, and they lost.

(Yes, I know all about the partial-precision clause that allows for FP16 calls...but that was fudged in later.)

But it ain't really applicable to the fact that the 44.03 drivers can't do AF in UT2k3, is it?
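One concrete way to see why that FP16-versus-FP24 floor matters: the gap between adjacent representable values grows with magnitude, so a 10-bit mantissa gets coarse quickly at texel-space coordinates. A small C sketch follows; the 2048-texel texture is a hypothetical example picked purely for illustration, and exponent-range limits of the real formats are ignored.

[code]
/* Illustration only: the gap ("ULP") between adjacent representable
 * values at a given magnitude, for the mantissa sizes of FP16, FP24
 * and FP32. */
#include <stdio.h>
#include <math.h>

/* Spacing of representable values around x for a format with
 * 'mant_bits' explicit mantissa bits. */
static double ulp_at(double x, int mant_bits)
{
    int e;
    frexp(x, &e);                      /* x = m * 2^e with 0.5 <= m < 1 */
    return ldexp(1.0, e - 1 - mant_bits);
}

int main(void)
{
    const double coord = 2048.0;       /* texel-space coordinate, arbitrary example */
    printf("FP16 step near %.0f texels: %g\n", coord, ulp_at(coord, 10));
    printf("FP24 step near %.0f texels: %g\n", coord, ulp_at(coord, 16));
    printf("FP32 step near %.0f texels: %g\n", coord, ulp_at(coord, 23));
    return 0;
}
[/code]

Near a coordinate of 2048, FP16's step is a full 2 texels, so it can't even address individual texels there, while FP24 (about 0.03 texels) and FP32 (about 0.0002 texels) both resolve comfortably below one.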
Old 07-24-03, 07:49 PM   #71
aapo
Registered User
 
aapo's Avatar
 
Join Date: May 2003
Location: Finland
Posts: 273
Default Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by digitalwanderer
And the fact still remains that the minimum required for DX9 is FP24 which nVidia chose not to do in favor of their FP16/FP32 arrangement.
Yes, but as I've said before, the data type is always 32 bits even in ATi's FP24 mode. The MS reference rasterizer uses 32 bits with FP32 precision, so nVidia's FP32 is goodness and that's why they implemented it. The only evil thing is FP16 (now that FX12 is maybe gone), and it's used only because the nVidia drivers force it in order to compete with ATi. There's nothing wrong with the NV3X hardware besides the fact that it's slower than ATi's R3X0. nVidia's problems are mainly in their PR and DCMM teams (Driver-Cheating Monkey-Men).

Quote:
But it ain't really applicable to the fact that the 44.03 drivers can't do AF in UT2k3, is it?
Yup, but my original post quoted here wasn't in this thread. It was a discussion about "can nVidia ever remove these and other 'optimizations' from their drivers even if they wanted to?", which obviously was connected to hardware features 'forcing' nVidia to implement the said pixel shader optimizations.

EDIT: I'm not trying to say there is something wrong or inferior about FP24 in DX9; it's pure 'goodness' too. I'm trying to say that FP24 will be gone in DX10 (well, mebbe DX11) and all will be happily FP32 then. Of course, none of the current cards will be DX10 or DX11 compliant, so there really is no difference now.

Last edited by aapo; 07-24-03 at 07:57 PM.
Old 07-24-03, 07:58 PM   #72
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Thumbs up I love happy endings.

Quote:
Originally posted by aapo
Yes, but as I've said before, the data type is always 32 bits even in ATi's FP24 mode. The MS reference rasterizer uses 32 bits with FP32 precision, so nVidia's FP32 is goodness and that's why they implemented it. The only evil thing is FP16 (now that FX12 is maybe gone), and it's used only because the nVidia drivers force it in order to compete with ATi. There's nothing wrong with the NV3X hardware besides the fact that it's slower than ATi's R3X0. nVidia's problems are mainly in their PR and DCMM teams (Driver-Cheating Monkey-Men).
I have no argument with that at all, in fact I agree with it.

Quote:
Yup, but my original post quoted here wasn't in this thread. It was a discussion about "can nvidia ever remove these and other 'optimizations' from their drivers even if they wanted to?", which obviosly was connected to hardware features 'forcing' nVidia to implement the said pixel shader optimizations.
Yup, and your post sort of fit over there...although it STILL didn't have enough "forcing FP16 by application detection is a no-no though" for my personal liking.

But it doesn't apply to this debate; he's trying to side-track the issue.