nV News Forums > Graphics Card Forums > NVIDIA GeForce 7, 8, And 9 Series

Old 07-23-03, 01:30 PM   #37
Matthyahuw
Registered User
 
 
Join Date: Jul 2002
Location: AZ
Posts: 919
Default

Quote:
Originally posted by creedamd
I can't believe that a technical editor made such a blind comment. Wow. Just wow.
LOL
You guys haven't gotten it by now? I'm just trying to play devil's advocate as everyone else is playing the "bash nvidia" game. I always root for the underdog, even if it's wrong...

Someone else has to play the other side; you can't have a one-sided argument, it would just make a big forum full of rants, and who wants to read a bunch of rants? I know I don't...
A few months ago I used to read EVERY post here (as I used to be a mod), but I can barely bring myself to read 5% of the posts now, it's all the same thing: nVIDIA cheated, they are 'teh suck'

It's old, and just like no one in the real world cares about not finding the WMD as long as Saddam's gone, no one cares about 'optimizations' as it looks just fine to 99.99999% of us...just like Brent and Kyle, they came to the same conclusion...
__________________
Shalom!
Old 07-23-03, 01:44 PM   #38
Deathlike2
Driver Reinstall Addict
 
Join Date: Apr 2003
Location: Nowhere Near NVidia or NVNews
Posts: 336
Default

I'm not taking sides on whether the UT2003 optimizations are valid..

However, these optimizations CANNOT be used for benchmarking.. an apples-to-apples comparison has to be made..

HardOCP cannot justify their benchmark values as valid in comparison to ATI's benchmark values... full trilinear MUST BE compared to full trilinear

Every other analysis in that UT article was OK.... it just didn't put into perspective how these optimizations affect the validity of their benchmarks...

Since it gets fuzzy over the MSAA or AF implementation.. the numbers you are judging against SHOULD ALWAYS be backed up by pictures of the in-game output (videos if necessary) AND the control panel settings used (making sure application mode is checked, or something like that)..

What people see is pretty subjective.. what people can force on their hardware is objective...

As one of the article's conclusions suggests.. it should've been an option to "enable these optimizations".. it shouldn't be forced upon you by NVidia..

Quality = Quality = Quality
and
Trilinear = Trilinear = Trilinear

No other questions asked...
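For what it's worth, the distinction being argued about can be sketched in a few lines. This is a minimal toy model, assuming each mipmap level has been reduced to a single representative texel value (real hardware filters whole textures, and the function names here are illustrative): bilinear-only sampling snaps to one mip level, while full trilinear blends the two nearest levels.

```python
import math

def bilinear(mip_levels, lod):
    """Bilinear-only: sample just the nearest mipmap level, which
    produces a visible band where the level switches."""
    # Python 3 round() uses banker's rounding; close enough for a sketch.
    return mip_levels[min(round(lod), len(mip_levels) - 1)]

def trilinear(mip_levels, lod):
    """Full trilinear: linearly blend the two nearest mipmap levels so
    the transition is smooth, at the cost of extra texture reads."""
    lo = math.floor(lod)
    hi = min(lo + 1, len(mip_levels) - 1)
    frac = lod - lo
    return mip_levels[lo] * (1 - frac) + mip_levels[hi] * frac
```

With levels [1.0, 0.5, 0.25, 0.125] and lod = 1.5, trilinear returns the halfway blend 0.375 while bilinear snaps to a single level's value; that snap is the visible mip-transition line the "full trilinear MUST BE compared to full trilinear" argument is about.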
__________________
PR = crap
War Against FUD
What is FUD? - http://www.geocities.com/SiliconValley/Hills/9267/fuddef.html

Last edited by Deathlike2; 07-23-03 at 01:47 PM.
Old 07-23-03, 03:11 PM   #39
creedamd
 
 
Join Date: Oct 2002
Posts: 597
Default

Quote:
Originally posted by Matthyahuw
LOL
You guys haven't gotten it by now? I'm just trying to play devil's advocate as everyone else is playing the "bash nvidia" game. I always root for the underdog, even if it's wrong...

Someone else has to play the other side; you can't have a one-sided argument, it would just make a big forum full of rants, and who wants to read a bunch of rants? I know I don't...
A few months ago I used to read EVERY post here (as I used to be a mod), but I can barely bring myself to read 5% of the posts now, it's all the same thing: nVIDIA cheated, they are 'teh suck'

It's old, and just like no one in the real world cares about not finding the WMD as long as Saddam's gone, no one cares about 'optimizations' as it looks just fine to 99.99999% of us...just like Brent and Kyle, they came to the same conclusion...
What is happening is what pushes better hardware. I am sure that is what you want, right? I get excited when technology advances. Right now Nvidia is behind ATI, and they are trying every little trick to "simulate" that they are neck and neck, even getting to some of the review sites. Forums are the best place to find out about these little tricks and discuss them.

A statement like the one you made does not help nvidia but actually just damages your image. I know that a lot of posts here are pro-ATI, but that's because the facts back them up. We all wish nvidia would get back into the game so we'd have something to brag about as well.
__________________
System 1: 2500xp@3200|1gigHyperXPC4000|AbNF7s|Fortissimo7.1|SonyDJV700|DvdR+&CDRW|160gbHD|9800pro|21"IBM-P260

System2: 2500xp@3200|abitNF7-s|512XMS|9700pro|160Gbhd

System3: 2400xp|512xms|Epox 8rda+|9500pro
Old 07-23-03, 04:07 PM   #40
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default

Quote:
Originally posted by creedamd
A statement like the one you made does not help nvidia but actually just damages your image. I know that a lot of posts here are pro-ATI, but that's because the facts back them up. We all wish nvidia would get back into the game so we'd have something to brag about as well.
Actually, his comment is one of the most rational in this thread, and probably bolsters his image if anything. IQ differences that can only be detected by zooming in on pictures with paint programs, or by any other method than looking at the game in action with your naked eye, are 100% useless in terms of comparing in-game IQ differences between the cards. They are useful for those curious about how each card handles a particular scene, filtering method, etc., but as far as in-game action goes, they are useless. Why? Because if you can't see it with your naked eye while playing the game, then how can you make the argument that the IQ differences would make a difference in the game's graphical quality? Again, the idea behind a game is to actually play it, not pick it apart as if it were some sort of digital art masterpiece, then wax poetic about which card does the best filtering in scene X at time Y when you zoom in at 4x and there is a full moon outside. As for 'facts' backing up posts here, the basic theorem behind many of them is flawed. And, to address your last point, Nvidia got back in the game once the FX5900 hit the street.

Re: why did I bring up UltraShadow? Because it's an optimization ATI could likely make in software, and it pertained to the analogy I made in that post. No 'backup arguments' needed, as the original sticks: IQ differences that can't be seen while actually playing are meaningless to the game player.
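As an aside, the "zooming in on pictures with paint programs" comparison being dismissed here boils down to simple pixel arithmetic. A hedged sketch, assuming two screenshots reduced to same-sized grayscale nested lists (the function name and data are illustrative, not from any tool mentioned in this thread):

```python
def mean_abs_diff(img_a, img_b):
    """Average absolute per-pixel difference between two same-sized
    grayscale images (nested lists of numbers); 0.0 means identical."""
    total = 0
    count = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count
```

A nonzero result proves the two cards rendered different pixels even when the difference is too subtle to notice in motion, which is exactly the disagreement in this thread: whether a measurable difference that isn't visible at playing speed should count.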
__________________
We're all in it together.

Intel Core 2 Quad Q6700 2.66GHz CPU | Intel G965WH mobo | 8GB (4x2GB) DDR2-667mhz CAS5 RAM (1066MHz FSB) | BFG GeForce 285 GTX OC 1GB | Dell E228WFP 22" DVI-HDCP LCD Monitor | 1TB Western Digital RE3 SATA2 Main Drive | 500GBx2 Western Digital RE3 SATA2 Scratch Drives in RAID0 | Western Digital RE3 1TB SATA2 Media Drive | External 2TB Western Digital MyBook Backup Drive | Adaptec eSATA 3.0gbps PCI-E interface | Sandisk External 12-in-1 Flash Card Reader | LG GGC-H20L HD DVD/BD reader, DVD writer | LG GGW-H20L HD DVD/BD reader, DVD/BD writer | Microsoft E4000 Ergonomic Keyboard | Logitech Trackman Wheel | Antec P182 ATX Case | Thermaltake ToughPower XT 850w modular PSU | KRK RP-8 Rokit Studio Monitors | Windows Vista Ultimate x64
Old 07-23-03, 04:13 PM   #41
creedamd
 
 
Join Date: Oct 2002
Posts: 597
Default

You have to be kidding; there is a night and day difference between bilinear and trilinear. You obviously have a hidden agenda, or are just uninformed, if you believe what you say. The 5900 has had too many drawbacks to call it real competition for ATI at this point. I am not going to point them out, because it would be beating a dead horse and there is proof everywhere. Search for yourself; I have, and obviously you need to spend some time and do so as well.
Old 07-23-03, 04:24 PM   #42
reever2
Registered User
 
 
Join Date: Apr 2003
Posts: 489
Default

Quote:
Originally posted by Ruined
Actually, his comment is one of the most rational in this thread, and probably bolsters his image if anything. IQ differences that can only be detected by zooming in on pictures with paint programs, or by any other method than looking at the game in action with your naked eye, are 100% useless in terms of comparing in-game IQ differences between the cards. They are useful for those curious about how each card handles a particular scene, filtering method, etc., but as far as in-game action goes, they are useless. Why? Because if you can't see it with your naked eye while playing the game, then how can you make the argument that the IQ differences would make a difference in the game's graphical quality? Again, the idea behind a game is to actually play it, not pick it apart as if it were some sort of digital art masterpiece, then wax poetic about which card does the best filtering in scene X at time Y when you zoom in at 4x and there is a full moon outside. As for 'facts' backing up posts here, the basic theorem behind many of them is flawed. And, to address your last point, Nvidia got back in the game once the FX5900 hit the street.
This post deserves a repost of this picture



Even Behemoth and Nv40 could tell the difference between these two shots
Old 07-23-03, 06:19 PM   #43
extreme_dB
Registered User
 
Join Date: Jun 2003
Posts: 337
Default

Quote:
Originally posted by Ruined
Re: why did I bring up UltraShadow? Because it's an optimization ATI could likely make in software, and it pertained to the analogy I made in that post. No 'backup arguments' needed, as the original sticks: IQ differences that can't be seen while actually playing are meaningless to the game player.
Instead of repeating myself, I'll just try a different angle now. Why are you generalizing everyone as "the game player"? Aren't the people who are complaining game players too?

Everyone has their own subjective preferences. Why are they being forced to make a slight ("unnoticeable") trade-off for extra performance when they want uncompromising quality?

It's no big deal to you, but obviously it's a big deal to other people. What makes you right? Why can't everyone play the way they want within the hardware's capabilities?

We're arguing for choice, and you're arguing for "it's good enough the way it is", just so Nvidia can look better in benchmarks.
Old 07-23-03, 06:39 PM   #44
MikeC
Administrator
 
 
Join Date: Jan 1997
Location: Virginia
Posts: 6,681
Default Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by Ruined
So, basically, it appears the optimization many have been complaining about is a non-issue.
Sssshhhh. Let's pretend this never happened

Old 07-23-03, 07:03 PM   #45
Sazar
Sayonara !!!
 
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by MikeC
Sssshhhh. Let's pretend this never happened
a man of few words... but damn... when you post... it hurts... my guts

that's a LOW blow...

/me loves it..

Old 07-23-03, 08:41 PM   #46
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Ruined
Re: why did I bring up UltraShadow? Because it's an optimization ATI could likely make in software, and it pertained to the analogy I made in that post. No 'backup arguments' needed, as the original sticks: IQ differences that can't be seen while actually playing are meaningless to the game player.
Sure, they might be able to do it in software, but it would almost certainly not work out. UltraShadow increases performance but does not change IQ, and doing it in software would probably be slower than just rendering the image without it.

As for your continued asinine comments: there have been screenshots posted that weren't zoomed in and that showed the issue perfectly. It is exactly this point that begs the question of what Brent was thinking. He didn't post any shots that showed the issue clearly. Why not?

He didn't compare the GeForce FX's UT2003 filtering hack with ATI's full trilinear. Again, why not? Is he just incompetent? Or did he want to make the issue look as much like a non-issue as possible?
Old 07-23-03, 11:53 PM   #47
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default

StealthHawk: No personal attacks please.

Again, the guys at HardOCP pitted an FX5900 with 44.03 drivers against a 9800PRO with the latest drivers and could see zero difference in-game. Regardless of the filtering Nvidia is using with the FX5900 and 44.03 drivers, it looks no different than the filtering being used on the 9800PRO when you are actually playing the game, according to a pro site that was specifically looking for such differences in gameplay. 'nuff said.

reev: Pictures from an article expounding on an FX5600 issue should not be generalized to the entire FX line, and articles that only address how zoomed-in screenshots look compared to one another are nice, but I'm more interested in how the game actually looks and plays, which is what the HardOCP article addressed. I'd read the HOCP article again; it actually makes quite a bit of sense, and it's more concerned with the big picture. Remember, HOCP is the same site that blasted Nvidia for IQ several times within the past few months, gave poor reviews to FX5600 cards, etc., so it's difficult to come up with a conspiracy theory or bias argument for this particular site.

Last edited by Ruined; 07-24-03 at 12:09 AM.
Old 07-24-03, 12:12 AM   #48
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by Ruined
StealthHawk: No personal attacks please.

Again, the guys at HardOCP pitted an FX5900 with 44.03 drivers against a 9800PRO with the latest drivers and could see zero difference in-game. Regardless of the filtering Nvidia is using with the FX5900 and 44.03 drivers, it looks no different than the filtering being used on the 9800PRO when you are actually playing the game, according to a pro site that was specifically looking for such differences in gameplay. 'nuff said.

reev: Pictures from an article expounding on an FX5600 issue should not be generalized to the entire FX line, and articles that only address how zoomed-in screenshots look compared to one another are nice, but I'm more interested in how the game actually looks and plays, which is what the HardOCP article addressed. I'd read the HOCP article again; it actually makes quite a bit of sense, and it's more concerned with the big picture. Remember, HOCP is the same site that blasted Nvidia for IQ several times within the past few months, gave poor reviews to FX5600 cards, etc., so it's difficult to come up with a conspiracy theory or bias argument for this particular site.
Kyle, why don't you just log in under your usual FrgMstr?

(SORRY! But I want to point out in me own defense that I HAVE restrained meself until now.... )
__________________
"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects, creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.