Old 07-24-03, 05:13 PM   #57
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447

Quote:
Originally posted by StealthHawk
I assumed that you said that because you thought it pertained to the UT2003 issue; am I mistaken about this? You have been asserting all along that the issue is NOT visible, right? I am saying that the UT2003 issue IS visible, without said measures listed above. If I misconstrued your intent then I apologize. Otherwise I stand by my assessment.
I don't have an R9800PRO to compare side by side with an FX5900 on different monitors, so I can't personally prove that one card looks inferior to the other. I am going by HOCP's controlled A/B test, which you clearly disagree with, and was just making the point that if an optimization is minor enough to be noticeable only when studying screenshots, as opposed to while playing, then I wouldn't consider it significant. In other words, if you disagreed with the rationale behind the statement I posted, you'd be saying that IQ issues you cannot see during gameplay somehow harm the game's image quality while you are playing.


Re: Brent saying he doesn't see a difference while playing. That statement is no different from a reviewer saying they think ATI's 4x AA looks better than Nvidia's 4x AA. Both are subjective, simply because all IQ evaluations of in-game action are subjective. I don't find it damaging; I find it the norm in IQ evaluations. You have every right to disagree with Brent, but that was his judgement when he had the opportunity to evaluate the cards side by side in a controlled testing environment. It doesn't mean he is automatically wrong, nor does it mean you are automatically wrong if you disagree. However, you should probably at least take his opinion into account - that even a pro reviewer looking for a difference could not detect it during actual gameplay. Brent never said 'HardOCP is right, every other site is wrong'; he just evaluated the situation at ATI's request and posted his findings. It is your choice to agree or disagree with them.


Quote:

Please agree or disagree with the following statements:

Giving the user an IQ/performance tradeoff is a good thing.
Agree - I don't have the cash to buy a $400 video card every 6 months, so when games get too intense I'd like the option to trade off IQ for performance.
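
To make concrete the kind of knob I mean, here's a rough sketch of the trilinear/bilinear tradeoff as it looks at the OpenGL API level. This is just an illustration of the filtering setting an app or control panel can expose, not anything from either vendor's drivers:

Code:
// Sketch only: trilinear filtering blends between mipmap levels and looks
// smoother; bilinear samples just the nearest level and is cheaper.
#include <GL/gl.h>

void setTextureFiltering(GLuint texture, bool fullTrilinear)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    // Magnification filter is the same either way.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Minification filter is where the IQ/performance tradeoff lives.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    fullTrilinear ? GL_LINEAR_MIPMAP_LINEAR    // trilinear
                                  : GL_LINEAR_MIPMAP_NEAREST); // bilinear
}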

Quote:

There is something deceitful about changing filtering in one game and not telling anyone about it, especially when they have been led to believe that the Quality setting in the drivers would perform full trilinear filtering.

It is not a good thing that users who want full trilinear filtering are unable to get it.
I am going to lump these together. I can't make a judgement based on the information available to me. I'm not sure why Nvidia changed the filtering method in UT2k3 from the standard, but I really don't think it's simply for benchmarking reasons - full trilinear shouldn't be such a major performance hit that it makes your card look horrible against the competition, unless there is a driver bug or an incompatibility with that specific game.
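
For what it's worth, the suspicion is that the UT2k3 filtering only blends mip levels in a narrow band instead of across the whole transition. Nobody outside Nvidia knows what the 44.03 drivers actually do, so treat this as a rough sketch of the general idea only; the band width here is made up purely for illustration:

Code:
// Hypothetical sketch (NOT Nvidia's actual code) of full trilinear vs a
// "reduced" trilinear blend between two mip levels.
#include <cmath>

float mipBlendWeight(float lod, bool fullTrilinear)
{
    float frac = lod - std::floor(lod);   // fractional position between mip levels
    if (fullTrilinear)
        return frac;                      // blend smoothly across the whole range

    // Reduced blend: snap to the nearer mip level except in a narrow band
    // around the transition point - cheaper, but can show mip seams.
    const float band = 0.125f;            // made-up width, purely illustrative
    if (frac < 0.5f - band) return 0.0f;
    if (frac > 0.5f + band) return 1.0f;
    return (frac - (0.5f - band)) / (2.0f * band);
}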

ATI is basically taking the stance that the consumer should be able to customize everything. Nvidia is taking the stance that they will attempt to provide the consumer with the best IQ/performance for each game, which gives the consumer fewer options, but also less chance of ending up with decidedly worse IQ or performance. For instance, with ATI cards you can force AA on in Splinter Cell, but with Nvidia cards you cannot. ATI decided the consumer should be able to force the game into AA mode, while Nvidia used application detection to prevent AA from being enabled in that game (at Ubisoft's request). The difference? ATI users are the only ones who can use AA, but they are also the only ones who can be exposed to the massive IQ glitches seen in that game with AA enabled.
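
To illustrate what application detection means in practice, here is purely hypothetical pseudocode of the concept - driver internals are obviously not public, and the names and values here are invented:

Code:
// Hypothetical sketch: the driver identifies the running executable and
// applies a per-game profile instead of the global control-panel setting.
#include <map>
#include <string>

struct GameProfile {
    bool allowForcedAA;   // can the user force AA on from the control panel?
    int  maxAASamples;    // cap applied regardless of the global setting
};

GameProfile profileFor(const std::string& exeName)
{
    static const std::map<std::string, GameProfile> profiles = {
        {"splintercell.exe", {false, 0}},   // AA blocked (per the Ubisoft request)
        {"ut2003.exe",       {true,  4}},   // invented entry, for illustration
    };
    auto it = profiles.find(exeName);
    if (it != profiles.end())
        return it->second;
    return {true, 8};   // default: honour the user's global setting
}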

So in essence, ATI gives the user more control, but Nvidia gives the user what they believe is the best possible experience for that game. The former may lead to happier tweakers; the latter may lead to happier everyday consumers and casual gamers. Two very different approaches.

Quote:
NVIDIA should provide an option in the drivers that does real trilinear filtering if the application asks for it.
Depends on the situation.

If there is an underlying bug or issue in the drivers or hardware that is problematic with a particular application, which may be the case with UT and the 44.03 drivers, I don't think giving that option would be productive. In addition, by having Nvidia automatically control the IQ/advanced graphics settings through the drivers, there will likely be fewer glitches, fewer performance issues, and less need to switch settings around between games to get them running their best on the hardware. The Nvidia method lets you install the drivers, set them to highest quality, and, if they are the latest, play each game with the best settings that aren't problematic or incompatible with the hardware/drivers - no juggling settings between games to get the best quality without IQ glitches or performance problems. It makes life easier on the gamer, though tweakers may be upset by the lack of control.

Last edited by Ruined; 07-24-03 at 05:27 PM.