07-23-03, 09:22 AM   #35
extreme_dB
Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by Ruined
Well what about the Doom3 UltraShadow example? Not drawing shadows that are out of the line of sight is something that Radeon cards should be able to handle, but they will not be able to do so in their drivers since Nvidia has patented the technology, a technology that will likely be application specific and rely on drivers for this subjectively non-discernible optimization - there certainly isn't an 'UltraShadow chip' on Nvidia cards. There is no option to disable UltraShadow either. Is this new feature also a big problem?

Can you explain why the 9800 Pro was faster than the 5900 Ultra in Doom3's high-quality mode with older, unoptimized drivers that only used half of its memory? The 9800 Pro is competitive despite a lower clock and less memory bandwidth. Does ATI's architecture sound any less impressive when it's rendering at higher precision on top of that?

Anyway, the argument you brought up is irrelevant. It doesn't matter what method an architecture uses to render an image. What does matter is that it renders what the application asks to the best of its ability (you'd know its limitations before you buy it), and that you aren't artificially limited in how you can fine-tune the image to your liking. If UltraShadow has zero effect on IQ, there's no need for a user option to disable it, because it's purely beneficial.
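
For context, UltraShadow is something the application itself opts into: in OpenGL it's exposed through the EXT_depth_bounds_test extension, where the game tells the driver the depth range a light can affect so shadow-volume work outside that range can be skipped. Here's a minimal sketch of that usage, assuming the extension is present; the bounds values and the function-pointer loading are simplified for illustration.

[code]
/* Minimal sketch of UltraShadow-style culling via EXT_depth_bounds_test.
   The app declares the depth range a light can affect; fragments of the
   shadow-volume passes outside that range are rejected early.
   The 0.25/0.75 bounds are made up, and a real program would load
   glDepthBoundsEXT through wglGetProcAddress / glXGetProcAddressARB. */
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_DEPTH_BOUNDS_TEST_EXT, PFNGLDEPTHBOUNDSEXTPROC */

extern PFNGLDEPTHBOUNDSEXTPROC glDepthBoundsEXT;  /* loaded at startup */

void draw_light_shadow_volume(void)
{
    glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
    glDepthBoundsEXT(0.25, 0.75);   /* hypothetical depth range of this light */

    /* ... stencil shadow-volume passes for this light go here ... */

    glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
}
[/code]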

Nvidia advertises FP32 and 'cinematic computing,' but then does everything it can to lower precision so shaders run acceptably on its architecture. The NV30-34 drops below the industry-standard DX9 requirement for full precision by translating everything to half precision. This is the kind of thing we're talking about.
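
To put a number on what half precision costs, here's a small standalone C sketch (not shader code, just an illustration) that truncates a 32-bit float down to FP16's 5-bit exponent and 10-bit mantissa and expands it back; rounding, denormals, and infinities are ignored for brevity.

[code]
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Crude FP32 -> FP16 -> FP32 round trip for normal numbers only,
   to show the precision lost when full-precision shader math is
   demoted to half precision. */
static float demote_to_fp16(float x)
{
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);

    uint32_t sign = bits >> 31;
    int32_t  exp  = (int32_t)((bits >> 23) & 0xFF) - 127;  /* unbias */
    uint32_t man  = bits & 0x7FFFFFu;

    if (exp < -14 || exp > 15)          /* outside FP16's normal range */
        return sign ? -0.0f : 0.0f;     /* flush; real hardware saturates or denormalizes */

    uint32_t man16 = man >> 13;         /* keep only 10 of the 23 mantissa bits */

    uint32_t out = (sign << 31) | ((uint32_t)(exp + 127) << 23) | (man16 << 13);
    memcpy(&x, &out, sizeof x);
    return x;
}

int main(void)
{
    float v = 0.123456789f;
    printf("FP32 value:            %.9f\n", v);
    printf("after FP16 round trip: %.9f\n", demote_to_fp16(v));
    return 0;
}
[/code]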

ATI does not advertise FP32 and then fail to deliver it. ATI did falsely advertise supersampling and piss off a lot of people, but they're not fooling people into thinking SSAA is actually available and then doing something else when the user selects it, which is what Nvidia is doing with trilinear filtering.

Nvidia misleads reviewers by demonstrating full trilinear in a synthetic test while having it disabled in UT2003. Most consumers wouldn't have noticed if certain people investigating IQ hadn't discovered it, but the fact that it's hard to notice is not the point. Most (all?) people didn't notice the Quack issue either until Nvidia tipped off review sites; those ATI users had no way of knowing the IQ was supposed to be better than it actually was.
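
For anyone unclear on what application-specific detection looks like in practice, the pattern being criticized is roughly the one sketched below. Everything here is hypothetical (invented names and structure, not anything from an actual driver); it just shows how the same trilinear request can quietly produce different results depending on which executable is running.

[code]
#include <stdio.h>
#include <string.h>

enum filter_mode { FILTER_BILINEAR, FILTER_BRILINEAR, FILTER_TRILINEAR };

/* Hypothetical: pick the filtering actually applied based on the running
   executable rather than on what the application requested. */
static enum filter_mode effective_filter(const char *exe_name,
                                         enum filter_mode requested)
{
    if (requested == FILTER_TRILINEAR && strcmp(exe_name, "ut2003.exe") == 0)
        return FILTER_BRILINEAR;   /* the game silently gets a bi/trilinear mix */
    return requested;              /* synthetic testers get full trilinear */
}

int main(void)
{
    /* The same trilinear request yields different results per application. */
    printf("filter tester gets mode %d\n",
           effective_filter("filtertest.exe", FILTER_TRILINEAR));
    printf("UT2003 gets mode        %d\n",
           effective_filter("ut2003.exe", FILTER_TRILINEAR));
    return 0;
}
[/code]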

Last edited by extreme_dB; 07-23-03 at 09:27 AM.