why have features if you can't use them


phye2002
07-09-03, 04:01 PM
I was thinking, and came to a conclusion I'm not sure everyone shares. All current video cards support trilinear filtering. If that's the case, why can't you use it on the FX series? Also, DX9 cards are required to run at a minimum of 24-bit floating point precision, so why does the FX series knock the precision down to FP16? Are we letting the software decide what level of quality we play our games at? Just curious: am I wrong to come to these conclusions?

rokzy
07-09-03, 04:20 PM
Because the marketing department is more concerned with hype and buzzwords than actual performance.

For example, the FX5200 is a "DX9 card" and hence desirable new technology, despite being unusable in that role.

Nv40
07-09-03, 05:46 PM
Well, for the people who say that ATI does "full trilinear" in games, here is Aquamark at 8xAF, high quality:

http://vann77.freewebpage.org/pictures/ati_aquamark_AF1.jpg


http://www.computerbase.de/article.php?id=237&page=4&%20%20sid=b4a3f395e788a0f6425e7f056f76b7ab

According to that German site, ATI is dropping their aniso to bilinear in the latest CAT 3.5 drivers. Just look at the other screenshots; no yellow markers, funky colors or Photoshop editing are needed to notice the differences. :rolleyes:

So yes, I agree. When you select trilinear settings, video cards should do what they are asked. If we want performance, we already have settings for that.

rokzy
07-09-03, 07:47 PM
Originally posted by Nv40
Well, for the people who say that ATI does "full trilinear" in games, here is Aquamark at 8xAF, high quality:

http://vann77.freewebpage.org/pictures/ati_aquamark_AF1.jpg


http://www.computerbase.de/article.php?id=237&page=4&%20%20sid=b4a3f395e788a0f6425e7f056f76b7ab

According to that German site, ATI is dropping their aniso to bilinear in the latest CAT 3.5 drivers. Just look at the other screenshots; no yellow markers, funky colors or Photoshop editing are needed to notice the differences. :rolleyes:

So yes, I agree. When you select trilinear settings, video cards should do what they are asked. If we want performance, we already have settings for that.

AFAIK you don't get trilinear if you set it to Quality, but you do if you set it to Application Preference and then use trilinear in the game.

StealthHawk
07-09-03, 08:34 PM
Originally posted by rokzy
AFAIK you don't get trilinear if you set it to Quality, but you do if you set it to Application Preference and then use trilinear in the game.

Yes, and this is something that ATI has not hidden from consumers either.

deejaya
07-09-03, 09:31 PM
Originally posted by StealthHawk
Yes, and this is something that ATI has not hidden from consumers either.

I dunno, I didn't know this. It's not like it was written on my 9700 box. I really did assume that by putting it on the Quality tab, it would use trilinear. Maybe I'm the only one in the world, though. :)

Ninja Prime
07-09-03, 09:55 PM
Bah, nm this was about old hardware.

extreme_dB
07-09-03, 10:50 PM
I read that ATI changed the Quality setting as of the 3.1 drivers so that it does a mixture of trilinear and bilinear (trilinear on the base texture only). For full trilinear, you have to set it to "Application".

This might have been a response to Nvidia's quality settings and optimizations, which could be an example of how Nvidia is leading the industry toward lower filtering quality as a result of current benchmarking practices, at least for this round.
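
If it helps to picture what "trilinear on the base texture only" means at the API level, here's a rough Direct3D 9 sketch (my own illustration, not anything taken from the drivers; the helper function and the stage count are made up). Full trilinear means a linear mip filter on every texture stage, while the optimized Quality mode behaves roughly as if only stage 0 got it:

#include <d3d9.h>

// Illustrative helper; 'dev' is assumed to be a valid IDirect3DDevice9 pointer
// with textures already bound to the stages being configured.
void SetFiltering(IDirect3DDevice9* dev, bool fullTrilinear)
{
    for (DWORD stage = 0; stage < 4; ++stage)
    {
        dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);

        // Trilinear also blends linearly between mip levels. The optimized
        // mode described above acts as if only stage 0 (the base texture)
        // gets this, while the other stages fall back to bilinear
        // (point-sampled mip selection).
        if (fullTrilinear || stage == 0)
            dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
        else
            dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_POINT);
    }
}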

CaptNKILL
07-09-03, 10:55 PM
Originally posted by phye2002
why have features if you can't use them

Bragging rights....

Hanners
07-10-03, 06:50 AM
Originally posted by deejaya
I dunno, I didn't know this. It's not like it was written on my 9700 box. I really did assume that by putting it on the Quality tab, it would use trilinear. Maybe I'm the only one in the world, though. :)

They didn't tell you because you didn't ask. :p

Seriously though, I think it's something that ATi should have made more apparent, and hopefully will once they restructure the AF slider in their control panel (which they are supposedly looking into at present).

The difference is that you can still get full trilinear on ATi cards, either by using the aniso settings in the game (if available) with Application Preference selected in the ATi control panel, or by changing one of the registry keys the drivers use. With nVidia and UT2003 you have no choice; it forces you to use the same aniso settings no matter what you select in the game and/or drivers.
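
Roughly speaking, "Application Preference" just means the driver honours whatever filtering the game itself asks for, rather than overriding it from the control panel. As a sketch (OpenGL here, assuming the GL_EXT_texture_filter_anisotropic extension is available; the function name is made up for illustration), a game's per-texture request would look something like this:

#include <GL/gl.h>
#include <GL/glext.h>   // for GL_TEXTURE_MAX_ANISOTROPY_EXT

// Illustrative sketch of what a game might request per texture. When the
// control panel forces its own settings instead, these requests can be
// overridden or only partially honoured by the driver.
void RequestTrilinearAniso(GLuint texture, GLfloat maxAniso)
{
    glBindTexture(GL_TEXTURE_2D, texture);

    // Trilinear: linear filtering within a mip level AND linear blending
    // between adjacent mip levels.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Degree of anisotropic filtering requested by the application.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
}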

deejaya
07-10-03, 07:58 AM
Originally posted by Hanners
The difference is that you can still get full trilinear on ATi cards, either by using the aniso settings in the game (if available) with Application Preference selected in the ATi control panel, or by changing one of the registry keys the drivers use. With nVidia and UT2003 you have no choice; it forces you to use the same aniso settings no matter what you select in the game and/or drivers.

Yeah, it doesn't bother me. Depending on the speed difference, I'd probably drop trilinear anyway if it's anything like the shots I saw for UT2k3. But still, quality should mean quality; both drivers are very misleading about this, IMO.

Anyone know if nVidia do this across the board for all games, or just UT2k3?

Hanners
07-10-03, 08:48 AM
Originally posted by deejaya
Anyone know if nVidia do this across the board for all games, or just UT2k3?

As far as I know it's only been found in UT2003, and I don't imagine the same issue will be present in many (if any) other games. UT2003 (and Unreal 2) are special cases because of the much larger performance hit they take with aniso enabled compared to most games.

StealthHawk
07-10-03, 08:03 PM
Originally posted by Hanners
As far as I know it's only been found in UT2003, and I don't imagine the same issue will be present in many (if any) other games. UT2003 (and Unreal 2) are special cases because of the much larger performance hit they take with aniso enabled compared to most games.

I would imagine that any games that use Detail Textures are subject to the same optimization as seen in UT2003.