
View Full Version : Full Trilinear Filtering?


rewt
05-11-04, 02:49 PM
Does the FX 5950 support full trilinear filtering like the GeForce 6800 does? I've always wondered whether it was a limitation of the driver or the hardware itself.

Thanks for any replies!



rewt

Lfctony
05-11-04, 02:54 PM
Brilinear only, I think. Can somebody help me out on this?

Toaster
05-11-04, 02:57 PM
It has full support for trilinear in hardware; it's the drivers that don't allow you to use it and force brilinear on you.
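Purely as an illustration (this is not NVIDIA's driver or hardware code; the bilinear_sample() stub and the 0.15 band width are made up for the sketch), here is a common way to model the difference: full trilinear blends the two nearest mip levels across the whole LOD range, while "brilinear" only blends in a narrow band around the mip transition and falls back to plain bilinear everywhere else.

// Illustration only: trilinear vs. "brilinear" mip blending, modeled in C++.
#include <algorithm>
#include <cmath>

struct Color { float r, g, b; };

// Stand-in for a real bilinear texture fetch from one mip level,
// so the sketch is self-contained.
Color bilinear_sample(int mip, float u, float v)
{
    float g = 1.0f / (1 + mip);   // dummy value per mip level
    return { g, g, g };
}

Color lerp(const Color& a, const Color& b, float w)
{
    return { a.r + (b.r - a.r) * w,
             a.g + (b.g - a.g) * w,
             a.b + (b.b - a.b) * w };
}

// Full trilinear: always blend the two nearest mip levels by the LOD fraction.
Color trilinear(float lod, float u, float v)
{
    int   mip  = (int)std::floor(lod);
    float frac = lod - mip;                      // blend weight, 0..1
    return lerp(bilinear_sample(mip, u, v),
                bilinear_sample(mip + 1, u, v), frac);
}

// "Brilinear": only blend inside a narrow band around the mip transition;
// over most of the LOD range it degenerates to bilinear of a single level.
Color brilinear(float lod, float u, float v, float band = 0.15f)
{
    int   mip  = (int)std::floor(lod);
    float frac = lod - mip;
    float w    = (frac - (0.5f - band)) / (2.0f * band);
    w          = std::min(1.0f, std::max(0.0f, w));
    return lerp(bilinear_sample(mip, u, v),
                bilinear_sample(mip + 1, u, v), w);
}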

rewt
05-11-04, 04:29 PM
Ah, I kinda figured it was a driver limitation but wasn't 100% sure. Does anyone know in which drivers brilinear filtering was first introduced? Was it the Detonator 44.03 or the newer ForceWare drivers?

The option is there for the GeForce 6800 on the ForceWare 60.72 drivers to enable full trilinear and remove optimizations, but why isn't it there for the 5950? It says it enables trilinear, but it doesn't remove the optimizations...

Anyway, how could we remove this limitation on FX cards? I know full trilinear wouldn't noticeably hurt performance, especially on top-of-the-line FX cards...

Would a tool like RivaTuner or nVhardPage allow forcing true Trilinear?

rewt
05-11-04, 10:48 PM
According to sources, 43.45 was the last driver to allow full trilinear filtering on the FX line of cards. Unfortunately it didn't work with my 5950, so I was unable to test it out. =(

All newer drivers use some sort of mix between bilinear and trilinear. But from what I hear, the image differences aren't even noticeable for the most part.

Lfctony
05-12-04, 03:35 AM
Yes, unlike ATI's AF, it's really hard to notice. With ATI, when setting 16xAF from outside Far Cry (i.e. the CP), I can notice the difference on the floor between one layer/stage and another. I don't think I even noticed that difference with my 5900.

jbirney
05-12-04, 08:26 AM
Yes, unlike ATI's AF, it's really hard to notice. With ATI, when setting 16xAF from outside Far Cry (i.e. the CP), I can notice the difference on the floor between one layer/stage and another. I don't think I even noticed that difference with my 5900.

Once you know where to look, it's easy to pick out brilinear. Thanks for dragging ATI into this thread when it had nothing to do with them :rolleyes:

saturnotaku
05-12-04, 08:33 AM
Thanks for dragging ATI into this thread when it had nothing to do with them :rolleyes:

Come off it. Even you would have to admit that the "Quality" mode on ATI's CP is misleading. You would think that quality would enable full trilinear, but it doesn't. Only if you let the application control AF do you get full trilinear.

Bottom line, it doesn't work on NV3x cards and you can only get it on ATI if you let the application control AF...which is assuming it has an application control to begin with.

4q2
05-12-04, 09:47 AM
Come off it. Even you would have to admit that the "Quality" mode on ATI's CP is misleading. You would think that quality would enable full trilinear, but it doesn't. Only if you let the application control AF do you get full trilinear.

Bottom line, it doesn't work on NV3x cards and you can only get it on ATI if you let the application control AF...which is assuming it has an application control to begin with.

It is misleading and should be changed.

At least with ATI you can use Rtool or other 3rd party programs to change it to full trilinear.

saturnotaku
05-12-04, 09:48 AM
It is misleading and should be changed.

Or at least add a "high/full quality" option to the control panel.

Lfctony
05-12-04, 09:52 AM
Once you know where to look, it's easy to pick out brilinear. Thanks for dragging ATI into this thread when it had nothing to do with them :rolleyes:

I was simply comparing the two. I didn't notice it with my 5900; I did notice it with my 9800P. Thus, to me, brilinear is less noticeable than trilinear-on-the-first-layer/bilinear-onwards filtering.

4q2
05-12-04, 10:05 AM
Yes, unlike ATI's AF, it's really hard to notice. With ATI, when setting 16xAF from outside Far Cry (i.e. the CP), I can notice the difference on the floor between one layer/stage and another. I don't think I even noticed that difference with my 5900.


It may be more noticeable on the ATI card because the 5900 is displaying lower quality floors and that may mask the effect a bit.

At least Far Cry allows you to set the AF from within the game, unlike many other games.

So if you want full AF in Far Cry on ATI, set 'Application' in the control panel, then set the desired AF from within the game.
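To make "let the application control AF" concrete, here is a minimal sketch of the standard Direct3D 9 sampler-state calls a game can issue to ask for anisotropic filtering with full trilinear mip blending. The function name and values are purely illustrative, not anything Far Cry actually does; when you force filtering from the control panel instead, the driver overrides whatever the game asked for, which is where the per-stage quirks described above can creep in.

// Sketch only: how a D3D9 game can request AF plus full trilinear on a
// texture sampler. Assumes 'device' is an already-created IDirect3DDevice9*.
#include <d3d9.h>

void RequestAnisoTrilinear(IDirect3DDevice9* device, DWORD sampler, DWORD maxAniso)
{
    device->SetSamplerState(sampler, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(sampler, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    // D3DTEXF_LINEAR for the mip filter = blend between mip levels (trilinear);
    // D3DTEXF_POINT would snap to a single mip level (bilinear only).
    device->SetSamplerState(sampler, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(sampler, D3DSAMP_MAXANISOTROPY, maxAniso);
}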

jbirney
05-12-04, 10:08 AM
Come off it.

Did the original poster's question have anything to do with ATI? Did he/she even mention them? Did he ask for a comparison? No, it did not. So why even post with that attitude? You guys complain about the quality of the posts here and yet help add to the noise.

Even you would have to admit that the "Quality" mode on ATI's CP is misleading. You would think that quality would enable full trilinear, but it doesn't. Only if you let the application control AF do you get full trilinear.

That's a completely different discussion that should have been taken up elsewhere (a non-NVIDIA card forum). But since you wanted to know, I've heard discussion on both sides. I think the app should have the final say, as that more closely represents what the developer wanted. Remember, it's their game and they know how it "should" look. So even though "Quality" isn't the best choice of word, it still offers HIGHER IQ than their other settings, so in that sense, it works.


I was simply comparing the two. I didn't notice it with my 5900; I did notice it with my 9800P. Thus, to me, brilinear is less noticeable than trilinear-on-the-first-layer/bilinear-onwards filtering.

Funny, I saw it the other way, but to each their own...

Blacklash
05-12-04, 10:23 AM
The reason I considered ATI to have an IQ edge in the past was AA, not AF. Now that NVIDIA has much better AA than they did, that is no longer an issue. ATI 'saves' in the hardware and NVIDIA through the driver, with brilinear. You should be able to use full trilinear by switching off the optimizations through the CP in the 60.xx drivers. Most reviews I have seen spot brilinear by using an analyzer, not the naked eye. Most claim they are hard-pressed to find a difference.

This excerpt from 3DCenter explains the bit-weight differences between ATI and NVIDIA. NVIDIA has a higher standard. People think otherwise because of relentless fanboy FUD.

"A trilinear filter is a linear interpolation between two bilinear samples, requiring a weight between 0 and 1. ATI allocate five bits for this weight, which matches Direct3D's reference rasterizer (however, higher precision is allowed by Direct3D and in fact desirable). In OpenGL, SGI - who spearheaded the inception of this API - use eight bits. That's also the standard that's followed by, eg, Nvidia's GeForce range that implements the 8 bit linear interpolation weight for both OpenGL and Direct3D. These three additional bits result in an eightfold increase in definition. Do we "need" that? In our opinion, at least six bits of "LOD fraction" are desirable to minimize banding artifacts. Five bits are okay for most cases, while four bits are definitely too few. Eight bits may be slightly overkill but then there's no disadvantage to precise texture filters. Anyway, textbook quality is eight bits, this guarantees zero banding and also constitutes SGI's recommendation for OpenGL."

Lfctony
05-12-04, 10:30 AM
As I have always told people:

Get ATI for D3D, Nvidia for OpenGL. ATI for AA IQ, Nvidia for AF IQ. Plain and simple.

rewt
05-12-04, 05:13 PM
As I have always told people:

Get ATI for D3D, Nvidia for OpenGL. ATI for AA IQ, Nvidia for AF IQ. Plain and simple.

Good advice. But who knows, with the release of the new cards this may change. It could be that one card dominates in both worlds this time around. Guess we'll just have to wait and see. I have always liked nV for two main reasons: stability, and the ability to increase 'Digital Vibrance' using the CP. It makes the desktop/pictures/games look 200% better than standard color, IMHO.