
View Full Version : Pseudo-Trilinear filtering aka 3DCenter's "Brilinear filtering"



Deathlike2
11-02-03, 07:36 PM
http://www.3dcenter.org/artikel/2003/10-26_a_english.php

At least I understand why there are horrible mipmap transitions....
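
Rough idea of what "brilinear" does, as I understand the article: instead of blending between two mip levels across the whole LOD range like true trilinear does, it only blends in a narrow band around the transition and uses plain bilinear everywhere else. A quick sketch of the idea - the band width and the exact ramp here are my own guesses, not NVIDIA's actual numbers:

-------------
// Illustration only: "brilinear" vs. true trilinear mip blending.
// The 0.15 band half-width is a guess, not NVIDIA's real value.
#include <algorithm>
#include <cmath>

// Returns the blend weight between mip level floor(lod) and floor(lod)+1.
float MipBlendWeight(float lod, bool brilinear, float band = 0.15f)
{
    float frac = lod - std::floor(lod);

    if (!brilinear)
        return frac;  // true trilinear: blend across the whole mip range

    // "Brilinear": pure bilinear from the nearest mip outside a narrow
    // band around the transition, with a steep ramp inside it. The
    // narrower the band, the more visible the mip boundary gets.
    float w = (frac - (0.5f - band)) / (2.0f * band);
    return std::min(1.0f, std::max(0.0f, w));
}
-------------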

When the guys say that the original GeForce has better quality than the FX in AA AND aniso.. you wonder...

StealthHawk
11-02-03, 07:49 PM
Originally posted by Deathlike2
When the guys say that the original GeForce has better quality than the FX in AA AND aniso.. you wonder...

Well, no. 4x OGSS may be superior to 4x OGMS, but 8xS is superior to 4x OGSS.

AnteP
11-02-03, 09:26 PM
Originally posted by StealthHawk
Well, no. 4x OGSS may be superior to 4x OGMS, but 8xS is superior to 4x OGSS.

Besides, the original GF only did 2x AF, not what I would call superior, "brilinear" or not

Deathlike2
11-02-03, 11:28 PM
True.

I'm not exactly saying the new stuff is altogether worse than the previous generation (more aniso options are a good thing for the GF3 and beyond).. but the author's conclusion is interesting...

We definitely have improvements in the quality these days... but as much as NVidia is trying to move forward.. they are taking a step back at the same time...

GF3 vs GF2 (more aniso for the GF3, but Quincunx isn't all it's cracked up to be)

FX vs GF4 (more performance, but lower quality with the Det FX and beyond)

Ruined
11-03-03, 12:34 AM
Funny that despite the criticisms, the writer of the article stated that the filtering optimization in UT2003 was not noticeable during gameplay... Though I do hear his concerns about not having the option to disable the filter if you are a purist.

I also found it interesting that the GeForceFX has some sort of hardware support required for the filter that prior Nvidia cards lack; hence it was actually a planned feature/optimization as opposed to a driver compromise.

Deathlike2
11-03-03, 02:12 AM
It depends on the person.. and the game...

What bothers me the most is that NVidia is trying to "force" an "incomparable benchmark comparison" with their pseudo-tri.. rather than letting the USER (apparently they don't care about the consumer) decide whether they want it or not.... but hey.. they obviously don't have to care... either they go the way of 3dfx (in terms of late product launches) or they fix it in the next card...

Ruined
11-03-03, 02:19 AM
Originally posted by Deathlike2
It depends on the person.. and the game...

What bothers me the most is that NVidia is trying to "force" an "incomparable benchmark comparison"

This particular filter is nothing in terms of making an apples-to-apples benchmark comparison compared to the fact that ATI has way different AA/AF algos and FP24 shaders while Nvidia has way different AA/AF algos and FP16/FP32 shaders - not to mention both handle shader code much differently. Those two massive differences alone make an apples-to-apples comparison impossible, never mind a minor thing such as a hybrid trilinear filter. In other words, even if the filter were disabled, you'd only be one step closer in an apples-to-apples comparison that is still a mile away.

with their pseudo-tri.. rather than letting the USER (apparently they don't care about the consumer) decide whether they want it or not.... but hey.. they obviously don't have to care... either they go the way of 3dfx (in terms of late product launches) or they fix it in the next card...

Well, I don't know about that. 3dfx failed for a number of major reasons, but one of them probably wasn't that their retail cards didn't meet the enthusiast market's demands. I'd have to guess that by not having 32-bit color they did not meet all the OEM "checkboxes", so they lost out on OEM sales; they lost a ton of card distribution partners and simply couldn't compete by themselves; and they lost big on both time and money with the SEGA Dreamcast deal. All of this forced them to release cards that probably cost too much to make and lost them a lot of money in the long run, plus they failed to market them well. ATI survived for years with garbage 2D video cards - simply because they marketed them well.

The fact that 72% of the value DirectX9 market went to Nvidia last quarter - the segment that probably moves by far the most video cards - plus Nvidia tripling their market share in the enthusiast DX9 segment, shows that customers probably aren't overly concerned with this optimization. If HardOCP, Firing Squad, and 3DCenter say they can't really notice a difference, even if it depends on the person, do you think the average guy who buys a video card (and sub-$200 video card sales make up the majority of retail sales) will have any clue that this optimization is in place? Do you think the guy who reads PC Magazine instead of this forum will detect the optimization? I seriously doubt it. Even if you think the GeForceFX stinks, Nvidia was able to sell a boatload of them.

john19055
11-03-03, 03:48 AM
I still think that nvidia is taking a step back by not letting the consumer choose true trilinear, but I also think this problem will be solved in the next hardware that comes from nvidia. They had to work with what they had at the moment to get the best balance of performance and IQ, so certain games maintain a decent framerate with little degradation of IQ.

StealthHawk
11-03-03, 05:49 AM
Originally posted by Ruined
Funny that despite the criticisms, the writer of the article stated that the filtering optimization in UT2003 was not noticeable during gameplay... Though I do hear his concerns about not having the option to disable the filter if you are a purist.

He said no such thing. Although of course I am not surprised that you would say that he did. Here's what he said:

For a fairly long time Nvidia forces FX-users to cope with "brilinear" filtering in UT2003. According to our tests, the actual quality difference is marginal. In this column, we admit to have a less technical point of view and claim every option, that sacrifices a very small amount of image quality for a noticeable performance increase, to be preferable per se. The most important word in this sentence is "optional".

Nowhere does that say it is not noticeable.

I also found it interesting that the GeForceFX has some sort of hardware support required for the filter that prior Nvidia cards lack; hence it was actually a planned feature/optimization as opposed to a driver compromise.

Explain this then. The GeForceFX 5800 launched with Intellisample set to Balanced and filtering quality almost the same as it is now with the 5x.xx drivers. NVIDIA then, after numerous driver revisions, finally set "Application" as the default, which provided trilinear. Now they're back to brilinear being the default, except now it is impossible to get trilinear. How do you explain that?

os2
11-03-03, 05:51 AM
The way I see it, hardware companies only make compromises and cut corners when they have no hope of competing otherwise. Seems to me that NVidia hardware just can't keep up without resorting to this, so NVidia goes ahead and does it.

Makes you wonder why ATI doesn't do a similar thing? You never know. It could be because they don't have to.

I guess it helps when you play games "The way it's meant to be played"!! But hey, it's OK because it "was not noticeable during gameplay" :)

Regards,

os2

AnteP
11-03-03, 06:59 AM
"was not noticeable during gameplay" is a very subjective statement.
There are some games, such as Halo and a couple of maps in UT2003 and so forth, where it is indeed noticeable, and quite honestly I think it's pretty pathetic to ruin a feature as basic as trilinear filtering.
Racing games are also one area where it's noticeable, due to the nature of the textures and the deep view. (Which of course doesn't only apply to racing games, but also to games like Mafia etc.)

What will the NV40 sport? "22-bit" color a la 3dfx instead of real 32-bit color? ;)

These optimizations might have been "valid" back in the days of the GF 256 or TNT, but they surely aren't today.

StealthHawk
11-03-03, 03:51 PM
Originally posted by os2
The way I see it, hardware companies only make compromises and cut corners when they have no hope of competing otherwise. Seems to me that NVidia hardware just can't keep up without resorting to this, so NVidia goes ahead and does it.

But brilinear barely improves performance over trilinear.

Originally posted by os2
Makes you wonder why ATI doesn't do a similar thing? You never know. It could be because they don't have to.

ATI introduced texture stage optimization with AF before NVIDIA did.

I realize that this 3DCenter article has nothing to do with texture stage optimization with AF, but that is worth noting.

Nutty
11-03-03, 04:18 PM
It's crap. Pure and simple.

But brilinear barely improves performance over trilinear.
Why would they do it, if it isn't to improve performance? Doesn't make sense.

The way I see it, it's done on FX cards because FX cards need the extra speed.. and it manages to get that through brilinear.

It better be gone by NV40 time, or I probably won't buy another nv card again.

StealthHawk
11-03-03, 05:52 PM
Originally posted by Nutty
It's crap. Pure and simple.


Why would they do it, if it isn't to improve performance? Doesn't make sense.

The way I see it, it's done on FX cards because FX cards need the extra speed.. and it manages to get that through brilinear.

It better be gone by NV40 time, or I probably won't buy another nv card again.

From what I've seen (comparing benchmarks between 45.23 trilinear and 52.16 brilinear), the scores are virtually identical. You may pick up a few frames from going to brilinear, but it should be a low single digit percentage gain. Hardly worth forcing on people.

OTOH, forcing texture stage optimizations on everyone (which NVIDIA is not doing anymore) yields absolutely monumental gains in performance... in applications where IQ decreases ;)

ChrisRay
11-03-03, 05:54 PM
Originally posted by StealthHawk
From what I've seen (comparing benchmarks between 45.23 trilinear and 52.16 brilinear), the scores are virtually identical. You may pick up a few frames from going to brilinear, but it should be a low single digit percentage gain.


Ya, performance between the Performance and Quality anisotropic filtering modes on my FX 5900 is virtually identical. Only High Performance yields a performance gain.

AnteP
11-03-03, 05:59 PM
Originally posted by StealthHawk
From what I've seen (comparing benchmarks between 45.23 trilinear and 52.16 brilinear), the scores are virtually identical. You may pick up a few frames from going to brilinear, but it should be a low single digit percentage gain. Hardly worth forcing on people.

OTOH, forcing texture stage optimizations on everyone (which NVIDIA is not doing anymore) yields absolutely monumental gains in performance... in applications where IQ decreases ;)

Actually it depends; in a few titles (UT2003 for example) performance is awful with true trilinear on the FX boards.

UT2003 with a 5800 Ultra, 1024 with 8x AF
True trilinear:
77 fps
Brilinear:
140 fps

In those cases it is of course better to have the brilinear filtering than half the fps. But I mean c'mon, forcing it in all applications, in most of which it's absolutely not necessary - that's just stupid. :/

And of course, like everyone else says: it should be a choice, not forced upon those investing 500 USD in a product. Especially not when the manufacturer boasts about "high quality".

AnteP
11-03-03, 06:00 PM
Originally posted by StealthHawk
(comparing benchmarks between 45.23 trilinear and 52.16 brilinear)

remember: nVidia has NEVER done true trilinear in UT2003 with any driver for the FX
the only way to get true trilinear is to use a pre-Det-50 driver and anti-cheat detection

StealthHawk
11-03-03, 06:04 PM
Originally posted by AnteP
Actually it depends; in a few titles (UT2003 for example) performance is awful with true trilinear on the FX boards.

UT2003 with a 5800 Ultra, 1024 with 8x AF
True trilinear:
77 fps
Brilinear:
140 fps

In those cases it is of course better to have the brilinear filtering than half the fps. But I mean c'mon, forcing it in all applications, in most of which it's absolutely not necessary - that's just stupid. :/

And of course, like everyone else says: it should be a choice, not forced upon those investing 500 USD in a product. Especially not when the manufacturer boasts about "high quality".

What driver are you using in that comparison to get those results?

AnteP
11-03-03, 06:16 PM
Originally posted by StealthHawk
What driver are you using in that comparison to get those results?

44.65, with and without RivaTuner's anti-detect

you could also just check the old FX launch drivers, where the "Application" mode had performance that is just in line with those scores

StealthHawk
11-03-03, 07:45 PM
Originally posted by AnteP
44.65, with and without RivaTuner's anti-detect

you could also just check the old FX launch drivers, where the "Application" mode had performance that is just in line with those scores

Your benchmarks use 8x AF though. This is not a fair comparison because anti-detect stops texture stage optimization, which is where the bulk of the increased performance comes from.

Most of the performance is not coming from the drop down to brilinear filtering (from trilinear), but from the fact that texture stages 1-7 are getting a maximum of 2x bilinear filtering.

If you test with AF off, or AF set to 2x, I think you will find the performance difference between trilinear on all texture stages and brilinear TS 0 + bilinear TS 1-7 will be much smaller.
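
To put that in Direct3D terms, here's roughly what the forced optimization amounts to if an application had set the equivalent sampler states itself. This is my sketch of the equivalent states under that reading, not actual driver code:

-------------
#include <d3d9.h>

// Sketch: stage 0 keeps its mip blend (which the driver then cuts down
// to "brilinear" internally), stages 1-7 drop to 2x aniso bilinear.
void ApplyStageOptimization(IDirect3DDevice9 *device)
{
    // Stage 0: LINEAR min/mag + LINEAR mip filter = a trilinear request.
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

    // Stages 1-7: anisotropy capped at 2x, POINT mip filter = bilinear
    // within a mip level, no blending between levels.
    for (DWORD stage = 1; stage < 8; ++stage)
    {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, 2);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_POINT);
    }
}
-------------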

AnteP
11-03-03, 07:59 PM
Originally posted by StealthHawk
Your benchmarks use 8x AF though. This is not a fair comparison because anti-detect stops texture stage optimization, which is where the bulk of the increased performance comes from.

Most of the performance is not coming from the drop down to brilinear filtering (from trilinear), but from the fact that texture stages 1-7 are getting a maximum of 2x bilinear filtering.

If you test with AF off, or AF set to 2x, I think you will find the performance difference between trilinear on all texture stages and brilinear TS 0 + bilinear TS 1-7 will be much smaller.

yup
but there still is a performance drop of 10% or so even with just trilinear

also the drop with true trilinear (no AF) with the old "Application" setting was pretty large as well

Skuzzy
11-03-03, 08:38 PM
I have seen a lot of talk and complaints about how the various video card companies apply texture filtering, and most of it is erroneous in terms of understanding what specifically is required, and when, by the DirectX specification.
You can talk about it from the user point of view all you like, but much of this talk seems to put either ATI or NVidia in the wrong, and quite frankly, I am not sure that is the case. At least from the DirectX specification perspective.

So, I am making this post as more of a tutorial about texture filtering in DirectX than about who is right or wrong.

First, let's start with the actual texture stage filters that are available in DirectX.
-------------
D3DTEXF_NONE
Mipmapping disabled. The rasterizer should use the magnification filter instead.

D3DTEXF_POINT
Point filtering used as a texture magnification or minification filter. The texel with coordinates nearest to the desired pixel value is used. The texture filter to be used between mipmap levels is nearest-point mipmap filtering. The rasterizer uses the color from the texel of the nearest mipmap texture.

D3DTEXF_LINEAR
Bilinear interpolation filtering used as a texture magnification or minification filter. A weighted average of a 2x2 area of texels surrounding the desired pixel is used. The texture filter to use between mipmap levels is trilinear mipmap interpolation. The rasterizer linearly interpolates pixel color, using the texels of the two nearest mipmap textures.

D3DTEXF_ANISOTROPIC
Anisotropic texture filtering used as a texture magnification or minification filter. Compensates for distortion caused by the difference in angle between the texture polygon and the plane of the screen.

D3DTEXF_PYRAMIDALQUAD
A 4-sample tent filter used as a texture magnification or minification filter.

D3DTEXF_GAUSSIANQUAD
A 4-sample Gaussian filter used as a texture magnification or minification filter.
-------------

Now, those are all the texture filters available to the programmer in DirectX9, and the above apply to all texture levels, if you want. You will notice there is no trilinear filter explicitly defined in DirectX. There never has been.
The only time trilinear is required by DirectX is for mipmap transitions. Any other time, it is an option. You will notice this is very clear in the definition of the LINEAR filter.
Games that provide an option for trilinear filtering are bogus. The programmer cannot specifically ask for trilinear filtering in DirectX. You could play a little game and call every texture stage a mipmap. It would be foolish, but it would force trilinear filtering.
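
To illustrate in code (a minimal D3D9 fragment; assume device is an already-created IDirect3DDevice9*):

-------------
// There is no D3DTEXF_TRILINEAR. What everyone calls "trilinear" is
// just this combination of filters on a single stage:
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR); // the mip blend
-------------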

Now, you might gather that using ANISOTROPIC filtering, by definition, precludes the use of LINEAR filtering. It does, for any given texture stage. In other words, you can only specify one texture filter per texture stage.
If you force ANISOTROPIC filtering on, then you cannot use any other form of filtering for the texture stage it is being applied to.
There is only one way to use ANISOTROPIC filtering and LINEAR together, and that is during mipmap transitions. However, no programmer in their right mind (yes, this is my opinion) would ever use that combination, as it would kill performance.
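
As a fragment again, this is the one pairing I mean - anisotropic minification with LINEAR used only between mipmap levels (the expensive combination):

-------------
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8); // example degree
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR); // LINEAR between mips
-------------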

Now, for different types of textures, there are different rules in some cases. For instance, if the texture is a volume map, then the only filters required to be available are POINT and LINEAR. If the volume map is being applied during a mipmap transition, then POINT filtering is required, and LINEAR becomes optional. If the programmer attempts to ask for ANISOTROPIC filtering for this type of texture, then it is quite legal for the driver/hardware to ignore this and revert to LINEAR, or even POINT, filtering.


You with me so far? Good. So, NVidia disables trilinear. As long as it is still used during mipmap transitions for non-volume texture maps, they are within the DirectX9 specification. ATI disables trilinear when anisotropic filtering is enabled. Well,.. that's OK too and does not violate the DirectX9 specification, simply because you cannot apply multiple filters to a single texture stage.

Now you can yell and scream about this all you want. It does not change a thing. NVidia and ATI are within the DirectX9 specifications, unless they violate one of the above. Questions?

AnteP
11-03-03, 08:44 PM
Originally posted by Skuzzy
Now you can yell and scream about this all you want. It does not change a thing. NVidia and ATI are within the DirectX9 specifications, unless they violate one of the above. Questions?

I have one: where did you see anyone complain that they weren't following the DX spec when it comes to texture filtering?

From a strictly "technical" point of view, nVidia and ATi can do whatever they see fit. There's no law against breaking DX specs.

It's all about the end users.

Skuzzy
11-03-03, 08:51 PM
AnteP, I was not trying to stir the pot, sir. Many of the posts about the texture filtering issues have read like the video card companies have been doing something wrong.. as in violating some guideline or spec.
I only point out that they are within the DX9 specifications, and I supplied the information so people could be a bit more informed from that perspective.
I understand the user side as well, but I thought this data would be helpful. Maybe I should not have worded that particular sentence as strongly as I did. I was not meaning to offend anyone.

AnteP
11-03-03, 09:07 PM
Originally posted by Skuzzy
AnteP, I was not trying to stir the pot, sir. Many of the posts about the texture filtering issues have read like the video card companies have been doing something wrong.. as in violating some guideline or spec.
I only point out that they are within the DX9 specifications, and I supplied the information so people could be a bit more informed from that perspective.
I understand the user side as well, but I thought this data would be helpful. Maybe I should not have worded that particular sentence as strongly as I did. I was not meaning to offend anyone.

no problemo :afro2: