Some DX10 questions


nightmare beta
09-01-07, 03:48 AM
I have some DX10/10.1 questions.

I read a Call of Juarez feature at HardOCP, and I was kind of alarmed.

1. I heard 4x AA is going to be a requirement for DX10.1, and HardOCP used a 4x in-game setting. The article made no mention of transparency supersampling. The X360 having just 4x MSAA without TRSS is a glaring flaw. Is Microsoft going to push for just 4x MSAA without TRSS with DX10/10.1? They seem to be trying to X360-ize PC games, and it's really disturbing, so I wouldn't be too surprised if we went backwards with AA. But I'd like to know for sure, which is why I'm asking.

2. The ATI and Nvidia screenshots/IQ were said to be identical. They did look identical, but how the hell can they be identical if Nvidia's AF is so much better? Is DX10/10.1 going to drop support for AF, since the X360 doesn't use it?

evox
09-01-07, 04:57 AM
I don't think MS had anything to do with the game; it would be foolish to cripple a game's graphics just because it's multi-platform, especially a DX10 game. (Vistaaa!) Furthermore, the performance of CoJ in DX10 is already horrible as it is; forcing 8x SSAA would turn the game into a PowerPoint slideshow. Also, SSAA and the like are never present in game menus (AFAIK); you always have to force them from the control panel. CoJ wasn't a port, so of all the possible reasons, not using SSAA was probably down to performance concerns.

The X360 does use AF, just not much AA. This is PC gaming; we always move ahead, not backwards, so AF support won't go anywhere ;) The IQ is better because ATI's drivers are finally catching up.

Xion X2
09-01-07, 11:27 AM
2. The ATI and Nvidia screenshots/IQ were said to be identical. They did look identical, but how the hell can they be identical if Nvidia's AF is so much better? Is DX10/10.1 going to drop support for AF, since the X360 doesn't use it?

Nvidia's AF isn't any better than ATI's. If it is, then the difference is so slim that it doesn't show up in most titles. Even HardOCP claims this, and they've been on the G80 bandwagon ever since it was released:

There is always the question of how image quality looks between ATI and NVIDIA hardware and especially so since DX10 is so new. For this comparison we are running the game at maximum settings in DX10 mode and comparing directly between a Radeon HD 2900 XT and GeForce 8800 Ultra.

This entire section can be summed up in one sentence. There are no differences between ATI’s and NVIDIA’s current high end GPUs in Call of Juarez using DX10. We saw absolutely no differences in textures, shaders, particles, HDR and post processing effects.
http://enthusiast.hardocp.com/article.html?art=MTM4NCw3LCxoZW50aHVzaWFzdA==

nightmare beta
09-01-07, 08:10 PM
Nvidia's AF isn't any better than ATI's. If it is, then the difference is so slim that it doesn't show up in most titles. Even HardOCP claims this, and they've been on the G80 bandwagon ever since it was released:


http://enthusiast.hardocp.com/article.html?art=MTM4NCw3LCxoZW50aHVzaWFzdA==

Yes it is. Beyond3D and TechReport did tests with the G80, and it has much better AF than ATI. Defaults may not be different, but comparing both chipsets at their highest-quality settings, there is a difference.

jolle
09-01-07, 09:09 PM
I have some DX10/10.1 questions.

I read a Call of Juarez feature at HardOCP, and I was kind of alarmed.

1. I heard 4x AA is going to be a requirement for DX10.1, and HardOCP used a 4x in-game setting. The article made no mention of transparency supersampling. The X360 having just 4x MSAA without TRSS is a glaring flaw. Is Microsoft going to push for just 4x MSAA without TRSS with DX10/10.1? They seem to be trying to X360-ize PC games, and it's really disturbing, so I wouldn't be too surprised if we went backwards with AA. But I'd like to know for sure, which is why I'm asking.

2. The ATI and Nvidia screenshots/IQ were said to be identical. They did look identical, but how the hell can they be identical if Nvidia's AF is so much better? Is DX10/10.1 going to drop support for AF, since the X360 doesn't use it?

1.) The requirement is that the hardware must support at least 4x MSAA.
It won't be forced on in applications though, AFAIK.
They won't require anyone to support TSAA, but they did specify some requirement for AA via shaders, so I guess that could let devs get similar functionality by doing AA in shaders instead of using the hardware solution, if the hardware doesn't have one.
Or if they want it enabled via the game's options, perhaps.

2.) AF won't be dropped; texture filtering is a critical part of 3D graphics.
I would guess that reviewers using tools to measure AF quality will say one is better than the other.
But [H] are just looking at the image, and I think both have good enough AF quality that it's hard to tell the difference by just looking at the game.
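
To put both points in concrete terms, here's a rough Direct3D 10 sketch of what they look like from the application's side. It's not from the article; it assumes an already-created ID3D10Device pointer called device, and the render-target format and 16x anisotropy are just example values, nothing the spec mandates.

// Sketch: the app asks whether 4x MSAA is available for a format (DX10.1 is
// said to make that support mandatory, but the app still has to opt in), and
// anisotropic filtering is just an ordinary sampler state that isn't going away.
#include <d3d10.h>

HRESULT CheckCapsAndCreateSampler(ID3D10Device* device,
                                  ID3D10SamplerState** samplerOut)
{
    // 1.) Query how many quality levels the hardware exposes for 4x MSAA on
    //     this format. Zero means 4x MSAA isn't supported for it.
    UINT qualityLevels = 0;
    HRESULT hr = device->CheckMultisampleQualityLevels(
        DXGI_FORMAT_R8G8B8A8_UNORM, 4, &qualityLevels);
    if (FAILED(hr) || qualityLevels == 0)
        return E_FAIL; // no 4x MSAA for this format

    // 2.) Anisotropic filtering via a plain sampler state.
    D3D10_SAMPLER_DESC sd = {};
    sd.Filter         = D3D10_FILTER_ANISOTROPIC;
    sd.AddressU       = D3D10_TEXTURE_ADDRESS_WRAP;
    sd.AddressV       = D3D10_TEXTURE_ADDRESS_WRAP;
    sd.AddressW       = D3D10_TEXTURE_ADDRESS_WRAP;
    sd.MaxAnisotropy  = 16; // example value; the driver clamps to what it supports
    sd.ComparisonFunc = D3D10_COMPARISON_NEVER;
    sd.MinLOD         = 0.0f;
    sd.MaxLOD         = D3D10_FLOAT32_MAX;
    return device->CreateSamplerState(&sd, samplerOut);
}

As I understand it, the "AA via shaders" part comes from DX10.1 letting pixel shaders read the individual MSAA samples, so that side lives in HLSL rather than in setup code like the above.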

Xion X2
09-01-07, 09:29 PM
Yes it is. Beyond3D and TechReport did tests with the G80, and it has much better AF than ATI. Defaults may not be different, but comparing both chipsets at their highest-quality settings, there is a difference.
Their tests were based on a buggy release driver that wasn't applying filtering to angled surfaces correctly. I had this show up in an early beta driver that I was using.

That problem has since been remedied, and they are identical. You even said yourself in your first post that they looked identical in more recent testing.

nightmare beta
09-02-07, 07:22 AM
In the screenshot I looked at, they look identical, but ATI's shimmers while Nvidia's doesn't.

Xion X2
09-02-07, 08:45 AM
In the screenshot I looked at, they look identical, but ATI's shimmers while Nvidia's doesn't.
There is no shimmering due to filtering problems. Shimmering has more to do with texture aliasing and alpha aliasing than with filtering, unless you're talking about Nvidia's 7-series cards from last gen.

Now, the 7-series Nvidia cards did have a shimmering problem. And it was often ATI's 1900-series cards they were compared to as the beacon of perfection when it came to IQ (since they didn't shimmer at all). And the 2900XT does nothing but build on the great IQ of the 1900-series and make it a little better than it was.

But there is no "shimmering" because of filtering issues on the 2900XT. I've owned both an 8800GTX and this card I have now, and IQ is identical between them. The GTX was faster in most games, though.