Re: ATI and AA
I posted this in another thread around here.
Reading through a lot of the older posts, I see that some guys coming from Nvidia setups are having problems getting AA working with ATI. Since I have a lot of experience with ATI's AA and the Catalyst in general, here's how it works:
First of all, as you've already discovered, ATI's AA is quite different from Nvidia's implementation. Enabling the Catalyst MSAA (multisample AA) option usually does not work if the game has its own in-game AA option. If you want to defer to driver AA in these situations (say, to go beyond an in-game maximum of 4x up to 8x or higher), you must enable the "Edge Detect" option in the Catalyst, keep in-game AA turned on, and leave the driver's AA mode set to "application preference" (yes, even though you are enabling Edge Detect in the driver, the AA mode still must be set to application preference). This applies additional AA on top of what the game is already doing.
Edge Detect works by using the shaders to apply additional filtering on top of the MSAA already being handled by the ROPs. It also uses no extra VRAM, so if you're playing a game where you're already maxing out your VRAM with texture mods or whatever, it's a great option to have. There are two edge-detect settings, 12x and 24x: a 4x MSAA base becomes 12x, and an 8x base becomes 24x.
Let's look at an example.
Let's say I'm playing NFS Shift (I don't have the game yet, but bear with me for the sake of the example) and I want to run higher AA than the in-game maximum (4x). The proper way to go about it is to keep the in-game AA at 4x, go into the Catalyst, and enable "Edge Detect." This upgrades the 4x in-game AA to 12x. If, on the other hand, NFS Shift had an 8x option and I kept that on while enforcing Edge Detect in the Catalyst, I would get 24x.
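If it helps to see the pattern, both edge-detect modes boil down to a fixed 3x multiplier on the in-game sample count. Here's a minimal Python sketch of that arithmetic (the function name and the 3x factor are my own illustration, inferred from the 4x-to-12x and 8x-to-24x pairs above, not anything out of the Catalyst drivers):

```python
# Illustrative only, not driver code: the Catalyst edge-detect modes pair
# with in-game MSAA as a fixed 3x multiplier (4x -> 12x, 8x -> 24x).

EDGE_DETECT_FACTOR = 3  # assumed multiplier, derived from the two known pairs

def effective_aa(in_game_msaa: int) -> int:
    """Effective AA level with Catalyst Edge Detect layered on in-game MSAA."""
    if in_game_msaa not in (4, 8):
        raise ValueError("Edge Detect pairs with 4x or 8x in-game MSAA")
    return in_game_msaa * EDGE_DETECT_FACTOR

print(effective_aa(4))  # 12, the NFS Shift example above
print(effective_aa(8))  # 24
```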
The Catalyst multisampling options usually only work for games that have no in-game AA option, such as Oblivion. This is where you would enable 4x, 8x, 16x (or 12x / 24x edge-detect) in the driver. For Oblivion, I found 24x edge detect to provide the absolute best viewing experience that I've seen on either ATI or Nvidia cards. Can't be beat, in my opinion, and the performance was decent at that.
So, to summarize, learn how Edge-Detect works and how to set it appropriately to sync with your in-game settings. Once you figure out ATI's implementation, things are a breeze. I tend to like ATI's implementation more than Nvidia's because I enjoy having the Edge Detect that offers absolutely fantastic visuals at minimal performance cost since it runs entirely on the shaders and not the ROPs/VRAM.
ATI really should have some sort of formal documentation that explains how all of this works, though, as it's confusing compared to the ease of Nvidia's implementation.
For those running CrossFire: you only need one bridge. Remove the second one if you're using it, because it can sometimes cause issues depending on the application. As with SLI, only one bridge is needed per additional card.
Hope that helps.
i7-2700k @ 5.0 GHz
Nvidia GeForce 570 2.5GB Tri-SLI
Asus P67 WS Revolution (Tri-SLI)
OCZ Vertex SSD x 4 (Raid 5)
G.Skill 8GB DDR3 @ 1600MHz
PC Power & Cooling 950W PSU