
Will the new GeForce have angle-independent AF & HDR+AA?



KickAssCop
01-26-06, 05:52 AM
Just a question I am pondering, since I want an upgrade and can't decide whether I should wait for the next GeForce or go ahead and purchase the X1900 card. Will the new GeForce have angle-independent AF and HDR+AA? That's the only thing I am concerned with between ATi and nVidia for now.

Thanks for any help or information you can share.

Hyper s
01-26-06, 06:32 AM
NDA. If it's G70-based, it's hard to believe.
If it's G80 (NV50): 99.99% yes, and much more than that.

agentkay
01-26-06, 06:36 AM
Nothing is known about that yet. Angle-independent AF should be fairly easy to implement, and HDR+AA works with alternative AA approaches even on current NV hardware. Only people under NDA can truly answer whether the 7900/G71 supports FP16 HDR with the standard AA approach. I guess we will most likely have to wait until the end of February for the specs to get leaked; lately they have been leaked around a week before launch.

Nv40
01-26-06, 04:18 PM
I don't know about angle-independent AF, but when it comes to HDR+AA using FP blending, it looks like NVIDIA is the only one that can do it in actual practice, according to this German review.

http://www.computerbase.de/artikel/hardware/grafikkarten/2006/test_ati_radeon_x1900_xtx_x1900_cf-edition/15/#abschnitt_age_of_empires_3


The Age of Empires 3 developers claim that in their game they use FP16 HDR + 2.25x SSAA for GeForce cards and int10 HDR + 4xMSAA for ATI Radeon X1800/X1900.
So much advertising for a feature, and it isn't actually used in games. Perhaps it was too slow? The technique the game uses for ATI doesn't look any different from what the Valve developers have been doing for SM2.0 cards. :lol:

I will not be surprised if Crytek and other developers are doing the same thing as AoE3, even though people believed it was FP16. So that ends the myth of the biggest disadvantage of a certain green vendor. :) NVIDIA's technique will work in all HDR games, but it needs to be implemented by developers. With the GeForce 6 there wasn't enough bandwidth for both things at the same time, but now it's different. HDR is NVIDIA's domain; they allow nicer things with HDR than ATI, like FP filtering, and the precision is always high, with no tradeoff there. So HDR+AA is actually possible with high precision on the GeForce 7, but using the more expensive and nicer-looking supersampling AA modes. G71 should be fast enough to enable much higher xS modes.
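
To make the supersampling idea concrete, here is a minimal Direct3D 9 sketch of the approach: render the HDR scene into an oversized FP16 render target, then filter it down to the backbuffer. The 1.5x-per-axis factor and the StretchRect resolve are my own illustrative assumptions, not code from any shipping game; a real engine would normally tone-map and downsample in a pixel shader instead.

#include <d3d9.h>

// Create an FP16 render target 1.5x the backbuffer size in each direction,
// giving an effective 2.25x supersample once it is filtered back down.
IDirect3DSurface9* CreateSupersampledHdrTarget(IDirect3DDevice9* device,
                                               UINT width, UINT height)
{
    IDirect3DSurface9* hdrTarget = NULL;
    // D3DFMT_A16B16G16R16F is the FP16 format whose blending support
    // this thread is debating; GeForce 6/7 can blend into it.
    device->CreateRenderTarget(width * 3 / 2, height * 3 / 2,
                               D3DFMT_A16B16G16R16F, D3DMULTISAMPLE_NONE,
                               0, FALSE, &hdrTarget, NULL);
    return hdrTarget;
}

void RenderSupersampledFrame(IDirect3DDevice9* device,
                             IDirect3DSurface9* hdrTarget,
                             IDirect3DSurface9* backbuffer)
{
    device->SetRenderTarget(0, hdrTarget);
    // ... draw the scene here, with FP16 alpha blending enabled ...

    // Downfilter to the backbuffer; averaging the extra samples is what
    // produces the antialiasing. Shown with StretchRect for brevity.
    device->StretchRect(hdrTarget, NULL, backbuffer, NULL, D3DTEXF_LINEAR);
}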

agentkay
01-26-06, 04:33 PM
Yes, that is correct; that's what I meant when I said "alternative AA approaches". SSAA is one of them, and so are shader AA and software AA (the most expensive). Good post, Nv40.

Nutty
01-26-06, 04:44 PM
Yes, that is correct; that's what I meant when I said "alternative AA approaches". SSAA is one of them, and so are shader AA and software AA (the most expensive). Good post, Nv40.

Please explain what you mean by shader AA and software AA?

agentkay
01-26-06, 05:16 PM
I heard that the Nalu demo used shaders to generate AA. You couldn't force it in the control panel, only in args.txt. Tertsi mentioned it here (http://www.nvnews.net/vbulletin/showthread.php?p=791137#post791137), and while he didn't specifically say that it's shader AA, I assumed he was pointing toward the shaders controlling or generating the AA.

Software AA was just a theoretical possibility. I can't explain exactly how it would have to be done, but I was thinking it could be a software-controlled and software-applied AA layer, maybe like the 2D text/UIs that are generated in FPS or 3D games in general. I know too little about how this could work, but I think it should. After all, the AA in Adobe Photoshop, for instance, is a pure software, non-hardware-accelerated type of AA as well.
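
For illustration, here is a toy C++ sketch of the kind of thing I mean by "software AA": the CPU averages each block of oversampled pixels down to one final pixel, a plain box-filter resolve. This is purely hypothetical, to show the concept; no game would do this per frame at playable speeds.

#include <cstdint>
#include <vector>

// Toy "software AA": average each 2x2 block of an oversampled 8-bit RGBA
// image down to one pixel (a box-filter resolve done entirely on the CPU).
std::vector<uint8_t> BoxFilterResolve(const std::vector<uint8_t>& src,
                                      int srcW, int srcH)
{
    const int dstW = srcW / 2, dstH = srcH / 2;
    std::vector<uint8_t> dst(static_cast<size_t>(dstW) * dstH * 4);
    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x)
            for (int c = 0; c < 4; ++c) {
                int sum = 0;
                for (int sy = 0; sy < 2; ++sy)        // 2x2 sample block
                    for (int sx = 0; sx < 2; ++sx)
                        sum += src[((y * 2 + sy) * srcW + (x * 2 + sx)) * 4 + c];
                dst[(static_cast<size_t>(y) * dstW + x) * 4 + c] =
                    static_cast<uint8_t>(sum / 4);  // average of 4 samples
            }
    return dst;
}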

jolle
01-26-06, 05:19 PM
Yeah, but the pixels in Photoshop are CPU-generated, so it's a bit different.

agentkay
01-26-06, 05:28 PM
Yeah, but the pixels in Photoshop are CPU-generated, so it's a bit different.

Yes, of course. That's the best I could do to explain the term I used (software AA). :)

KickAssCop
01-28-06, 02:11 PM
So are you trying to tell me that Far Cry is not using FP16 + AA for ATi?

MUYA
01-28-06, 02:48 PM
So are you trying to tell me that Far Cry is not using FP16 + AA for ATi?
I think Tertsi mentioned that the Far Cry devs used an alternate method of employing HDR in Far Cry rather than FP16. I think...

Nv40
01-28-06, 05:47 PM
So are you trying to tell me that Far Cry is not using FP16 + AA for ATi?

Not sure, but the fact is that Far Cry supports int HDR + AA on SM2.0 hardware. So nothing stops ATI from falling back to the lesser-quality technique, without the community knowing, in every game that uses HDR (just like they do in AoE3), since it's common knowledge that NV SM3.0 cards only use their maximum HDR quality when a game supports it. Splinter Cell 3 also supports int HDR + AA (just like HL2), which works on all SM2.0 cards, trading precision for speed at the request of ATI. So many games can't be a coincidence.

What I'm saying is that you might be asking for something that doesn't exist where you think it exists. It's questionable whether ATI supports it at all, and even if it is supported, it could end up being used only for marketing and never see the light of day in future games, since developers instead use the lower-quality mode, coded at the reque$t of ATI, that works on SM2.0 cards (HL2 Lost Coast and AoE3, for example). What is odd is that int HDR + AA in HL2 and SC3 was done because of the X800's limitations, while for AoE3 it was done exclusively for ATI SM3.0 cards. The only reasons I can think of are that either FP HDR was too slow on ATI, or their implementation is very limited or doesn't exist at all.

The only thing that is certain is that NV GeForce 6/7 cards can do FP16 HDR + AA (and int HDR + AA) in many ways, as explained in this thread, with different kinds of AA; for some techniques it can't be selected through the NV control panel and needs to be done internally by the game. And the G70 finally has the speed to use its maximum-quality HDR with AA not just in tech demos, but also in games.
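
For what it's worth, the two capabilities being argued about here are exactly the things a D3D9 game can query at startup. Here is a rough sketch of those checks; the formats and usage flags are the real D3D9 ones, but the surrounding path-selection logic is my own assumption about how a game like AoE3 might decide, not its actual code.

#include <d3d9.h>

// Pick an HDR path along the lines this thread describes: FP16 blending
// (the NV SM3.0 path) vs. hardware-multisampled int10 (the ATI path).
void QueryHdrPaths(IDirect3D9* d3d)
{
    // Can the device alpha-blend into an FP16 render target?
    HRESULT fp16Blend = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);

    // Can the device do 4x hardware MSAA on an int10 (10-10-10-2) target?
    DWORD quality = 0;
    HRESULT int10Msaa = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A2R10G10B10,
        FALSE, D3DMULTISAMPLE_4_SAMPLES, &quality);

    if (SUCCEEDED(fp16Blend))      { /* FP16 HDR, AA via supersampling */ }
    else if (SUCCEEDED(int10Msaa)) { /* int10 HDR + 4xMSAA */ }
    else                           { /* no HDR path available */ }
}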

fivefeet8
01-29-06, 02:54 AM
The Age of Empires 3 developers claim that in their game they use FP16 HDR + 2.25x SSAA for GeForce cards and int10 HDR + 4xMSAA for ATI Radeon X1800/X1900.

I was looking for confirmation of your post and came across two developer posts in the AoE3 forums. The actual supersampling they use for FP16 filtering/blending HDR is 1.25x SSAA (low) and 1.5x SSAA (high) for NVIDIA cards.

http://forum.agecommunity.com/ibb/posts.aspx?postID=19361&postRepeater1-p=1#19594

The other developer post indicated that ATI's cards don't support blending on FP16 HDR render targets, so they used int10 HDR with 4xMSAA. But isn't FP16 blending a supported feature listed for the ATI X1xxx cards? Maybe he's talking about ATI's R4xx cards. But then why would he, since HDR is disabled on all non-SM3 cards in the game? Either that, or int10 HDR (high) looks horrid in AoE3 compared to the FP16 filtering/blending SM3 (very high) mode.

http://forum.agecommunity.com/ibb/posts.aspx?postID=13972&postRepeater1-p=1#14674

Henno
01-29-06, 09:46 AM
I don't know about angle-independent AF, but when it comes to HDR+AA using FP blending, it looks like NVIDIA is the only one that can do it in actual practice.

3DMark06 uses FP-blending HDR; my XT can do it with FSAA simultaneously. In Serious Sam, according to [H], only ATI can do HDR+FSAA: http://www.hardocp.com/article.html?art=OTUzLDY=

About Far Cry: I see no difference in HDR quality from my former 6800GT; it looks exactly the same. But I don't have the 6800GT anymore, so I can't check to be sure. I would say check out these ATI screenshots and compare them with your current NV card:

http://www.rage3d.com/board/showthread.php?t=33837928

the fact is that Far Cry supports int HDR + AA on SM2.0 hardware

Since when? I couldn't get any form of HDR working on my X800 in Far Cry.

Nv40
01-29-06, 05:37 PM
I was looking for confirmation of your post and came across two developer posts in the AoE3 forums. The actual supersampling they use for FP16 filtering/blending HDR is 1.25x SSAA (low) and 1.5x SSAA (high) for NVIDIA cards.

http://forum.agecommunity.com/ibb/posts.aspx?postID=19361&postRepeater1-p=1#19594

The other developer post indicated that ATI's cards don't support blending on FP16 HDR render targets, so they used int10 HDR with 4xMSAA. But isn't FP16 blending a supported feature listed for the ATI X1xxx cards? Maybe he's talking about ATI's R4xx cards. But then why would he, since HDR is disabled on all non-SM3 cards in the game? Either that, or int10 HDR (high) looks horrid in AoE3 compared to the FP16 filtering/blending SM3 (very high) mode.

http://forum.agecommunity.com/ibb/posts.aspx?postID=13972&postRepeater1-p=1#14674

Those are excellent questions. :)
You have AoE3 developers(?) saying something contrary to what people have believed ATI can do. I'm not a developer, so I can't tell for sure. But you can be sure that ATI is well aware of what their hardware supports or not, and they have close relationships with developers about the ins and outs of their hardware. I don't think HDR+AA in AoE3 happened by accident; some kind of support from NVIDIA and ATI was required. My curiosity is even greater after reading that. Is this straight from the AoE developers?


This is incorrect. Nvidia cannot do hardware multisampling on floating point surfaces, but we do supersampling for antialiasing, so it does in fact work in combination with HDR. On ATI, we must use 10 10 10 2 surfaces, because they do not support blending on 16 16 16 16, but they do multisampling on 10 10 10 2 in hardware.

Hope that clears things up.


When it comes to the AA modes used by NVIDIA in that game, coders have said that the antialiasing GeForce 6/7 use with FP16 HDR is an effective 2.25x supersampling (1.5x in both directions). With faster hardware and/or SLI, I will not be surprised to see higher xS modes in games. What is certain is that NVIDIA can already do any kind of HDR with AA in many ways, without any modification to their hardware; it has been done in their SM3.0 demos and will be done in more SM3.0 games. In games where int HDR + AA is used, any SM2.0 hardware will do the job.
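
Just to spell out that arithmetic (hypothetical numbers for a 1280x1024 backbuffer, not taken from the game):

// 1.5x in each direction -> 1.5 * 1.5 = 2.25 samples per final pixel.
const float axisScale = 1.5f;
const unsigned ssWidth  = (unsigned)(1280 * axisScale); // 1920
const unsigned ssHeight = (unsigned)(1024 * axisScale); // 1536
const float effectiveSS = axisScale * axisScale;        // 2.25x SSAA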

Nv40
01-29-06, 05:47 PM
3DMark06 uses FP-blending HDR; my XT can do it with FSAA simultaneously. In Serious Sam, according to [H], only ATI can do HDR+FSAA: http://www.hardocp.com/article.html?art=OTUzLDY=

About Far Cry: I see no difference in HDR quality from my former 6800GT; it looks exactly the same. But I don't have the 6800GT anymore, so I can't check to be sure. I would say check out these ATI screenshots and compare them with your current NV card:

http://www.rage3d.com/board/showthread.php?t=33837928



Since when? I couldn't get any form of HDR working on my X800 in Far Cry.

Well, perhaps I made a mistake with Far Cry and whether it supports int HDR + AA or not. They have been adding so many patches to that game for SM2.0, SM2.0+, and SM3.0 that it is hard to stay up to date with them. :) HL2, SC3, and AoE3 are the official ones for now. When it comes to [H], they only repeat the marketing slides that come with the hardware they are reviewing; they review hardware from a gamer's point of view, not a scientific one. Nothing wrong with that, but it's not very technical, being more about gamers' feelings and experience. And about Far Cry IQ: the only thing I remember being said is that the HDR on NV and ATI is not the same technique, and people have claimed to see differences in IQ, but I don't know how accurate those reports are. A deep investigation into this would be interesting.

Henno
01-29-06, 06:22 PM
and about Far Cry IQ: the only thing I remember being said is that the HDR on NV and ATI is not the same technique

Where did you read that? The stories I've read in various threads at Beyond3D say it's the same FP16 HDR; the HDR+AA ATI patch was a five-minute job for Crytek. (Notice too that the patch is a small download, only 6.4 MB; a new technique should be substantially larger, I suppose.)

Nv40
01-29-06, 06:47 PM
Where did you read that? The stories I've read in various threads at Beyond3D say it's the same FP16 HDR; the HDR+AA ATI patch was a five-minute job for Crytek. (Notice too that the patch is a small download, only 6.4 MB; a new technique should be substantially larger, I suppose.)


Can't remember exactly where... Apparently there are indeed noticeable IQ differences between NV and ATI using pure HDR (no AA) in Far Cry. If this observation is correct, then unless it's a bug, it points to a different HDR technique between NV and ATI. Anyone with both cards can easily confirm or deny this.

Redeemed
01-29-06, 06:47 PM
So, I'm curious: how well could the 5950U do HDR and AA?

in games where int HDR + AA is used, any SM2.0 hardware will do the job


I own a 5950U; if you could give me a list of some games that use this method for certain, I could do some testing to find out.

Thanks. ;)

Nv40
01-29-06, 06:50 PM
So, I'm curious: how well could the 5950U do HDR and AA?



I own a 5950U; if you could give me a list of some games that use this method for certain, I could do some testing to find out.

Thanks. ;)


Of course, you need fast DX9 hardware too. The feature will not be enabled if the game already runs very slowly without HDR. :)

Redeemed
01-29-06, 07:52 PM
The 5950U wasn't a slow card. Granted, it did lack shader performance, but in most other areas it was extremely fast for its time. Sure, the 9800XT did outpace it, but not exactly by light years.

In fact, if I recall, AthlonXP1800 was able to make HL2 run using the full DX9 path (save SM3.0 features) on his 5950U with next to no performance drop. I haven't tested this myself yet, but if that's true, I'm sure that by lowering other settings a 5950U could do HDR just fine.

Henno
01-29-06, 07:54 PM
Anyone with both cards can easily confirm or deny this.

OWA, where are you? :)

Nv40
01-29-06, 08:41 PM
The 5950U wasn't a slow card. Granted, it did lack shader performance, but in most other areas it was extremely fast for its time. Sure, the 9800XT did outpace it, but not exactly by light years.

In fact, if I recall, AthlonXP1800 was able to make HL2 run using the full DX9 path (save SM3.0 features) on his 5950U with next to no performance drop. I haven't tested this myself yet, but if that's true, I'm sure that by lowering other settings a 5950U could do HDR just fine.


Yes, I agree with you. But since NVIDIA moved quickly to their SM3.0 cards, developers focused on the new hardware.
ATI has been more active with SM2.0 support (because they had nothing better for a long time), and this resulted in more support for their SM2.0 lineup, aside from its performance advantages.

PikachuMan
01-29-06, 08:42 PM
and about Far Cry IQ: the only thing I remember being said is that the HDR on NV and ATI is not the same technique

Maybe you were remembering the tech demo (http://www.ati.com/gitg/promotions/crytek/) Crytek did for ATi? The HDR there worked on X800 cards, even though the method implemented in Far Cry itself only works on GeForce 6/7 cards (and now the X1000 series).

fivefeet8
01-29-06, 09:17 PM
About Far Cry: I see no difference in HDR quality from my former 6800GT; it looks exactly the same. But I don't have the 6800GT anymore, so I can't check to be sure. I would say check out these ATI screenshots and compare them with your current NV card:

http://www.rage3d.com/board/showthread.php?t=33837928



It's hard to compare HDR quality without taking the screenshots from the exact same place; the HDR itself changes depending on where you're standing and looking. Here is the closest comparison I could make to the Rage3D screens using my 6800 Ultra @ 440/1230.

Original Far Cry Screen from Rage3d showing HDR+AA:
http://img53.imageshack.us/img53/1281/1l0dl.th.jpg (http://img53.imageshack.us/my.php?image=1l0dl.jpg)

Here's one from my 6800ultra:
http://img218.imageshack.us/img218/9901/farcry20060129121943878cd.th.jpg (http://img218.imageshack.us/my.php?image=farcry20060129121943878cd.jpg)

I got as close to the original screen as I could. It seems HDR quality is pretty much the same; there are a few differences, but those could be down to the small discrepancy in viewpoint. It's interesting, though, that an X1800XT @ 690/1600 gets only 2 fps more than my 6800 Ultra @ 440/1230. I know it's with 4xAA, but still, that's a huge performance deficit for the X1800XT. It looks as if only the X1800XT or X1900XT will be able to do HDR + 4xAA at 1280 and higher; the X1800XL and lower will have to settle for HDR without AA at 1280x1024, or HDR + 4xAA at 1024x768.