7800 GTX fails DX9.0c DCT?


DoomUK
10-29-05, 07:56 AM
I was browsing around on www.tech-forums.net and a member posted this in response to a 'GTX performing badly in F.E.A.R.' thread:

http://www.theinquirer.net/?article=27084

Basically, going by that examination, the G70 is 'missing' various SM3.0 features. I'd download the thing and see for myself, if it didn't take "36 hours" to complete.

Could someone shed some light on this for me? Is this to be taken with a pinch of salt, or with some seriousness?

shabby
10-29-05, 10:11 AM
http://www.the-inquirer.com/?article=27141

OWA
10-29-05, 11:02 AM
You can also check the end of this thread, where it was being discussed:

http://www.nvnews.net/vbulletin/showthread.php?t=57731&page=11

DoomUK
10-29-05, 11:05 AM
OK, thanks for clearing that up guys :).

Lfctony
10-29-05, 03:48 PM
It's funny how the GTX is supposed to be slow in FEAR due to "bad" SM3 support when:

1. It's almost equal to the X1800XT in terms of performance with the 81.85s.
2. FEAR doesn't even use any SM3.0 features AFAIK.

I believe it's a lame PR stunt to damage Nvidia sales and pimp the X1800. (bs)

shabby
10-29-05, 04:33 PM
The GTX is not equal to the X1800XT; it surpasses it: http://www.anandtech.com/video/showdoc.aspx?i=2575

Lfctony
10-29-05, 04:52 PM
The GTX is not equal to the X1800XT; it surpasses it: http://www.anandtech.com/video/showdoc.aspx?i=2575

Well, tests seem to vary from site to site. :)

gstanford
10-29-05, 09:09 PM
I can't believe people are still taking this ATi FUD seriously.

This is precisely why I hate ATi, and have done ever since their claim on TechTV that the 9700 was capable of running DDR2 memory just like NV30. I could fill a football stadium with their anti-nVidia FUD.

Anyhow, to address the topic of this thread, here are the replies I posted at AnandTech.

Post1:
http://www.beyond3d.com/forum/showthread.php?t=24795
The reason for failing some of the PS 3.0 tests is not the shader itself. The reason is that nVidia uses a different mipmap selection than refrast. Because of this, the card sometimes takes its samples from a different mipmap than refrast does. As every mipmap has different content (colours, or rotated), the card cannot reach 85% identity with the refrast image. As there is no Direct3D specification for how mipmap selection has to work, MS cannot blame nVidia. Because of this, the driver still gets WHQL certification even if it doesn't pass all the tests.
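
To make the mipmap-selection point concrete, here's a rough sketch. The rounding rules and footprint values below are illustrative assumptions, not taken from the DCT, refrast, or the G70; the point is just that two LOD-to-mip rules that differ only slightly can pick different mip levels near the transition point, and since the test textures put distinct content in each level, those pixels no longer match the reference image.

import math

# Hypothetical sketch: two LOD-to-mip rounding rules that differ only in
# where they switch levels. Neither rule is the real refrast or G70 rule;
# they stand in for "legal but different" implementations.

def lod(texels_per_pixel):
    # Standard LOD: log2 of the texel-to-pixel footprint ratio.
    return math.log2(texels_per_pixel)

def mip_a(l):
    # Rule A: round to nearest, switching halfway between levels.
    return int(l + 0.5)

def mip_b(l):
    # Rule B: same idea, but the switch point is biased slightly earlier.
    return math.floor(l) + (1 if l - math.floor(l) > 0.4375 else 0)

# Near the transition the two rules disagree; away from it they agree.
for footprint in (1.30, 1.38, 1.45):
    l = lod(footprint)
    print(f"footprint={footprint:.2f} lod={l:.3f} "
          f"ruleA mip={mip_a(l)} ruleB mip={mip_b(l)}")

# Because the test textures put distinct content in every mip level, each
# pixel where the chosen level differs is a mismatch against the reference
# image, and enough of them drop the match below the 85% threshold.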

Post2:
One of the reasons why nVidia produces different images from refrast is that their texture filtering hardware has 8 bits of resolution (this has been a de facto standard in 3D graphics since texturing and filtering were first invented).

Refrast uses only 5 bits of resolution, as do all ATi R3xx and R4xx graphics chips (this information comes directly from several ATi engineers - search the Beyond3D forums). I don't know whether R5xx continues to use only 5 bits, but given that it is based on the R3xx architecture, I would expect that it does.
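
As a hypothetical illustration of why the bit count matters (the texel values and fractions below are made up, and real bilinear filtering blends four texels rather than doing a single lerp): snapping the blend fraction to a 5-bit grid instead of an 8-bit one shifts the filtered result by a few counts per channel, which is enough to fail a near-exact image comparison even though both results are "correctly" filtered.

# Hypothetical illustration: quantize the bilinear blend fraction to
# 5 bits vs 8 bits. One lerp between two adjacent texels shows the effect.

def lerp_quantized(a, b, frac, bits):
    steps = 1 << bits
    q = round(frac * steps) / steps   # snap fraction to a 1/2^bits grid
    return round(a + (b - a) * q)

texel_a, texel_b = 10, 250            # two adjacent texel values, one channel
for frac in (0.30, 0.55, 0.77):
    r5 = lerp_quantized(texel_a, texel_b, frac, 5)
    r8 = lerp_quantized(texel_a, texel_b, frac, 8)
    print(f"frac={frac:.2f}  5-bit result={r5}  8-bit result={r8}")

# The two precisions land a few counts apart on most fractions, so an
# image diff against a 5-bit reference flags pixels from 8-bit hardware
# (and vice versa), with no actual filtering error on either side.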

I'm always amused to hear ATi supporters touting better IQ when their supported hardware can't even filter textures according to a long-established industry standard (but it does support Microsoft's reference standard, and we all know how well respected Microsoft's "standards" are in the wider computing industry).