View Full Version : fx5200 emulates dx9, fx5600 slow fsaa


Geforce4ti4200
07-15-03, 03:43 AM
One of my friends says the fx5200 does not support hardware-level dx9; it emulates it in software, thus making any dx9 games really slow.

Another friend said the ti4200 eats the fx5600 both with and without fsaa, and that only the radeon cards and the fx5800 or higher have fast fsaa.

Is the 256mb fx5600 even worth $129 used?

CaptNKILL
07-15-03, 03:52 AM
Originally posted by Geforce4ti4200
One of my friends says the fx5200 does not support hardware-level dx9; it emulates it in software, thus making any dx9 games really slow.

Another friend said the ti4200 eats the fx5600 both with and without fsaa, and that only the radeon cards and the fx5800 or higher have fast fsaa.

Is the 256mb fx5600 even worth $129 used?

Well, I got a 5600 256MB from EVGA as a replacement for my dead 4400. It's ok I guess, but I think you could get a lot better if you just kept the $130 and put it toward a 9500 Pro or 9600 Pro, or even a 5600 Ultra if you can't stand ATI. The 256MB 5600 isn't quite as fast as my 4400 sometimes, other times it's faster... whereas I'm sure the other cards are faster all the time. And about the 4200 eating the 5600 with and without AA, I don't think so, at least not ALL the time. Mine runs better than my 4400 when I use AA/AF, and the 4400 was overclocked quite a bit, which makes it a LOT faster than a 4200... I'd say your friends were reading things second-hand from reviews. I know first-hand: the 5600 isn't bad, but it isn't amazing either.

-=DVS=-
07-15-03, 03:57 AM
Dunno about the 5600, but yes, the 5200 is a very cut-down version without a lot of features :o It's like the GF4 MX: not on the same level as the GF4 Ti.

aapo
07-15-03, 04:50 AM
Originally posted by Geforce4ti4200
One of my friends says the fx5200 does not support hardware-level dx9; it emulates it in software, thus making any dx9 games really slow.

Not possible. That's a funny claim.
The DX9 software reference rasterizer takes hours to draw one frame! Even though the reference rasterizer is an interpreter (instead of a compiler) and is probably far from optimal even as such, I doubt that nVidia could make a software version faster than 1/10 fps.

Besides, if CPU-driven pixel shaders were utilized, the frame would first have to be drawn on the GPU to some extent and then moved through the AGP bus to main memory for the CPU to calculate the pixel shading. The AGP bus would be completely thrashed.

Another variant would be to draw the image completely on the CPU with JIT-assembled shaders, as demonstrated by Nick at B3D (link below, although it's interesting only for serious programmers). But if that were the case, the graphics card wouldn't be used at all, and nVidia could put e.g. old Tridents into an FX5200 package and sell them. :afro2:

link: http://www.beyond3d.com/forum/viewtopic.php?t=5610
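
For the curious, this is roughly what aapo's point looks like at the API level: a D3D9 application explicitly asks for either the hardware (HAL) device or the all-software reference rasterizer, so "emulating dx9 in software" behind the app's back isn't how the API works. A minimal sketch; CreateD3D9Device is just an illustrative helper name, and the window handle hWnd is assumed to be created elsewhere.

#include <d3d9.h>  // link with d3d9.lib

// Create a D3D9 device, either hardware-accelerated (HAL) or the
// all-software reference rasterizer (REF). Returns NULL on failure.
IDirect3DDevice9* CreateD3D9Device(HWND hWnd, bool useRefRast)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;   // use the current desktop format

    // The REF device requires software vertex processing.
    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT,
                      useRefRast ? D3DDEVTYPE_REF : D3DDEVTYPE_HAL,
                      hWnd,
                      useRefRast ? D3DCREATE_SOFTWARE_VERTEXPROCESSING
                                 : D3DCREATE_HARDWARE_VERTEXPROCESSING,
                      &pp, &device);
    d3d->Release();
    return device;
}

Timing one frame on each device type makes the point concrete: the REF path is orders of magnitude too slow for any vendor to ship as a transparent fallback.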

Uttar
07-15-03, 06:03 AM
Actually, there is some insane stuff going on with the NV31 & NV34, I think - in relation to vertex shaders.
There is hardware support for them, of course, but some of the work is done on the CPU, I believe. Exactly what is hard to tell, and I'm pretty sure that conspiracy theories are not in order :)


Uttar

aapo
07-15-03, 06:23 AM
Originally posted by Uttar
Actually, there is some insane stuff going on with the NV31 & NV34, I think - in relation to vertex shaders.
There is hardware support for them, of course, but some of the work is done on the CPU, I believe.

Yes, with vertex shaders you are certainly right. Software vertex shading takes only a moderate performance hit (at least on FX5200-class hardware). But the original claim was that all of the new DX9 functionality in the FX5200 is rendered in software. I think the floating-point pixel shaders are the heaviest addition in DX9 over DX8, so I considered only them.

BTW: Interesting information. Is there a thread discussing the matter somewhere?
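
For context, the D3D9 API itself exposes an app-side version of this split: a device created with mixed vertex processing can run its vertex shaders on the CPU, in the runtime's optimized software pipeline, while rasterization stays on the GPU. That is not necessarily the same path a driver would use internally on NV31/NV34, but it illustrates why CPU vertex work is only a moderate hit. A rough sketch; CreateMixedDevice is a hypothetical helper, with the present parameters assumed to be filled in elsewhere.

#include <d3d9.h>

// Create a HAL device with MIXED vertex processing, which lets the app
// flip between CPU and GPU vertex shading at runtime.
IDirect3DDevice9* CreateMixedDevice(IDirect3D9* d3d, HWND hWnd,
                                    D3DPRESENT_PARAMETERS* pp)
{
    IDirect3DDevice9* dev = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                 D3DCREATE_MIXED_VERTEXPROCESSING, pp, &dev)))
        return NULL;

    dev->SetSoftwareVertexProcessing(TRUE);   // vertex shaders now run on the CPU
    // ... draw calls here would use the runtime's software vertex pipeline ...
    dev->SetSoftwareVertexProcessing(FALSE);  // hand vertex work back to the GPU
    return dev;
}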

StealthHawk
07-15-03, 07:14 AM
Originally posted by Geforce4ti4200
One of my friends says the fx5200 does not support hardware-level dx9; it emulates it in software, thus making any dx9 games really slow.

Untrue. But the gfFX5200 is slow when running DX9 shaders anyway ;)

Another friend said the ti4200 eats the fx5600 both with and without fsaa, and that only the radeon cards and the fx5800 or higher have fast fsaa.

Untrue. The gfFX5600 should be faster than the gf4Ti4200 with FSAA; without FSAA, the gf4Ti4200 will win. Also, with AF enabled the gfFX5600 is most likely faster than the gf4Ti4200, for two reasons: 1) NVIDIA optimizes AF for the gfFX line; 2) gf4Ti cards take a huge performance hit from enabling AF in D3D compared to other cards.

Is the 256mb fx5600 even worth $129 used?

That depends on what card you have. If you have a gf4, then no.
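
One concrete way to check the hardware-dx9 claim is to query the caps the driver reports for the HAL device; an FX5200 reports 2.0-class shader versions there, i.e. real DX9 hardware shaders. A minimal sketch (PrintShaderCaps is just an illustrative name):

#include <d3d9.h>
#include <stdio.h>

// Print the vertex and pixel shader versions the driver claims for the
// default adapter's hardware (HAL) device.
void PrintShaderCaps(void)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        printf("Vertex shader version: %u.%u\n",
               (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
               (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));
        printf("Pixel shader version:  %u.%u\n",
               (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
               (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    }
    d3d->Release();
}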

extreme_dB
07-15-03, 08:23 AM
Originally posted by Geforce4ti4200
Is the 256mb fx5600 even worth $129 used?

Not in terms of performance, but the market value is a lot higher so it could be sold at a profit. :D

Geforce4ti4200
07-15-03, 08:08 PM
Ok thanks guys for replying.


ti4200 - I have this now, and having no dx9 is gonna limit or prevent me from running games in the future, perhaps as early as 2005 when 90% of gamers have a dx9 card.

fx5200 - too slow for 1600x1200, 'nuff said.

fx5600 256mb - a lot like a stock ti4200 if the fx5600 is overclocked, so I'll lose very little performance compared to my ti4200, but I get twice the ram, less of a hit with aa and af, and of course dx9.

radeon9700 NON-pro - slowest decent card from ati, but overkill according to my friend; don't wanna spend $220 on one of those.

9600 pro - slower than a ti4200 and costs about $160 too...


I could always try out the fx5600; if I hate it that much, I'll resell it and most likely won't take a loss. Also, I may just get a ti4600, and if that thing overclocks well, I'll be at 15k marks or very close to the performance of a radeon9700 NON-pro :)