Originally Posted by msxyz
I believe you are right.
But Nvidia should have dropped the buffer compression technology (as they did on the FX 5200) and integrated a true 4x1 pipeline design instead. It would have helped a lot in shading-intensive apps and improved the card's overall efficiency.
The 14.4 GB/s bandwidth of the Ultra is a little overkill for a 2x2 card running at almost the same speed as its memory. The GeForce 3/4, in comparison, had twice the pipelines, yet they were not as efficient with AA/aniso as the NV3x line is (even the tiny 5200 Ultra manages to close the gap with the GeForce4 Ti when AA/aniso are enabled).
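To put rough numbers on the "overkill" point, here's a back-of-the-envelope sketch. The figures are my own assumptions, not official specs: ~475 MHz core, 2 pixels written per clock for a 2x2 config, and ~8 bytes of framebuffer traffic per pixel (32-bit colour plus 32-bit Z), ignoring texture reads, overdraw, and compression:

```python
# Back-of-the-envelope estimate (assumed figures, not official specs)
core_clock_hz = 475e6    # assumed FX 5700 Ultra core clock
pixels_per_clock = 2     # 2x2 design: 2 pixels/clock for colour writes
bytes_per_pixel = 8      # 4 B colour + 4 B depth, no compression

needed_gbs = core_clock_hz * pixels_per_clock * bytes_per_pixel / 1e9
available_gbs = 14.4     # quoted Ultra memory bandwidth

print(f"needs ~{needed_gbs:.1f} GB/s of {available_gbs} GB/s available")
```

Even under these simplified assumptions the raw pixel output only asks for about half the available bandwidth, which is why the extra headroom mostly pays off when AA/aniso multiply the per-pixel traffic.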
How ironic that the new GeForce 6600 GT has almost the same bandwidth as my FX 5700 Ultra, yet it manages to outperform even previous-generation high-end cards. Honestly, if my old faithful GeForce4 hadn't died, I would have waited a few more months before upgrading.
The 5700 sounds like a nice card (judging from the details above). My MX440 is dead (it fails as soon as it runs anything 3D), and I'm thinking of replacing it with a new card; the 5700 fits my budget.
PS: I need it for a dual-monitor setup too.