Originally posted by Bigus Dickus
A picture is worth a thousand words. Show me a pair of pictures where one was rendered at 96 bit precision, and the other at 128 bit precision, and show me the visual difference. Go ahead... I'm waiting. I don't think Pixar even renders movies at 128 bit precision, but 96 and downsamples to 64 bit for frame storage after rendering.
That table from tech-report is from August 9th, and is simply wrong. NVIDIA supplied their "best guess" as to the R300's capabilities, and it turned out that the R300 was much more powerful/flexible than NVIDIA had believed. Why don't you find some source from, say, the last month or so?
The table from Tech Report is not wrong. The info posted there about the Radeon 9700 and the NV30 comes directly from ATI and NVIDIA respectively. The reviewer, who did a terrific, professional, and unbiased job, asked ATI and NVIDIA directly, and the table of specs outlines exactly what NVIDIA and ATI told him. That's much better info than what gets posted at Beyondfans3d, where everyone knows which company most members and reviewers are biased toward.
A quote from Tech Report:
As I said before, I've read up on both chips and talked to folks from both NVIDIA and ATI in an attempt to understand the similarities and differences between these chip designs. Both designs look very good, and somewhat to my surprise, I've found very few weaknesses in the ATI design, despite the fact it's hitting the market well before NVIDIA's chip. There are some differences between the chips, however, and they point to different approaches taken by the two companies. Most of my attention here is focused on the pixel pipeline, and the pixel shaders in particular, because that's where the key differences seem to be
A picture is worth a thousand words. Show me a pair of pictures where one was rendered at 96 bit precision, and the other at 128 bit precision, and show me the visual difference.
I agree too, a picture is worth a million words, but you will need to wait for the NV30 demos to see what kind of quality and precision you can create with its more advanced pixel shader technology in CineFX.
As I said, ATI's demos were 64-bit! Very weird, right? Probably because they were short on time when designing the demos, or because the Radeon 9700 only supports 64 bits and not the 96 bits they claim, or because the card simply doesn't have enough power to push a real-time demo at more than 64 bits. The last one seems to be what really happened, because it makes no sense to advertise a card with 96-bit color and then show 64-bit demos.
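Just to put numbers on what those bit counts mean, here is a minimal sketch, assuming the usual reading of these marketing figures: "64-bit color" = 4 channels of FP16, "128-bit color" = 4 channels of FP32 (the 96-bit mode would be 4 channels of FP24, which Python cannot represent directly, so it is left out):

```python
import struct

def quantize(x, fmt):
    """Round x to the nearest value representable in the given
    struct format: 'e' = FP16 half float, 'f' = FP32 single."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

value = 0.7  # an ordinary mid-range color intensity

# FP16 (one channel of "64-bit color") lands ~2e-4 away from 0.7;
# FP32 (one channel of "128-bit color") lands ~1e-8 away.
print("FP16 stores 0.7 as", quantize(value, 'e'))
print("FP32 stores 0.7 as", quantize(value, 'f'))
```

So per channel, FP16 can only get within about 2×10^-4 of a mid-range intensity, while FP32 gets within about 10^-8. Whether that gap is actually visible on screen is exactly what's being argued here.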
But just for reference, let's see what ATI has done in their 64-bit demos...
And now what NVIDIA claims the NV30 can do...
Wow! Just say it!!! Impressive? hehe
I tell you, if NVIDIA backs up its claims and demonstrates them on real hardware with a demo of that kind of quality, there will not be a single person on the planet in the professional 3D industry, CAD engineers, animators, or game developers, who will not RUN to buy an NV30. hehe. And even gamers will not resist the NV30, just to have the most powerful video card on the planet.
If the NV30 is much, much faster than the Radeon 9700 Pro in DirectX 9 (which I believe) and has the kind of quality NVIDIA is claiming, I have no doubt that it will be possible to see true cinematic quality in real time, for the first time in the computer industry!
Pixar uses 64 bits in their productions, but movies are very different from games. The fact is that cinematic quality in real time needs more precision, more colors, and more accuracy than movies, which are not real time!! See? They are prerendered shots. That's why John Carmack asked for 64 bits of color.
A still photo in 32 bits can look as good as one in 128 bits; compare a 32-bit prerendered picture on your Radeon 2 with the sports car demo ATI made on the Radeon 9700 in 64 bits.
64 bits is more than enough for Pixar's shots, because they are prerendered: the pictures you see in the movie are not real time. The quality of Hollywood computer-graphics movies is hand-tweaked with paint programs like Photoshop or post-processing tools like Shake. You would be amazed by the poor quality and the graphics errors in the original computer-rendered shots compared with the final shots you see in the movie. See?
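A rough sketch of that real-time versus prerendered argument, assuming a simplified model where each rendering pass re-rounds the framebuffer to its storage format (the 50/50 blend loop is a hypothetical stand-in for multi-pass rendering, not any real pipeline):

```python
import struct

def quantize(x, fmt):
    """Round x to the nearest value the storage format can hold:
    'e' = FP16 half float, 'f' = FP32 single."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

def repeated_blend(fmt, passes=100):
    """Blend 50/50 with a fixed source color over many passes,
    re-rounding the result to the framebuffer format each pass."""
    color = quantize(0.6, fmt)
    src = quantize(0.7, fmt)
    for _ in range(passes):
        color = quantize(0.5 * color + 0.5 * src, fmt)
    return color

# A prerendered frame is computed once at high precision and can be
# retouched by hand; a real-time frame is rebuilt pass after pass, so
# the per-pass rounding error never washes out.
print("FP16 drift from 0.7:", abs(repeated_blend('e') - 0.7))  # ~2e-4
print("FP32 drift from 0.7:", abs(repeated_blend('f') - 0.7))  # ~1e-8
```

In this toy model the FP16 framebuffer stays stuck roughly 2×10^-4 away from the intended color no matter how many passes run, while FP32 stays within about 10^-8.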
But for cinematic-quality games or cinematic real-time demos, you will need no less than the precision of NVIDIA's CineFX pixel and vertex shaders. Hopefully ATI will do something in the future (an R350?) to match the NV30's CineFX quality and its possibly higher performance.
I predict that the NV30 will be able to show Final Fantasy at about 9/10 of the real thing in real time, which will be impressive, not because of image quality (the NV30 has no loss in image quality) but because of the huge performance needed to do it.
And surely 10/10 in the future with a more powerful GPU, like the NV35!
Like the interview with NVIDIA's CEO, where the interviewer asked:
how much difference will we see between NV30 cinematic quality and what we have today?
-> it will be the difference between what we see in movies and what we see in games