Old 09-25-02, 01:45 PM   #31
Bigus Dickus
GF7 FX Ti 12800 SE Ultra
 
Join Date: Jul 2002
Posts: 651

Quote:
Originally posted by Nv40


the table from Tech Report is not wrong, the info posted there for the Radeon 9700 and NV30 comes directly from ATI and Nvidia respectively. the guy, who did a terrific, professional and unbiased review, asked ATI and NVIDIA directly, and the table of specs outlines what Nvidia and ATI told him. much better info than what is posted at Beyondfans3d, where everyone knows which company most members and reviewers are biased toward..
Um... no. The NVIDIA numbers in that table were from the CineFX paper, which also included the "R300" numbers. The "R300" numbers NVIDIA used in the CineFX paper were simply the DX9 minimum requirements, which, it was later found, ATI had surpassed just as NVIDIA had. The numbers are wrong, period.

Quote:
as I said, ATI's demos were 64 bit! very weird, right?
probably because they were short on time when designing the demos,
or because the Radeon 9700 only supports 64 bits, not the 96 bits they claim,
or because there wasn't enough power in the card to push a demo in realtime at more than 64 bits
Or, probably, because there is little if any visible difference in final image quality between 64 bit internal precision and 96/128 bit internal precision for the number of shader ops used in that car demo? That's another reason the 128 bit vs. 96 bit argument is rather pointless: little if any visible difference, at least for any application that will be run during the lifetime of these cards. Besides, I believe the texture read stage, where the 9700 does use 128 bit, is a key point in the pipeline where accuracy degrades.
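Here's a rough Python sketch (my own toy model, nothing from ATI or NVIDIA) of why that is: round the intermediates of a short chain of multiply-add "shader ops" to roughly FP16 precision (~10 mantissa bits, i.e. the 64 bit per pixel case) versus roughly FP32 (~23 bits, the 128 bit case), then quantize both to the 8 bit per channel framebuffer you actually look at. The mantissa widths and the number of ops are just illustrative assumptions, but for a short chain like this the two results either match or differ by a single 8 bit step, which is the kind of "difference" nobody is going to spot in a demo.

[code]
import math, random

def round_mantissa(x, bits):
    """Crudely round x to the given number of mantissa bits."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)               # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** bits
    return math.ldexp(round(m * scale) / scale, e)

def shader_chain(color, ops, bits):
    """Run a few multiply-add 'shader ops' at reduced internal precision."""
    acc = color
    for w, b in ops:
        acc = round_mantissa(acc * w + b, bits)
    return acc

def to_8bit(v):
    """Clamp to [0, 1] and quantize to the 8 bit per channel framebuffer."""
    return round(max(0.0, min(1.0, v)) * 255)

random.seed(1)
ops = [(random.uniform(0.5, 1.5), random.uniform(-0.1, 0.1)) for _ in range(8)]

mismatches, worst = 0, 0
for i in range(1000):
    c = i / 999.0                      # input color channel in [0, 1]
    lo = to_8bit(shader_chain(c, ops, 10))   # ~FP16 internal precision
    hi = to_8bit(shader_chain(c, ops, 23))   # ~FP32 internal precision
    if lo != hi:
        mismatches += 1
        worst = max(worst, abs(lo - hi))

print("differing 8-bit outputs:", mismatches, "of 1000, worst gap:", worst)
[/code]

Pile on hundreds of dependent ops and the gap starts to matter, but for the op counts in a demo like that car one, the 8 bit output swallows almost all of it.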

Quote:
but just for reference, let's see
what ATI has done in their 64 bit demos..

http://www.tech-report.com/etc/2002q...s/index.x?pg=3

and now what Nvidia claims the NV30 can do..

http://www.tech-report.com/etc/2002q...s/index.x?pg=6
So you're comparing a screenshot that was actually rendered on one piece of hardware against an off-line rendered image that NVIDIA claims they might come close to... if the NV30 ever shows up, that is.

Quote:
Pixar uses 64 bits in their productions, but movies are very different from games. the fact is that cinematic quality in realtime needs more precision, more colors and more accuracy than movies,
which are not realtime!! see? they are prerendered shots,
that's why John Carmack asked for 64 bits of color.. see?
OMG, I don't even know how to address that statement. How about this: when a movie is viewed, the frames are presented in real time to the viewer, regardless of whether each frame was computed within the interval it occupies on screen or not. No, that's probably too complex for you. How about this: games and movies alike present the viewer with a sequence of frames, and (assuming the framerate is the same) the depth of color precision will have an identical effect as perceived by the viewer in both. No, that's probably too complex as well.

How about this: yOu R teh retarted... DUH!!!

Quote:
those pictures you see in the movie are not realtime... see?
Yes, uOy R teh retetarted. When is the last time you watched a movie frame by frame... slowly? How about a game? It's the SAME DAMN CONCEPT.

Quote:
but for cinematic quality games or cinematic realtime demos you will need no less than Nvidia CineFX pixel shader/vertex shader precision.
Says who? Oh, right, you're an expert on 3D rendering techniques? Please. Try to wrap your head around this: if the original movie scene was rendered by Pixar at 64 bit precision, then it will look identical when rendered by a Turing-equivalent, 64 bit capable graphics card. IDENTICAL.