to mr. biggus
The table in the tech report, as the authors say, comes directly from what NVIDIA and ATI told them.
What part of this don't you understand?
It's not that the authors were so stupid as to only ask NVIDIA about what the Radeon 9700 really is. Stay with your Beyond3D fan gurus,
who claim the NV30 has a 128-bit bus and will be delayed to February just because they say so..
The fact is you don't like the superiority of the NV30's CineFX,
but that doesn't change the truth ...period
Originally posted by jbirney
You really don't have a clue how complicated movie scenes are if you make a statement like that. Taken from:
Here are some of the stats on the FF movie:
Number of Sequences = 36
Number of Shots = 1,336
Number of Layers = 24,606
Number of Final Renders (Assuming that everything was rendered once) = 2,989,318
Number of Frames in the Movie = 149,246
Average number of shots per sequence = 37.11
Average number of rendered layers per shot = 18.42
Average number of frames per shot = 111.71
Estimated average number of render revisions = 5
Estimated average render time per frame = 90 min
Shot with the most layers = (498 layers)
Shot with the most frames = (1899 frames)
Shot with the most renders [layers * frames] = (60160 renders)
Sequence with the most shots = (162 shots)
Sequence with the most layers = AIR (4353 layers)
Sequence with the most frames = (13576 frames)
Using the raw data (not the averages) it all adds up to 934,162 days of render time on one processor. Keep in mind that we had a render farm with approximately 1,200 procs.
And this is going to run even close to real time on the nV30? Rrrriiiiigggghhhhtttttt
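For what it's worth, the stats in that quote are internally consistent. A quick Python sketch, using only the numbers from the quote above (nothing here is assumed beyond them):

```python
# Sanity check of the Final Fantasy render stats quoted above.
sequences = 36
shots = 1_336
layers = 24_606
frames = 149_246

# The averages quoted in the post fall out of the raw counts:
print(round(shots / sequences, 2))   # 37.11 shots per sequence
print(round(layers / shots, 2))      # 18.42 layers per shot
print(round(frames / shots, 2))      # 111.71 frames per shot

# 934,162 single-processor days spread across the ~1,200-proc farm:
print(round(934_162 / 1_200, 1))     # 778.5 days of wall-clock farm time
```

So even with 1,200 processors, the movie represented over two years of continuous farm rendering.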
CPUs are hundreds of times slower at computing graphics than video cards, and next to the NV30 they could be thousands of times slower.
Have you ever used your computer for anything other than gaming? Something like 3D graphics? I can preview some scenes in real time with Maya (a 3D rendering package) on my GeForce4, at nearly the same quality (sometimes better!!) as the final SOFTWARE-RENDERED image, in much, much less time: 20 frames per second (GeForce4) vs. 1 frame per minute (Athlon XP 1900+). That's a huge difference (when heavy reflections are not used).
It's funny to see in some scenes how great my scenes look on my GeForce4 in hardware rendering mode in real time (24 images per second) vs. software rendering done by my Athlon XP 1900+, which takes minutes to render just ONE IMAGE!!!
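The speedup those preview numbers imply is easy to work out (my arithmetic, using only the frame rates I stated above):

```python
# Speedup implied by the Maya preview comparison above:
# GeForce4 hardware preview at 20 fps vs software at 1 frame per minute.
hw_frames_per_minute = 20 * 60   # GeForce4: 20 fps -> 1200 frames per minute
sw_frames_per_minute = 1         # Athlon XP 1900+: 1 frame per minute
print(hw_frames_per_minute // sw_frames_per_minute)  # 1200 -> ~1200x faster
```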
Multi-million-dollar CPU render farms vs. just one video card? Sounds crazy, right? But this is something that is going to happen in a couple more years. CPU software rendering will be obsolete for graphics effects; video cards alone will do it.
You missed my point. I'm not saying the NV30 will do the full 1h 30min movie in real time; that would require a lot more than a computer + video card, plus millions of hours of NVIDIA Cg programmers recreating
the shaders used in the movie on the NV30.
What I'm saying is that the NV30 will be able to recreate a very short cut scene
from the movie Final Fantasy: The Spirits Within in real time
(like they have already done on a GeForce4,
and like ATI has done with the Lord of the Rings piece displayed
on the Radeon 9700), but rather than
1/10 the quality of the movie, it will be 9/10 on the NV30,
thanks to its more powerful pixel shaders, color accuracy, and
most importantly, performance.
The NV30 and Radeon 9700 have enough precision to render a still shot of Toy Story 2 (64 bits), see?
The color precision IS THERE.
What is not there is the performance needed to render a 2+ million polygon scene with that kind
of quality on just one computer with one video card, see?
NV30 CineFX is well known to have even more pixel shader precision,
enough to render ANYTHING!! Read that: any still shot from any CG movie ever made that you can name! See? What is not there is the performance
to move a big 10/10 scene of a movie like Final Fantasy, at least not in real time, which is 24fps.
To move a 10/10 scene of a CG movie like Final Fantasy in real time,
we are as close as the end of 2003. Judging by JC's comments, it would be possible by the end of next year with the latest CineFX-class hardware available from NVIDIA and probably ATI, which would be the NV35 or R350 in multi-chip configurations for professional boards.. see?
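To put "real time" in perspective, combine the 90-minute average render time per frame from the quoted stats with the 24fps target (my arithmetic, not a figure from any report):

```python
# Gap between offline rendering and real time, from the quoted numbers:
# 90 min average render time per frame vs a 24 fps real-time target.
offline_seconds_per_frame = 90 * 60   # 5400 s per frame on the farm's CPUs
target_frames_per_second = 24         # real time, as stated above
gap = offline_seconds_per_frame * target_frames_per_second
print(gap)  # 129600 -> roughly a 130,000x per-frame gap to close
```

That is the size of the jump the hardware still has to make for a full-detail movie scene, which is why only cut-down versions run in real time today.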
Take your time to read this article, and see for yourself how
close the video card industry is to being at the level of the latest
CG Hollywood movies.
So ATI's R300 and NVIDIA's NV30 will comprise the first generation of dedicated graphics chips capable of cinematic quality shading. They won't be capable of rendering all of the best effects seen in recent movies with all of the detail in each scene in real time, but they should be able to deliver some exceptionally compelling graphics in real time. Gamers had better hold on to their seats once games that use these chips arrive. And these chips will challenge entire banks of servers by rendering production-quality frames at near-real-time speeds. Graphics guru John Carmack's recent Slashdot post on the subject anticipates replacing entire render farms with graphics cards within 12 months:
Video cards can already match some shots done by Pixar
The good thing is that NVIDIA's NV30 CineFX already has the image quality needed to RECREATE any still picture of any computer graphics movie, ANY!!, without any loss in IQ, unlike the R300.
However, when you are going to show a demo animation that runs in real time (24fps), obviously if you don't have the performance to move that much detail at decent frame rates, you will need to lower the image quality to prove the power of your video card in real-time animation.. see?
Conclusion on the NV30's superiority..
For gamers it means higher performance in today's games,
drivers that work flawlessly out of the box, and the best video card for Doom 3, like JC has said..
But for the professional computer graphics artist,
like 3D animators, it means support for much longer pixel shaders per pass + full-time 128-bit color precision + greater performance
with solid, stable drivers. And it will mean much, much more
for top game developers, who have already stated that their next projects
will use the NV30 as the standard.
But like a wise man once said, a picture is worth a thousand words;
with the NVIDIA NV30, demos alone will show clearly the superiority of its hardware technology. Like the title says
.. MUCH more than the R300