Old 09-26-02, 02:17 AM   #35
nutball
International God of Sex

Quote:
Originally posted by StealthHawk


seriously, i thought we had gone over this fact 1 million times already that the current gen R300 and NV30 won't be able to do true cinematic reality quality rendering. but the fan boys keep bringing it up
You're basically right, though I think you've left some important words out of that assertion.

NV30/R300 won't be able to do true cinematic reality quality rendering in real-time (or anything like real-time).

Both parts have pretty much all the functionality in the pipeline necessary to render frames with the same quality as a Pixar movie[*]. All you have to do is spend the time programming them.

What they won't do is run fast enough for interactive use. OK, so Final Fantasy takes 9,000 hours or 90,000 hours to render on R300/NV30, rather than 900,000 hours on CPUs. That's not real-time, is it? That's what the fan-boys seem to be closing their eyes, ears and brains to.
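
Back-of-envelope, just to show how far off real-time even the optimistic figure is (the film length and frame rate below are my own assumptions, not numbers from anyone in this thread):

Code:
#include <stdio.h>

int main(void)
{
    /* Assumed figures: the film length and frame rate are my own
       back-of-envelope numbers, not anything quoted in this thread. */
    const double film_minutes   = 106.0;   /* approx. runtime of the film */
    const double fps            = 24.0;    /* film frame rate */
    const double frames         = film_minutes * 60.0 * fps;

    const double render_hours   = 9000.0;  /* the optimistic GPU figure above */
    const double secs_per_frame = render_hours * 3600.0 / frames;

    /* Interactive use needs roughly 1/30 s per frame. */
    printf("frames           : %.0f\n", frames);
    printf("seconds per frame: %.1f (about %.0fx too slow for 30 fps)\n",
           secs_per_frame, secs_per_frame * 30.0);
    return 0;
}

Call it three or four minutes per frame, when interactive use needs a new frame every 30th of a second.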
[*] Actually there is one very important piece of functionality which is missing from R300, and which, from what I've heard from NVIDIA people, I suspect is missing from NV30 as well. That is frame-buffer read-back into fragment programs. The lack of this makes blending in floating-point render buffers a very tricky operation, and basically makes multi-pass rendering into floating-point buffers effectively impractical. This could well be a killer; I'm not sure how much Pixar-style renderers rely on this sort of functionality.
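
For anyone wondering what the workaround looks like, here's a rough sketch of the copy-to-texture shuffle you're left with. The function structure is illustrative only, and I'm glossing over the vendor extensions you'd actually need for floating-point precision:

Code:
/* A rough sketch of the copy-to-texture workaround.  The fragment
   program can't read the buffer it is writing to, so the previous
   pass's result is copied into a texture and fed back in as an
   ordinary texture input.  Function names are illustrative, and the
   vendor extensions needed for true floating-point buffers are
   glossed over entirely. */
#include <GL/gl.h>

void accumulate_pass(GLuint prev_result_tex, GLsizei width, GLsizei height)
{
    /* 1. Bind the copy of the previous pass so the fragment program
          can sample "what's already in the buffer". */
    glBindTexture(GL_TEXTURE_2D, prev_result_tex);

    /* 2. Bind the fragment program for this pass and draw (omitted).
          The program does the "blend" itself, combining its result
          with the sampled texture, instead of relying on fixed
          function blending into a float buffer. */
    /* draw_fullscreen_quad(); */

    /* 3. Copy the freshly rendered frame buffer back into the texture
          so the next pass can read it.  This per-pass copy is exactly
          the overhead that read-back inside the fragment program
          would remove. */
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
}

The point is step 3: every pass pays for a full-screen copy just so the next pass can see what has already been rendered, which is precisely what read-back inside the fragment program would make unnecessary.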

From what I understand, frame-buffer read-back will very likely be a requirement in the OpenGL 2.0 spec (called for by the software developers, strongly resisted by the hardware developers). I presume it will be required for DX10 as well, but that's pure speculation. If true, neither R300 nor NV30 would be OpenGL 2.0-capable. Interesting.
__________________
Got nuts?