09-25-02, 03:45 PM   #32
SnakeEyes

I'm not even sure the final output will end up looking that different, since (as someone else pointed out earlier) the output devices can't really reproduce that much color anyway. Internal processing is where the bit depth matters, simply to maintain accuracy as blends and other operations are applied.
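
Just to illustrate the kind of rounding a single blend picks up at low precision (throwaway numbers, not anything a real GPU does internally), something like:

```c
#include <stdio.h>

/* Illustration only: blend two channel values 50/50 at 8-bit integer
 * precision vs. floating point. The integer path has to land on a whole
 * step, so the half-step from the true average is lost immediately. */
int main(void)
{
    unsigned char a8 = 100, b8 = 103;
    unsigned char avg8 = (unsigned char)((a8 + b8) / 2);   /* 101; true average is 101.5 */

    float af = 100.0f, bf = 103.0f;
    float avgf = (af + bf) / 2.0f;                          /* 101.5, kept exactly */

    printf("8-bit blend: %u\n", avg8);
    printf("float blend: %.1f\n", avgf);
    return 0;
}
```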

Also, too high an internal bit depth is just wasted bandwidth, imo. When the output pixel has to be squeezed into the more limited range available on, say, a computer monitor, going past a certain precision internally won't produce any difference in the final adjusted output. E.g. 0.898 calculated internally vs. 0.878 internally: if the output device only resolves tenths, both come out as 0.9. Maybe a dumb example, but I think the idea comes across (or I hope so, anyhow).
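
That quantization step looks something like this (the tenths-resolution output device is made up, just to match the example above):

```c
#include <stdio.h>
#include <math.h>

/* Hypothetical output device that only resolves tenths: snap an internal
 * value to the nearest representable step. Values that differ internally
 * can still land on the exact same output level. */
static double quantize_tenths(double x)
{
    return round(x * 10.0) / 10.0;
}

int main(void)
{
    printf("%.3f -> %.1f\n", 0.898, quantize_tenths(0.898));  /* 0.9 */
    printf("%.3f -> %.1f\n", 0.878, quantize_tenths(0.878));  /* 0.9 */
    return 0;
}
```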

I'm not necessarily arguing that 128-bit color isn't useful or effective, because I don't really know the true limits of the output devices or the current internal workings of the GPUs in question. The case for it has been made time and time again, so I won't rehash it here: basically, rounding errors introduced at each bit-depth conversion accumulate as more passes are made through the pipeline during processing.
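
A made-up multi-pass example of that accumulation: halve a channel value five times, then double it five times, which should be a wash:

```c
#include <stdio.h>
#include <math.h>

/* Toy multi-pass pipeline: halve the channel five times, then double it
 * five times. In float the value comes back exactly; at 8 bits each halving
 * rounds to a whole step, and the doublings amplify whatever error has
 * accumulated. Purely illustrative, not any real GPU's pipeline. */
int main(void)
{
    unsigned char v8 = 201;
    float vf = 201.0f;

    for (int i = 0; i < 5; i++) { v8 = (unsigned char)lround(v8 * 0.5); vf *= 0.5f; }
    for (int i = 0; i < 5; i++) { v8 = (unsigned char)(v8 * 2);         vf *= 2.0f; }

    printf("8-bit after the round trip: %u\n", v8);   /* ends up well away from 201 */
    printf("float after the round trip: %.1f\n", vf); /* 201.0 */
    return 0;
}
```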