fragment program precision loss?
This bit of code uses a pixel shader to do a lookup into a 3d texture based on the color at that pixel of the 2d texture (i.e. a function based on the color of the 2d texture).
const char *chromaFragmentProgram = ""; // left empty on purpose; the actual program text is in the attachment.
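For reference, a minimal sketch of what such a dependent-lookup program might look like (this is my reconstruction, assuming ARB_fragment_program syntax; the poster's actual program is only in the attachment):

```c
/* Hypothetical sketch: sample the 2D texture, then use its color as the
 * coordinate into the 3D lookup texture. */
const char *chromaFragmentProgram =
    "!!ARBfp1.0\n"
    "TEMP col;\n"
    /* Sample the 2D texture on unit 0 at the interpolated coordinate. */
    "TEX col, fragment.texcoord[0], texture[0], 2D;\n"
    /* Use that color as the (s, t, r) coordinate into the 3D texture on unit 1. */
    "TEX result.color, col, texture[1], 3D;\n"
    "END\n";
```

With an identity 3D texture, this program should be a pass-through for the 2D texture's colors.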
Assume that the 3d texture is bound with GL_TEXTURE_WRAP_S (etc.) set to GL_REPEAT (i.e. the default).
Now, initialize the 3d texture as an identity function:
// this should be bound on texture unit 1
unsigned char cubeTex[N][N][N][3]; // N^3 RGB entries; N is whatever size the attached program used
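Expanded into a self-contained sketch, the identity fill might look like this (N = 32 is my assumption; the real dimensions are only in the attachment). Each channel value maps back to itself, so the lookup should be a no-op:

```c
/* N^3 RGB identity lookup table (N = 32 is an assumed size). */
#define N 32
static unsigned char cubeTex[N][N][N][3];

static void initIdentityCube(void)
{
    /* The red (s) coordinate varies fastest, matching GL's texel ordering. */
    for (int b = 0; b < N; b++)
        for (int g = 0; g < N; g++)
            for (int r = 0; r < N; r++) {
                cubeTex[b][g][r][0] = (unsigned char)(r * 255 / (N - 1));
                cubeTex[b][g][r][1] = (unsigned char)(g * 255 / (N - 1));
                cubeTex[b][g][r][2] = (unsigned char)(b * 255 / (N - 1));
            }
}
```

The table would then be uploaded on texture unit 1 with glTexImage3D(GL_TEXTURE_3D, 0, GL_RGB8, N, N, N, 0, GL_RGB, GL_UNSIGNED_BYTE, cubeTex).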
Now, take a 2d texture that 'saturates' at least one of the color channels, and bind that as the texture on texture unit 0.
When you draw this 2d texture across the screen with the fragment program enabled, you'd expect to see the 2d texture reproduced exactly. However, wherever the 2d texture has a channel saturated at 255, you get a wrap-around in color, and the pixel appears black instead.
This implies there is a precision loss somewhere in the fragment program! (An annoying one)
Can you confirm this for me, or suggest a workaround?
A sample program is attached.
... it won't compile, since it is missing a 'texture' library, but you should be able to easily adapt it.
This could be related to the specification... which I haven't yet had time to go over again...
In any case, the ATI 9700 with latest linux drivers suffers from the same problem.
Annoying; just like transforming by the identity matrix, this should give you back exactly what you put in...
Ah, yes, duhh..
The card in question is a GeforceFX 5600 Ultra, in an 8X AGP slot (in 8x AGP mode, not that it mattered for this bug) on a Supermicro X5DA8, running driver 4363 on RH Linux 8.0.
Copyright ©1998 - 2014, nV News.