I work on a program that runs on several platforms (HP, Sun, Linux, IRIX). I access it through an X server emulator (Exceed 3D) from a Windows 2000 PC.
Everything worked fine until today, when I used a Windows PC with Exceed 3D and a Quadro FX 540. Now glReadPixels returns strange values, as shown by the following code:
glReadBuffer(GL_FRONT);
glReadPixels(xmouse - 1, ymouse - 1, 3, 3, GL_DEPTH_COMPONENT, GL_FLOAT, z_picking);
which, on a correctly working station, returns:
z_picking = (0.6561477, 0.6561477, 0.6561477, 0.6561477, 0.6561399, 0.6561399, 0.6561399, 0.6561399, 0.6561399)
and on the NVIDIA PC:
z_picking = (-1.222404e+37, -1.222404e+37, -1.222404e+37, -1.222404e+37, -5.576056e+27, -5.576056e+27, -1.222404e+37, -5.576056e+27, -5.576056e+27)
As a consequence, I can't locate a point in my OpenGL scene using gluUnProject, and I'm not allowed to change that code...
Am I doing something wrong, or is this a driver problem? Thanks a lot for the time you'll spend answering me, and please forgive my poor English.