TEXTURE_COMPARE_FUNC broken on Mac for 32bit prog?

Hi all,

I recently developed an application on Mac OS X 10.6.5, compiled in 64-bit. When I compiled it in 32-bit, however, my shadow maps were broken. After tracking down the error, I found it in the texture compare function. I create my depth texture the following way:


  // create a separate depth texture 
  glGenTextures(1,&textures[DEPTH_TEX]);
  glBindTexture(GL_TEXTURE_2D, textures[DEPTH_TEX]);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);


  // No need to force GL_DEPTH_COMPONENT24, drivers usually give you the max precision if available
  glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, _width, _height, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, 0);

The shadow map lookups themselves are done in a GLSL shader, where I use the built-in function "shadow2D(…)". In 64-bit mode everything was fine, but when I moved to 32-bit this function always returned 0. To verify this, I replaced GL_LEQUAL with GL_ALWAYS or GL_NEVER and watched for the change in both the 32-bit and 64-bit builds. GL_ALWAYS should force shadow2D to always return 1. The 64-bit version behaved as the spec says, but the 32-bit version still always returned 0.
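To make the lookup concrete, here is a minimal sketch of the kind of shader I mean (the sampler and varying names are placeholders, not my actual code):

```glsl
#version 120

uniform sampler2DShadow shadowMap;  // bound to the depth texture above
varying vec4 shadowCoord;           // fragment position in light space, already divided by w

void main()
{
    // shadow2D compares shadowCoord.z against the stored depth using
    // GL_TEXTURE_COMPARE_FUNC; with GL_NEAREST it returns 0.0 or 1.0.
    // With GL_ALWAYS set as the compare func, .r should always be 1.0.
    float lit = shadow2D(shadowMap, shadowCoord.xyz).r;
    gl_FragColor = vec4(vec3(lit), 1.0);
}
```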

Does anybody know a solution for this?

Oh, by the way, some information about the system:
Mac 10.6.5
NVIDIA GT 120
QT-Application

I tested my code on Windows 7 in 32-bit and still have the same issue. Arghhh!

At least now I know there is something wrong in my code :S
