Depth Tex GL3.0 GLX_CONTEXT_FORWARD_COMPATIBLE_BIT

Hi there,

I’m trying to create a depth texture on GL 3.0. Everything works fine if I do not use GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB when creating the GL 3.0 context, but with the flag active the following code gives me a GL error:


glGenTextures(1, &id);
glBindTexture(GL_TEXTURE_2D, id);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

checkErrors("Before: ");
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, 256, 256, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);
checkErrors("After: ");

After: GL_INVALID_ENUM : An unacceptable value is specified for an enumerated argument. The offending function is ignored, having no side effect other than to set the error flag.

I’ve checked the GL spec and I don’t think an invalid enum should be thrown in a forward-compatible context.

Am I wrong? If so, how can I create a depth texture with a forward-compatible context?

I’m running Ubuntu with the following GPU:
Graphics report :
Renderer : GeForce GTS 250/PCI/SSE2
Vendor : NVIDIA Corporation
Version : 3.0.0 NVIDIA 185.18.36
Shading Language Version : 1.30 NVIDIA via Cg compiler

thank you
best regards
Rui

try GL_FLOAT instead of GL_UNSIGNED_BYTE.
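
That is, the same call with just the type argument changed:

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, 256, 256, 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);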

Hi there,

I’ve tried with GL_FLOAT, and it did not work.

thanks

Looks good to me. I have almost exactly this working on NVidia on Linux. You can try GL_UNSIGNED_INT, but that’s not it.

You can always update your driver to the latest, but I doubt that’s it either.

My guess is that the invalid enum has something to do with the “context” in which you’re making this call. Do you have a 2D texture bound, or is it maybe a RECT instead? Which texture unit is active? Have you maybe already assigned texture data to this texture without realizing it?
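
One quick way to sanity-check that is to dump the relevant texture state right before the glTexImage2D call; a minimal sketch (variable names are just for illustration, printf needs <stdio.h>):

GLint boundTex = 0, activeUnit = 0;
glGetIntegerv( GL_TEXTURE_BINDING_2D, &boundTex );   // texture object bound to GL_TEXTURE_2D
glGetIntegerv( GL_ACTIVE_TEXTURE, &activeUnit );     // currently active texture unit
printf( "2D binding: %d, active unit: GL_TEXTURE%d\n", boundTex, activeUnit - GL_TEXTURE0 );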

So I’d back these few lines of texture creation code out to a small test program and establish what is and isn’t causing a problem there, and then go tracing things you might be doing wrong in your program.

Here’s one such test program. This works fine (prints nothing) here on NVidia 190.32 on Linux:

#include <stdio.h>
#include <stdlib.h>
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glut.h>

int main( int argc, char *argv[] )
{
  glutInit( &argc, argv );
  glutInitDisplayMode( GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB ); 
  glutCreateWindow( "GL Window" );

  GLuint tex;
  glGenTextures( 1, &tex );
  glBindTexture( GL_TEXTURE_2D, tex );
  glTexImage2D( GL_TEXTURE_2D,
                0,                     // Level     
                GL_DEPTH_COMPONENT24,  // Internal format
                256, 256,              // Res
                GL_FALSE,              // Border    
                GL_DEPTH_COMPONENT,    // Format    
                GL_UNSIGNED_BYTE,      // Type      
                0 ) ;

  if ( glGetError() != GL_NO_ERROR )
    printf( "Oops!\n" );

  return 0;
}
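
For reference, it builds here with something like the following (the file name is just whatever you saved it as):

gcc depthtest.c -o depthtest -lglut -lGL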

Thank you,

I’m not using GLUT; I’m using X directly.
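
In case it helps anyone, the context creation looks roughly like this (a sketch; error handling is omitted, and dpy/fbConfig are assumed to be set up elsewhere, so treat those names as placeholders):

// Needs <GL/glx.h>; the GLX_CONTEXT_* tokens come from GLX_ARB_create_context.
typedef GLXContext (*glXCreateContextAttribsARBProc)
    ( Display *, GLXFBConfig, GLXContext, Bool, const int * );

glXCreateContextAttribsARBProc glXCreateContextAttribsARB =
    (glXCreateContextAttribsARBProc) glXGetProcAddressARB(
        (const GLubyte *) "glXCreateContextAttribsARB" );

int attribs[] = {
    GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
    GLX_CONTEXT_MINOR_VERSION_ARB, 0,
    GLX_CONTEXT_FLAGS_ARB,         GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
    None
};

GLXContext ctx = glXCreateContextAttribsARB( dpy, fbConfig, 0, True, attribs );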

I solved my problem today: it was a driver bug.
It seems the 185 driver does not implement strict OpenGL 3.0 (GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB) contexts correctly.

I have upgraded the Ubuntu NVIDIA driver to 190.53, and it works now.
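
(For anyone who wants to double-check that the forward-compatible bit actually made it into the context, querying GL_CONTEXT_FLAGS after context creation is one way; a minimal sketch:)

GLint flags = 0;
glGetIntegerv( GL_CONTEXT_FLAGS, &flags );
if ( flags & GL_CONTEXT_FLAG_FORWARD_COMPATIBLE_BIT )
    printf( "forward-compatible context\n" );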

thank you
