glTexImage2DMultisample with GL_DEPTH_COMPONENT results in GL_INVALID_VALUE

I don’t know if this is a driver problem or my fault:

NSLog(@"GL_MAX_TEXTURE_SIZE: %d  (%d, %d)", GL_MAX_TEXTURE_SIZE, width, height);
NSLog(@"GL_MAX_SAMPLES: %d", GL_MAX_SAMPLES);
glGenTextures(1, &shadowTextureId);
glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, shadowTextureId);
glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, 4, GL_DEPTH_COMPONENT , width, height, false);
[YALog isGLStateOk:TAG];

The output is:

GL_MAX_TEXTURE_SIZE: 3379  (512, 512)
GL_MAX_SAMPLES: 36183
[YAShadowMap] glTexImage2DMultisample  [OpenGL] GL_INVALID_VALUE

GL_INVALID_VALUE is generated if:

GL_INVALID_VALUE is generated if either width or height is negative or is greater than GL_MAX_TEXTURE_SIZE.
GL_INVALID_VALUE is generated if samples is greater than GL_MAX_SAMPLES.

Both causes can be excluded.

My NSOpenGLPixelFormatAttribute array looks like this:

        NSOpenGLPFAMultisample  ,
        NSOpenGLPFASampleBuffers, 1,
        NSOpenGLPFASamples, 4,

and multisample is enabled:

glEnable(GL_MULTISAMPLE);
 


That isn’t how queries work: you’re printing the enum values themselves (GL_MAX_TEXTURE_SIZE is 3379, GL_MAX_SAMPLES is 36183), not the limits.
You pass the GL_MAX_* enums to glGetIntegerv to ask the driver what the limit actually is, like this:


GLint max_samples = 0;
glGetIntegerv(GL_MAX_SAMPLES, &max_samples);
NSLog(@"GL_MAX_SAMPLES: %d", max_samples);

And because you’re allocating a depth format, the specification says the relevant limit is GL_MAX_DEPTH_TEXTURE_SAMPLES, not GL_MAX_SAMPLES. I’m guessing you’re using an ATI GPU, where the reported limit is 1.
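
Putting that together, a minimal sketch (reusing your shadowTextureId, width and height; GL_DEPTH_COMPONENT24 and the clamped sample count are assumptions on my side, not something your code has to use):

GLint max_depth_samples = 0;
glGetIntegerv(GL_MAX_DEPTH_TEXTURE_SAMPLES, &max_depth_samples);

// Clamp the requested sample count to what depth textures actually support.
GLsizei samples = (max_depth_samples < 4) ? max_depth_samples : 4;

glGenTextures(1, &shadowTextureId);
glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, shadowTextureId);
// A sized depth format (GL_DEPTH_COMPONENT24 here, as an assumption) is the
// safer choice for a multisample texture; also pass GL_FALSE rather than
// false for fixedsamplelocations.
glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples,
                        GL_DEPTH_COMPONENT24, width, height, GL_FALSE);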

Thank You.
I confused a query with an extension availability check.
Also, the real GL_MAX_TEXTURE_SIZE (16384) doesn’t help here, because glFramebufferTexture2D then produces a GL_FRAMEBUFFER_UNSUPPORTED error.
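
For anyone hitting the same thing, here is a rough sketch of the attachment and completeness check I’m working from (shadowFboId is just a placeholder name; the GL_NONE draw/read buffers are the usual requirement for a depth-only FBO):

GLuint shadowFboId = 0;  // hypothetical name, not from the code above
glGenFramebuffers(1, &shadowFboId);
glBindFramebuffer(GL_FRAMEBUFFER, shadowFboId);

// textarget must match the texture's actual target.
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D_MULTISAMPLE, shadowTextureId, 0);

// A depth-only FBO needs its draw/read buffers disabled on many drivers.
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    NSLog(@"Shadow FBO incomplete: 0x%x", status);
}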
