I'm implementing shadow mapping for three platforms: a Nokia N900, an iPad 2, and a common desktop.
I want to compare the depth-buffer precision across all of them, i.e., how many bits the depth buffer has on each platform, and how many bits per channel I can write to the RGBA texture during shadow-map generation.
I'm a little confused here. How do I get these numbers?
Are all of them integers?
For example, on the desktop, if I query the precision with glGetIntegerv:

Code:
GLint dbits = 0;
glGetIntegerv(GL_DEPTH_BITS, &dbits);
qDebug() << "z-buffer bits: " << dbits;

this tells me: 24.
But when I generate my shadow-map FBO, I'm using

Code:
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT32F, shadowMapSize, shadowMapSize);

and it works! Doesn't that mean my z-buffer is 32-bit floating point instead of 24-bit?
Now, concerning the texture generation:
if I query the precision of the color buffer, for example the blue channel:

Code:
GLint bbits = 0;
glGetIntegerv(GL_BLUE_BITS, &bbits);
qDebug() << "b bits: " << bbits;

it gives me: 8.
But again, when I create the depth texture (which I'll write the depth into), I use:

Code:
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, shadowMapSize, shadowMapSize, 0, GL_RGBA, GL_FLOAT, 0);

So I'm asking you guys: is my texture 32-bit floating point per channel or not?
And most important of all: where do I find specific information on how many bits I have on each platform? Which query should I use, and am I querying it correctly?