I get GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT as the FBO error, even though the textures all have equal width/height (512).
When changing the depth texture to DEPTH_COMPONENT16, it worked and no error was thrown.
Is this my fault, or the drivers'?
ATI 9600 mobile, which doesn't support DEPTH24 textures, but it didn't complain at texture creation, and I assumed the driver would simply pick the next best format to use.
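For context, the setup roughly looks like this (a simplified sketch, not my exact code; GLEW is assumed for the EXT_framebuffer_object entry points, and the helper names are just for illustration):

/* Sketch of the failing setup: a 512x512 RGBA8 color texture plus a
 * depth texture, falling back from DEPTH_COMPONENT24 to
 * DEPTH_COMPONENT16 when the FBO does not validate. */
#include <GL/glew.h>
#include <stdio.h>

static GLuint makeDepthTexture(GLenum internalFormat, GLsizei size)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* Note: glTexImage2D does not fail for an unsupported depth
     * precision; the driver may silently substitute another one. */
    glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, size, size, 0,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
    return tex;
}

GLuint createFbo(GLuint colorTex, GLsizei size)
{
    GLuint fbo, depthTex;
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, colorTex, 0);

    depthTex = makeDepthTexture(GL_DEPTH_COMPONENT24, size);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                              GL_TEXTURE_2D, depthTex, 0);

    /* This is where INCOMPLETE_DIMENSIONS shows up despite both
     * attachments being 512x512. */
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
        != GL_FRAMEBUFFER_COMPLETE_EXT) {
        /* Fall back to 16-bit depth, which works on this hardware. */
        glDeleteTextures(1, &depthTex);
        depthTex = makeDepthTexture(GL_DEPTH_COMPONENT16, size);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                  GL_TEXTURE_2D, depthTex, 0);
        if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
            != GL_FRAMEBUFFER_COMPLETE_EXT)
            fprintf(stderr, "FBO still incomplete\n");
    }
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    return fbo;
}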
Do you use a renderbuffer for the depth, or a texture?
My code starts by trying to create an RGBA8 + DEPTH=32, STENCIL=0 combination, and it succeeds.
With older drivers, I think neither 32-bit nor 24-bit depth worked, only 16.
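A rough sketch of that approach, using a depth renderbuffer and walking down the precision list until the FBO validates (the EXT entry points are assumed, and the candidate list and helper name are illustrative):

/* Pick the highest depth-renderbuffer precision that yields a
 * complete FBO. Call with the target FBO already bound and the
 * color attachment in place. */
#include <GL/glew.h>

static const GLenum depthFormats[] = {
    GL_DEPTH_COMPONENT32, GL_DEPTH_COMPONENT24, GL_DEPTH_COMPONENT16
};

GLenum attachBestDepth(GLsizei width, GLsizei height)
{
    GLuint rb;
    size_t i;
    glGenRenderbuffersEXT(1, &rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rb);
    for (i = 0; i < sizeof(depthFormats) / sizeof(depthFormats[0]); ++i) {
        /* Re-specify the renderbuffer storage with the next format
         * and re-check completeness. */
        glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, depthFormats[i],
                                 width, height);
        glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT,
                                     GL_DEPTH_ATTACHMENT_EXT,
                                     GL_RENDERBUFFER_EXT, rb);
        if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
            == GL_FRAMEBUFFER_COMPLETE_EXT)
            return depthFormats[i];   /* highest precision that works */
    }
    return 0;   /* nothing validated */
}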
Yes, those work as well if the hardware allows it; I just wondered why it would return INCOMPLETE_DIMENSIONS when the sizes matched. I suspect this is a driver bug, as I haven't gotten it on GeForce 8 & 6 hardware. But I wanted to make sure I haven't "missed" something about INCOMPLETE_DIMENSIONS.
"I just wondered why it would return INCOMPLETE_DIMENSIONS when the sizes matched" sounds like they simply used an error value that isn't really descriptive of the problem.
My first reaction is that UNSUPPORTED would have been a better error value to use (to try to communicate that the depth-buffer request couldn't be satisfied), but then OpenGL hasn't exactly got a history of descriptive error reporting. It didn't matter that much in the early 1.x days, but with FBO it became quite clear that the error reporting mechanism is … displaying its age.
Which reminds me, ARB: are there new (and more precise) errors defined for 3.x?