NVidia: 4096^2 FBO not working on GF7950GTX

I’m creating a GL_RGBA16F texture, 4096x4096, mipmapped.

After that, for each mipmap level, I’m creating an FBO and attaching that level to it.
This works for all levels down to level 1.
However, for level 0 (4096x4096), glCheckFramebufferStatus suddenly returns GL_FRAMEBUFFER_UNSUPPORTED.
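
For reference, a minimal sketch of the setup (not the exact code; the loader is assumed to be GLEW, and on GL 2.x era drivers these calls would actually carry the EXT suffix from EXT_framebuffer_object):

```c
#include <GL/glew.h>   /* assumed loader */
#include <stdio.h>

/* Sketch: allocate a mipmapped 4096x4096 GL_RGBA16F texture, then attach
 * each mip level to its own FBO and check completeness. */
void test_levels(void)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

    /* Allocate the full mip chain: 4096, 2048, ..., 1. */
    for (int level = 0, s = 4096; s >= 1; ++level, s /= 2)
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGBA16F, s, s, 0,
                     GL_RGBA, GL_HALF_FLOAT, NULL);

    /* One FBO per mip level, with that level as the only color attachment. */
    for (int level = 0, s = 4096; s >= 1; ++level, s /= 2) {
        GLuint fbo;
        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, tex, level);
        /* Level 0 reports GL_FRAMEBUFFER_UNSUPPORTED here; levels >= 1
         * report GL_FRAMEBUFFER_COMPLETE. */
        printf("level %d (%dx%d): 0x%04X\n", level, s, s,
               glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}
```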

GL_MAX_TEXTURE_SIZE is 4096
GL_MAX_RENDERBUFFER_SIZE is 4096
GL_MAX_VIEWPORT_DIMS is 4096x4096
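
(For completeness, these were queried the usual way; a sketch:)

```c
GLint maxTex = 0, maxRB = 0, maxVP[2] = {0, 0};
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTex);
glGetIntegerv(GL_MAX_RENDERBUFFER_SIZE, &maxRB);  /* EXT suffix pre-GL 3.0 */
glGetIntegerv(GL_MAX_VIEWPORT_DIMS, maxVP);       /* returns two values */
printf("tex %d, renderbuffer %d, viewport %dx%d\n",
       maxTex, maxRB, maxVP[0], maxVP[1]);
```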

So, I’m assuming a bug in the driver.
This was tested with Forceware 258.96 on a WinXP64 machine. The gfx card is a GF7950GTX (512MB).

Just curious if a GL_RGBA8 texture works. Perhaps it’s the size of the buffer, in bytes, that’s causing the unsupported error (128MB).

4096x4096 at GL_RGBA16F = 8 * 16MB = 256MB, which is half the total VRAM you have. It might be that there just isn’t 256MB left… though I’d think the creation of the texture would have failed in that case. Did you check for GL errors after each glTexImage2D call?
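
Something like this after each level allocation, I mean (a sketch; glGetError returns one queued error per call, so drain it in a loop):

```c
#include <GL/glew.h>
#include <stdio.h>

/* Allocate one RGBA16F level and drain the GL error queue afterwards;
 * GL_OUT_OF_MEMORY (0x0505) would show up here if the allocation failed. */
static void alloc_level_checked(int level, int size)
{
    glTexImage2D(GL_TEXTURE_2D, level, GL_RGBA16F, size, size, 0,
                 GL_RGBA, GL_HALF_FLOAT, NULL);
    for (GLenum err; (err = glGetError()) != GL_NO_ERROR; )
        fprintf(stderr, "glTexImage2D level %d: GL error 0x%04X\n", level, err);
}
```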

4096 * 4096 * 8 is 128MB, not 256.

Oops, indeed it is :smiley: which makes an out-of-memory condition as the cause not terribly likely.

Update:
GL_RGBA8 works with 4096x4096 right away
GL_RGBA16F works up to (and including) 3872x3872; any size above will result in GL_FRAMEBUFFER_UNSUPPORTED

Disclaimer: by “works” I mean it will create the texture and all its mipmap levels, and attaching level 0 to an FBO results in GL_FRAMEBUFFER_COMPLETE.
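
The cutoffs above were found by probing sizes, roughly like this sketch (not the actual test code; the helper names are made up):

```c
#include <GL/glew.h>
#include <stdio.h>

/* Does a mipmapped size x size texture of the given internal format
 * yield a complete FBO when level 0 is attached? */
static int fbo_complete(GLenum internalFormat, int size)
{
    GLuint tex, fbo;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    for (int level = 0, s = size; s >= 1; ++level, s /= 2)
        glTexImage2D(GL_TEXTURE_2D, level, internalFormat, s, s, 0,
                     GL_RGBA, GL_FLOAT, NULL);   /* data is NULL, so the type is moot */
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    int ok = glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE;
    glDeleteFramebuffers(1, &fbo);
    glDeleteTextures(1, &tex);
    return ok;
}

/* Binary search between a known-good and a known-bad size. */
static int largest_complete(GLenum internalFormat)
{
    int lo = 1, hi = 4096;   /* lo works, hi fails (for the float formats) */
    while (lo + 1 < hi) {
        int mid = (lo + hi) / 2;
        if (fbo_complete(internalFormat, mid)) lo = mid; else hi = mid;
    }
    return lo;               /* 3872 for GL_RGBA16F here */
}
```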

Out of morbid curiosity, what of GL_RGBA32F?

GL_RGBA32F works up to (and including) 2748x2748.

3872x3872 * 8 = 119,939,072 bytes
2748x2748 * 16 = 120,824,064 bytes
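
(Double-checking the arithmetic with a couple of lines of C; bytes per texel: GL_RGBA8 = 4, GL_RGBA16F = 8, GL_RGBA32F = 16.)

```c
#include <stdio.h>

int main(void)
{
    printf("4096*4096*8  = %lld bytes\n", 4096LL * 4096 * 8);   /* 134,217,728 */
    printf("3872*3872*8  = %lld bytes\n", 3872LL * 3872 * 8);   /* 119,939,072 */
    printf("2748*2748*16 = %lld bytes\n", 2748LL * 2748 * 16);  /* 120,824,064 */
    return 0;
}
```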

Looks like the “magic” barrier is about 120MB. WTF?!

Are you surprised that hardware 6 generations old has some arbitrary limitations on how many bytes a framebuffer can take up? There is a reason why GL_FRAMEBUFFER_UNSUPPORTED exists: to allow drivers to regurgitate it for cases like this, where the hardware has some limitation that is not easily specified.

Yes, this error was unexpected, because everything else seemed to fit at first glance. I am aware that this might just be a hardware limitation, but it could be a bug as well. That’s why I posted it here: to get some confirmation from the people who should know.

On the age of the gfx card: it’s still a pretty good card for its age and gets regular driver updates. I always try to get our stuff running on the lowest gfx card generation possible. It’s often not possible to tell our customers: just go buy a new card!
On ATI the situation is worse: sometimes bugs remain in the driver forever, because they have already stopped developing drivers for hardware that is still in use by our customers.
