GL_PROXY_TEXTURE_2D ignores internal format?

I have a problem trying to determine the maximum (usable) texture size on an NVidia 8800 GTS (512 MB), where GL_MAX_TEXTURE_SIZE is 8192.

As far as I know, the recommendation is to use (for instance) glTexImage2D with a target of GL_PROXY_TEXTURE_2D. This works, except that the internal format does not seem to be taken into consideration?!

The following generates an OUT_OF_MEMORY:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 8192, 8192, 0, GL_RGB, GL_UNSIGNED_BYTE, imageData); // typo corrected: imageData, not NULL

But there’s NO objection when probing with GL_PROXY_TEXTURE_2D:

glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGB8, 8192, 8192, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width); // non-zero width means the proxy accepted it

This surprises me, as I understand the specification to check both resolution AND format (http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml).

With 512 MB, something like the following should fail as I understand the spec (but it reports no problem according to glGetTexLevelParameteriv):

glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA32F_ARB, 8192, 8192, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
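
For completeness, this is roughly the probe pattern in full (a minimal sketch; the helper name proxyFits is mine, and it assumes a current GL context):

#include <GL/gl.h>

/* Ask the proxy whether a 2D texture of the given size and internal
 * format would be accepted. Per the man page, the driver is supposed
 * to report 0 for GL_TEXTURE_WIDTH when the combination is unsupported. */
static int proxyFits(GLenum internalFormat, GLsizei w, GLsizei h)
{
    GLint width = 0;
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, internalFormat, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &width);
    return width != 0;
}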

Any comments on this? Have I misunderstood something?

A 32-bit-per-pixel 8192x8192 texture “only” takes up 256 MB. GL_PROXY_TEXTURE_2D only takes into account whether or not the texture could theoretically fit given the specified dimensions and internal format.
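
For the raw arithmetic (assuming tight packing and no mip levels):

8192 * 8192 * 4 bytes (RGBA8) = 268,435,456 bytes = 256 MB
8192 * 8192 * 16 bytes (RGBA32F) = 1,073,741,824 bytes = 1 GB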


Of course, it could also just be that in your non-proxy call to glTexImage2D() you’re passing NULL as the pointer to the texel data, and the driver may be checking for this and refusing.

Yes well, wouldn’t it be 8192 * 8192 * 4 (4 components, RGBA) * 4 (32 bits = 4 bytes) = 1 GB?


Yes, but the texture I am trying to allocate memory for is the only texture in the context (and probably also on the whole machine).

I am sorry, that line was a copy-paste typo. The pointer I supply is in fact not a null pointer and points to valid image data.

Trying to allocate memory for the 8192x8192 texture (first code above) with an internal format of RGB8 should not really generate an OUT_OF_MEMORY. It would consume 192 MB (64 MB x 3 for RGB) without mipmaps and be well within the 512 MB limit?! I feel there’s something fishy going on here…
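
Spelled out (tightly packed, no mipmaps):

8192 * 8192 * 3 bytes (RGB8) = 201,326,592 bytes = 192 MB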

Yes well, wouldn’t it be 8192 * 8192 * 4 (4 components, RGBA) * 4 (32 bits = 4 bytes) = 1 GB?

??? Calculate that again.

It is ok to pass NULL. This tells GL that you just want to allocate space.
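
For example (storage allocation only, no upload; a sketch):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 8192, 8192, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL); // allocates storage, uploads nothing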

GL_RGB is an oddball format since it takes 24 bits per pixel. This is difficult to optimize for. All GPUs convert it to RGBA.

Anyway, so there is enough space. If you are asking for mipmaps, then they take up even more space: about 340 MB.
It might be too much for the memory manager.
Try 8192 x 2048
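
The mip-chain figure comes from a geometric series (assuming the driver pads RGB8 to RGBA8, so 256 MB at the base level):

256 MB * (1 + 1/4 + 1/16 + ...) ≈ 256 MB * 4/3 ≈ 341 MB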

??? Calculate that again.

I wasn’t clear enough, but I was thinking of the texture in GL_RGBA32F_ARB format. That takes 1 GB, and therefore I would expect it to fail when probing with GL_PROXY_TEXTURE_2D. However, it doesn’t report a problem, and I feel it strongly reduces the usefulness of the proxy mechanism if the texture format is not taken into consideration.

It might be too much for the memory manager.
Try 8192 x 2048

8192 x 2048 (GL_RGB8) works as expected. But 8192 x 2048 (GL_RGB16) generates OUT_OF_MEMORY, even though GL_PROXY_TEXTURE_2D does not report any problem. It seems that the proxy mechanism is useless for checking whether a texture can be allocated, since it doesn’t take the internal format into consideration. I feel it’s a bug, at least in the way I read the specification.
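
For reference, the raw numbers for that failing case (tightly packed; the driver may pad GL_RGB16 to RGBA16):

8192 * 2048 * 6 bytes (RGB16) = 100,663,296 bytes = 96 MB
8192 * 2048 * 8 bytes (RGBA16, if padded) = 134,217,728 bytes = 128 MB

Either way it is far below the 512 MB on the card.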

Well, most. At least one embedded GPU can sample from packed 24-bit RGB.

Looks like a bug, but can’t you just forget about proxy textures? I’m not sure why this feature was added to GL.

I suspect they just didn’t implement this feature properly… which isn’t a big loss anyway, IMHO.

It is certainly possible to get along without the feature. It’s just not knowing whether the mechanism is reliable or not that bothered me. It seems that GL_PROXY_TEXTURE_2D is not accurate with current NVidia drivers, and I will just have to keep that in mind.
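
What I will probably do instead is attempt the real allocation and check glGetError() (a sketch; even this is not a hard guarantee, since drivers may defer the actual allocation):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
while (glGetError() != GL_NO_ERROR) ;   /* clear any stale errors first */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, 8192, 2048, 0,
             GL_RGB, GL_UNSIGNED_SHORT, NULL);
if (glGetError() == GL_OUT_OF_MEMORY) {
    /* fall back to a smaller size or a cheaper internal format */
}
glDeleteTextures(1, &tex);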

Thanks to everyone who chipped in on the subject.