I have a problem trying to determine the maximum (usable) texture size on an NVidia 8800 GTS (512 MB), where GL_MAX_TEXTURE_SIZE reports 8192.
As far as I know, the recommendation is to probe with glTexImage2D using a target of GL_PROXY_TEXTURE_2D. This works, except that the internal format does not seem to be taken into consideration?!
The following generates a GL_OUT_OF_MEMORY error:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 8192, 8192, 0, GL_RGB, GL_UNSIGNED_BYTE, imageData) // typo corrected
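For reference, I detect this with a plain glGetError check right after the call (a minimal sketch of my error handling):

GLenum err = glGetError();
if (err == GL_OUT_OF_MEMORY) {
    /* this is the branch I end up in on the 8800 GTS */
}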
But there is NO objection when probing with GL_PROXY_TEXTURE_2D:
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGB8, 8192, 8192, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL)
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width)
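Put together, the complete probe looks like this (a minimal sketch of what I'm doing; width is simply my result variable, and per the spec a failed proxy allocation should leave the queried width at 0):

GLint width = 0;
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGB8, 8192, 8192, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
if (width == 0) {
    /* driver rejected the proxy allocation */
} else {
    /* driver claims 8192x8192 GL_RGB8 would fit -- which is what I see */
}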
This surprises me, as I understand the specification (http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml) to say that the proxy mechanism checks both the resolution AND the format.
With only 512 MB of video memory, something like the following should fail as I read the spec, yet glGetTexLevelParameteriv reports no problem:
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA32F_ARB, 8192, 8192, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL)
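(An 8192 x 8192 GL_RGBA32F texture needs 8192 * 8192 * 16 bytes = 1 GiB, i.e. twice the card's 512 MB, so I would expect the proxy to reject it.)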
Any comments on this? Have I misunderstood something?