View Full Version : OpenGL spec error for texture units ?

10-26-2011, 10:44 AM
How can you specify texture units greater than GL_TEXTURE31?
MINIMUM 48 for OpenGL 3.x ( http://www.opengl.org/sdk/docs/man3/xhtml/glActiveTexture.xml )
MINIMUM 80 for OpenGL 4.x ( http://www.opengl.org/sdk/docs/man4/xhtml/glActiveTexture.xml )

"The number of texture units is implementation dependent, but must be at least 48.
texture must be one of GL_TEXTUREi, where i ranges from 0 to (GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS - 1)."

But there aren't enough named constants for that, since GL_TEXTURE0 + 32 already equals the value of the GL_ACTIVE_TEXTURE enum...

I assume it works anyway because the driver takes the enum parameter,
treats it as a plain number, and subtracts the GL_TEXTURE0 value to obtain the actual texture unit index?

Dan Bartlett
10-26-2011, 11:27 AM
The specification also stipulates that:

The constants obey TEXTUREi = TEXTURE0+i (i is in the range 0 to k - 1, where k is the value of MAX_COMBINED_TEXTURE_IMAGE_UNITS).

So just use GL_TEXTURE0 + i to choose the active texture unit.

Alfonse Reinheart
10-26-2011, 11:31 AM
Also, note that the man pages are not the OpenGL specification.

10-29-2011, 11:06 AM
And suddenly the 'GLenum' system breaks down; who would have thought.

Alfonse Reinheart
10-29-2011, 02:01 PM
What "GLenum system" are you referring to? This has always been how `glActiveTexture` works.

10-30-2011, 07:09 AM
I'm referring to the insistence on keeping everything in a single namespace (or rather value space), assigning distinct values even to parameters that can't appear in the same contexts, say buffer targets and texture targets. As the OP notices, there are collisions in some places.

A sad consequence is that you can't write a
const char* EnumToString(GLenum);
function, even though you _almost_ can.

Otherwise, just my ramblings about how OpenGL sucks here and there. Yes, I was off-topic, sorry ;)