OpenGL spec error for texture units?

How can you specify texture units greater than GL_TEXTURE31?
In the specification, the minimum value of GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is:
48 for OpenGL 3.x ( http://www.opengl.org/sdk/docs/man3/xhtml/glActiveTexture.xml )
80 for OpenGL 4.x ( http://www.opengl.org/sdk/docs/man4/xhtml/glActiveTexture.xml )

"The number of texture units is implementation dependent, but must be at least 48.
texture must be one of GL_TEXTUREi, where i ranges from 0 (GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS - 1). "

This can't be literally true, as there is no GL_TEXTURE32: GL_TEXTURE0 + 32 is the value of the GL_ACTIVE_TEXTURE enum…

Am I right that it just works because the driver takes the enum parameter
and treats it as a plain number, subtracting the GL_TEXTURE0 value to obtain the actual texture unit index?
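
For illustration, a minimal C sketch of what I mean (assuming a GL header that defines these constants, e.g. GL/gl.h plus glext.h):

#include <stdio.h>
#include <GL/gl.h>

int main(void)
{
    /* Named constants stop at GL_TEXTURE31; the next value in the
       sequence belongs to an unrelated enum. */
    printf("GL_TEXTURE0       = 0x%04X\n", GL_TEXTURE0);       /* 0x84C0 */
    printf("GL_TEXTURE31      = 0x%04X\n", GL_TEXTURE31);      /* 0x84DF */
    printf("GL_ACTIVE_TEXTURE = 0x%04X\n", GL_ACTIVE_TEXTURE); /* 0x84E0 */
    printf("collision: %d\n", GL_TEXTURE0 + 32 == GL_ACTIVE_TEXTURE); /* 1 */
    return 0;
}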

The specification also stipulates that:

"The constants obey TEXTUREi = TEXTURE0 + i, where i is in the range 0 to k - 1 and k is the value of MAX_COMBINED_TEXTURE_IMAGE_UNITS."

So just use GL_TEXTURE0 + i to choose the active texture unit.
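
For example, a sketch that binds a texture to every available unit (here "textures" is a hypothetical array holding maxUnits texture object names):

GLint maxUnits = 0;
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &maxUnits);

for (GLint i = 0; i < maxUnits; ++i) {
    glActiveTexture(GL_TEXTURE0 + i);         /* valid even for i >= 32 */
    glBindTexture(GL_TEXTURE_2D, textures[i]);
}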

Also, note that the man pages are not the OpenGL specification.

And suddenly the ‘GLenum’ system breaks down; who would have thought.

What “GLenum system” are you referring to? This has always been how glActiveTexture works.

I’m referring to the insistence on keeping everything in a single namespace (or rather, value space): assigning distinct values to parameters that can’t appear in the same contexts, say buffer targets and texture targets. As the OP notices, there are collisions in some places.

A sad consequence is that you can’t have a
const char* EnumToString(GLenum);
function, even though you almost can.
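
To make it concrete, a switch-based sketch (values taken from the standard GL headers) trips over the very first duplicated value:

#include <GL/gl.h>

const char* EnumToString(GLenum e)
{
    switch (e) {
    case GL_ZERO:     return "GL_ZERO";     /* 0 */
 /* case GL_NONE:     return "GL_NONE";        won't compile: duplicate case value (also 0) */
 /* case GL_POINTS:   return "GL_POINTS";      won't compile: duplicate case value (also 0) */
    case GL_TEXTURE0: return "GL_TEXTURE0"; /* 0x84C0 */
    default:          return "<unknown GLenum>";
    }
}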

Otherwise, just my ramblings about how OpenGL sucks here and there. Yes, I was off-topic, sorry :wink: