OpenGL 3.X glEnable(GL_TEXTURE_2D) GL_INVALID_ENUM

Hi,

I created an OpenGL 3+ context using the core profile and
removed all deprecated functions according to this nice overview.

But glEnable(GL_TEXTURE_2D) proudly presents error code:

GL_INVALID_ENUM (1280 or 0x0500)

std::string e = GetNewLineSeparatedErrorCodes();
if (!e.empty())
{
  // Debugger does NOT stop here so no error yet
  int breakhere = 1;
}

glEnable(GL_TEXTURE_2D);

e = GetNewLineSeparatedErrorCodes();
if (!e.empty())
{
  // Debugger stops here so error in glEnable(GL_TEXTURE_2D)?
  int breakhere = 1;
}

std::string GetNewLineSeparatedErrorCodes()
{
  std::string s;
  GLenum error;

  // Drain the GL error queue; glGetError returns one flag per call until GL_NO_ERROR.
  while ((error = glGetError()) != GL_NO_ERROR)
  {
    s += IntToString(error) + "\n";
  }

  return s;
}

According to the quick reference card, this texture state change is
not deprecated, so it should work?

Help is really appreciated!

In a modern core profile, there is no “fixed-function pipeline” (i.e. built-in shader generator). glEnable(GL_TEXTURE_2D) is a directive telling the fixed-function pipeline’s shader generator that you want it to include code supporting that texture unit.

Delete it. You’re writing the shader, so you decide directly which texture units you’re going to reference. Just call glActiveTexture to select the right texture unit, glBindTexture to bind your texture to it, then glUniform1i to set the texture unit index on the appropriate sampler uniform.
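
Something like this, as a minimal sketch (the texture/program handles and the "uDiffuse" sampler name are placeholders from my own setup, not anything in your code):

// Core profile texture setup: no glEnable(GL_TEXTURE_2D) anywhere.
glUseProgram(shaderProgram);            // your linked GLSL program

// Select texture unit 0 and bind the texture object to its 2D target.
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texture);  // texture created earlier with glGenTextures/glTexImage2D

// Point the fragment shader's sampler2D uniform at unit 0.
GLint loc = glGetUniformLocation(shaderProgram, "uDiffuse");
glUniform1i(loc, 0);

In the fragment shader the matching uniform would just be declared as "uniform sampler2D uDiffuse;" and sampled with texture(uDiffuse, uv).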

Thanks a bunch!

I also found my mistake when reading the spec.
I only paid attention to the blue text, but not to the
normal text that starts with a blue section number…

[n.n.n] and [Table n.n] refer to sections and tables in the OpenGL 3.2 compatibility
profile specification, and are shown only when they differ from the core profile.

Texture Application [3.9.19]
Enable/Disable(param)
param: TEXTURE_1D, TEXTURE_2D, TEXTURE_3D, TEXTURE_CUBE_MAP
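
For completeness, a quick way to confirm at runtime which profile the context actually exposes (a minimal sketch, assuming a GL 3.2+ context and headers that declare these enums):

GLint mask = 0;
glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &mask);

if (mask & GL_CONTEXT_CORE_PROFILE_BIT)
{
  // Core profile: the fixed-function texture enables are gone,
  // so glEnable(GL_TEXTURE_2D) raises GL_INVALID_ENUM.
}
else if (mask & GL_CONTEXT_COMPATIBILITY_PROFILE_BIT)
{
  // Compatibility profile: glEnable(GL_TEXTURE_2D) still works.
}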