GLenum

Hi,

Simple question: why does GL use #defines for its enums instead of C enums? I think I knew at one point.

thanks

  • Taylor

Well, this is simple and has almost nothing to do with GL but…

Have you ever tried to do a bitwise or/and/xor… with C/C++ enums? Hard, isn’t it?

I don’t know how hard you think it is, but I think it’s pretty easy to use the operators |, & and ^.

enum
{
	enumGL_COLOR_BUFFER_BIT = GL_COLOR_BUFFER_BIT,
	enumGL_DEPTH_BUFFER_BIT = GL_DEPTH_BUFFER_BIT,
};

...

glClear(enumGL_COLOR_BUFFER_BIT | enumGL_DEPTH_BUFFER_BIT);

But maybe I just misunderstood what you meant, in which case I apologise.

The following code does not compile:

enum A
{
	a = 1,
	b,
	c
};

A val;

val = a | b;   // error in C++: cannot convert 'int' to 'A' without a cast

This is because, I guess, with an enum (at least in C++), a bitwise OR could produce a value that isn’t one of the named enumerators, so it couldn’t be compared against the names later.

If that kind of code works in C, sorry for that point. So maybe GL uses defines for portability (even though in C++ defines should generally be avoided in favour of typedefs, const values, or enums).
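For what it’s worth, here’s a minimal sketch of the difference, reusing the enum A from above. In C the plain assignment compiles because enum types convert freely to and from integers; in C++ it needs an explicit cast:

enum A
{
	a = 1,
	b,
	c
};

int main()
{
	A val;

	// In C, "val = a | b;" compiles as written: the int result of the OR
	// converts back to the enum type implicitly.
	// In C++ there is no implicit int -> A conversion, so it's an error.

	// The usual C++ workaround is an explicit cast:
	val = static_cast<A>(a | b);

	(void)val;
	return 0;
}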

C/C++ 101?

:smiley:

That’s a problem of converting the result to an enum type. The OR operation is perfectly fine. Since values combined with bitwise operations aren’t really enums, but bitmasks, it doesn’t make sense to use an enum as a parameter to, say, glClear. For functions like glEnable you could use enums, but there you never combine several values anyway.

So the problem either occurs where it doesn’t make sense to use enums, or doesn’t occur where it does make sense to use them. In other words, no problem in practice.
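To put that concretely, here is a small sketch of the two cases (assuming a standard <GL/gl.h>; the function name is just for illustration, and the comments restate the declared GL signatures):

#include <GL/gl.h>

void clear_and_enable(void)
{
	// glClear(GLbitfield mask): the argument is a bitmask. The OR'ed
	// result isn't any single named constant, so an enum type would be
	// the wrong fit here anyway.
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

	// glEnable(GLenum cap): a single value, never combined with others.
	// An enum type could work here in principle, but there's nothing to OR.
	glEnable(GL_DEPTH_TEST);
}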