glGetIntegerv(GL_MAJOR_VERSION) returns GL_INVALID_ENUM on core context

What is the correct way to get the OpenGL version on an OpenGL core context?

I am using:


GLint major = 0;
GLint minor = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);
glGetIntegerv(GL_MINOR_VERSION, &minor);

but both calls generate a GL_INVALID_ENUM error.

The glGetIntegerv documentation says that both tokens are allowed. What gives?

This is on Mac OS X 10.9, using an NVIDIA 650M and a core context.

GL_MINOR_VERSION and GL_MAJOR_VERSION are not valid in OpenGL 1.1, and for some reason your program might be calling the 1.1 version of glGet*. A quick workaround would be to use glGetString(GL_VERSION) instead.
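
A minimal sketch of that workaround, assuming a current GL context (the include shown is the OS X Core Profile header, so adjust it for other platforms):

#include <stdio.h>
#include <OpenGL/gl3.h>   /* Core Profile header on OS X */

void print_gl_version(void)
{
    const GLubyte *version = glGetString(GL_VERSION);
    if (version)
        printf("GL_VERSION: %s\n", version);
}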

Stephen, this is a very platform-specific question, so you should post it in the MacOS section.
I can only confirm that GL_MAJOR_VERSION and GL_MINOR_VERSION are both valid in the core profile.
You probably have a problem with an enumeration. I’m not sure how it is defined in your environment on Mac OS.

The values are well known (at least for Win/Linux):


#define GL_MAJOR_VERSION                  0x821B
#define GL_MINOR_VERSION                  0x821C

but try to find out why it is not already defined. As far as I know, Mac OS doesn’t need glext.h/glcorearb.h. By default, Mac OS X 10.9 should create a GL 4.1 core profile context (unless kCGLOGLPVersion_Legacy is used).
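
For example, a quick sanity check (just a sketch; it assumes <stdio.h> and your GL headers are already included) is to read glGetError() right after the query to see whether the token is being rejected at run time:

GLint major = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);   /* token value 0x821B */
GLenum err = glGetError();
if (err == GL_INVALID_ENUM)
    printf("GL_MAJOR_VERSION rejected - probably not a GL 3.0+ context\n");
else
    printf("major = %d\n", major);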

In Windows, drivers always return the highest supported version. Even when created with wglCreateContext(), the context is the highest supported version with the compatibility profile. wglCreateContextAttribsARB() just lets you define the values of the attributes (i.e. choose a context type). Early GL 3.x drivers created a GL 2.1 context on a wglCreateContext() call, but nowadays it is always the highest supported version.
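
For illustration, a typical attribute list looks like this (only a sketch: the tokens come from WGL_ARB_create_context / WGL_ARB_create_context_profile, the function pointer must first be fetched with wglGetProcAddress(), and hdc is a placeholder device context):

const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 3,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    0
};
HGLRC rc = wglCreateContextAttribsARB(hdc, NULL, attribs);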

This is incorrect. By default OSX creates a Legacy 2.1 context, because Core Profile is not backwards compatible. Apps must explicitly request Core Profile to get it. (Note that OSX does not support a Compatibility Profile version higher than 2.1.) Circa Mavericks, the returned Core Profile version will be either 3.3 or 4.1, depending on the driver’s capabilities. (Also note that OSX supports heterogeneous drivers in a single context, and the version (and extensions, limits, etc.) can change when the virtual screen changes.)
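
For reference, requesting Core Profile explicitly through CGL looks roughly like this (a sketch with error checking omitted; kCGLOGLPVersion_3_2_Core should give you a Core Profile context, reported as 3.3 or 4.1 on Mavericks per the above):

#include <OpenGL/OpenGL.h>

CGLPixelFormatAttribute attrs[] = {
    kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
    (CGLPixelFormatAttribute)0
};
CGLPixelFormatObj pix;
GLint npix;
CGLContextObj ctx;
CGLChoosePixelFormat(attrs, &pix, &npix);
CGLCreateContext(pix, NULL, &ctx);
CGLDestroyPixelFormat(pix);
CGLSetCurrentContext(ctx);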

In the OSX SDK you will find:


    OpenGL/OpenGL.h, gltypes.h      // platform
    OpenGL/gl.h, glext.h            // Legacy Profile
    OpenGL/gl3.h, gl3ext.h          // Core Profile

The unfortunately-named gl3.h was shipped in Lion, before the ARB had renamed glcorearb.h. This header includes only Core Profile API (for everything in GL4.1, circa Mavericks) but unlike glcorearb.h does not include extensions.
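
So for Core Profile code on OS X, the includes would be (just a sketch of a typical setup):

#include <OpenGL/gl3.h>      /* Core Profile API, through GL 4.1 circa Mavericks */
/* #include <OpenGL/gl3ext.h>   Core Profile extensions, if needed */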

I suspect your pixel format is just wrong. Verify:

printf("%s %s
", glGetString(GL_RENDERER), glGetString(GL_VERSION));

I can confirm that if you’ve actually created a Core Profile context, the version enums return the proper values.
Alternatively, for desktop GL (not ES) it is fairly safe to derive the version this way:

float version;
sscanf((const char *)glGetString(GL_VERSION), "%f", &version);

…which, unlike GL_MAJOR_VERSION, will work in both Legacy and Core profiles.

ES needs a slightly different approach due to its spec’d string formatting requirements.
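
For ES 2.0 and later, where the version string is specified to begin with "OpenGL ES <major>.<minor>", something like this works (only a sketch; ES 1.x uses a different prefix):

int es_major = 0, es_minor = 0;
sscanf((const char *)glGetString(GL_VERSION), "OpenGL ES %d.%d",
       &es_major, &es_minor);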

Thanks for the clarification! I have never programmed on a Mac. My assumption relied on other people’s claims.

I also thought it was based on GPU capabilities, not the driver’s. OK, it can be thought of that way, since the OS delivers the driver bundle. When I said 4.1, I meant for the GF6xx.

[QUOTE=arekkusu;1257660]The unfortunately-named gl3.h was shipped in Lion, before the ARB had renamed glcorearb.h. This header includes only Core Profile API (for everything in GL4.1, circa Mavericks) but unlike glcorearb.h does not include extensions.[/QUOTE]
Also good to know. Thanks!