
Thread: glGetIntegerv(GL_MAJOR_VERSION) returns GL_INVALID_ENUM on core context

  1. #1 Stephen A (Member, Regular Contributor)
    glGetIntegerv(GL_MAJOR_VERSION) returns GL_INVALID_ENUM on core context

    What is the correct way to get the OpenGL version on an OpenGL core context?

    I am using:
    Code :
    int major = 0;
    int minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);

    but both calls generate a GL_INVALID_ENUM error.

    The glGetIntegerv documentation says that both tokens are allowed. What gives?

    This is on Mac OS X 10.9, using a Nvidia 650M and a core context.
    [The Open Toolkit library: C# OpenGL 4.4, OpenGL ES 3.1, OpenAL 1.1 for Mono/.Net]

  2. #2 Intern Newbie
    GL_MINOR_VERSION and GL_MAJOR_VERSION are not valid in OpenGL 1.1, and for some reason your program might be calling the 1.1 version of glGet*. A quick workaround would be to use glGetString(GL_VERSION) instead.
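
    A minimal sketch of that workaround, assuming a current GL context and a C toolchain (the function name print_gl_version is just for illustration):
    Code :
    #include <stdio.h>
    #include <OpenGL/gl3.h>   /* Core Profile header on OS X; use your platform's GL header elsewhere */
 
    /* Requires a current context; glGetString() returns NULL without one. */
    void print_gl_version(void)
    {
        const GLubyte *version = glGetString(GL_VERSION);
        printf("GL_VERSION: %s\n", version ? (const char *)version : "(no current context)");
    }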

  3. #3 Aleksandar (Senior Member, OpenGL Pro)
    Stephen, this is a very platform-specific question, so you should post it in the Mac OS X section.
    I can only confirm that GL_MAJOR_VERSION and GL_MINOR_VERSION are both valid in the core profile.
    You probably have a problem with the enumeration values; I'm not sure how they are defined in your environment on Mac OS.

    The values are well known (at least for Win/Linux):
    Code :
    #define GL_MAJOR_VERSION                  0x821B
    #define GL_MINOR_VERSION                  0x821C
    but try to find out why they are not already defined. As far as I know, Mac OS doesn't need glext.h/glcorearb.h. By default, Mac OS X 10.9 should create a GL 4.1 core profile context (unless kCGLOGLPVersion_Legacy is used).
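
    If the tokens really are missing from your headers, a minimal fallback sketch (using the registry values above; the query itself is accepted only on GL 3.0+ contexts) would be:
    Code :
    #ifndef GL_MAJOR_VERSION
    #define GL_MAJOR_VERSION                  0x821B
    #endif
    #ifndef GL_MINOR_VERSION
    #define GL_MINOR_VERSION                  0x821C
    #endif
 
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);   /* accepted only by GL 3.0+ contexts */
    glGetIntegerv(GL_MINOR_VERSION, &minor);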

    On Windows, drivers always return the highest supported version. Even if the context is created with wglCreateContext(), it is the highest supported version with the compatibility profile. wglCreateContextAttribsARB() just lets you specify attribute values (choosing a context type). Early GL 3.x drivers created a GL 2.1 context from a wglCreateContext() call, but nowadays it is always the highest supported version.
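
    For reference, a sketch of explicitly requesting a version/profile on Windows, assuming wglCreateContextAttribsARB has already been loaded via wglGetProcAddress and hdc is your device context:
    Code :
    /* tokens from WGL_ARB_create_context and WGL_ARB_create_context_profile */
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 2,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0                                        /* attribute list terminator */
    };
    HGLRC core = wglCreateContextAttribsARB(hdc, NULL, attribs);   /* NULL = no share context */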

  4. #4 arekkusu (Advanced Member, Frequent Contributor)
    Quote Originally Posted by Aleksandar
    By default MacOS X 10.9 should create GL 4.1 core profile
    This is incorrect. By default OSX creates a Legacy 2.1 context, because Core Profile is not backwards compatible. Apps must explicitly request Core Profile to get it. (Note that OSX does not support a Compatibility Profile version higher than 2.1.) Circa Mavericks, the returned Core Profile version will either be 3.3 or 4.1, depending on the driver's capabilities. (Also note that OSX supports heterogeneous drivers in a single context, and the version (and extensions, limits, etc.) can change when the virtual screen changes.)
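
    For example, a minimal CGL sketch of explicitly requesting Core Profile (the NSOpenGLPixelFormat route uses the equivalent NSOpenGLPFAOpenGLProfile attribute):
    Code :
    /* needs <OpenGL/OpenGL.h> */
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    CGLChoosePixelFormat(attribs, &pix, &npix);
 
    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);     /* gives the highest Core version the driver offers */
    CGLDestroyPixelFormat(pix);
    CGLSetCurrentContext(ctx);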

    Quote Originally Posted by Aleksandar
    As far as I know Mac OS doesn't need glext.h/glcorearb.h

    In the OSX SDK you will find:
    Code :
        OpenGL/OpenGL.h, gltypes.h      // platform
        OpenGL/gl.h, glext.h            // Legacy Profile
        OpenGL/gl3.h, gl3ext.h          // Core Profile

    The unfortunately-named gl3.h was shipped in Lion, before the ARB had renamed glcorearb.h. This header includes only Core Profile API (for everything in GL4.1, circa Mavericks) but unlike glcorearb.h does not include extensions.
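
    So, as a sketch, a Core Profile translation unit on OS X would pull in:
    Code :
    #include <OpenGL/OpenGL.h>   /* CGL platform API */
    #include <OpenGL/gl3.h>      /* Core Profile API, including GL_MAJOR_VERSION / GL_MINOR_VERSION */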

    Quote Originally Posted by Stephen A
    This is on Mac OS X 10.9, using a Nvidia 650M and a core context.
    I suspect your pixel format is just wrong. Verify:
    Code :
    printf("%s %s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));

    I can confirm that if you've actually created a Core Profile context, the version enums return the proper values.
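
    As a quick sanity check (sketch; assumes <assert.h> and <stdio.h>):
    Code :
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    assert(glGetError() == GL_NO_ERROR);   /* trips on a Legacy 2.1 context, where these enums are not accepted */
    printf("Core Profile context version: %d.%d\n", major, minor);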
    Alternatively, for desktop GL (not ES) it is fairly safe to derive the version this way:
    Code :
    float version;
    sscanf((const char *)glGetString(GL_VERSION), "%f", &version);
    ...which, unlike GL_MAJOR_VERSION, will work in both Legacy and Core profiles.

    ES needs a slightly different approach due to its spec'd string formatting requirements.
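
    For ES 2.0 and later, where the spec requires the version string to begin with "OpenGL ES", a sketch would be:
    Code :
    int es_major = 0, es_minor = 0;
    /* ES 2.0+ GL_VERSION strings look like "OpenGL ES N.M vendor-specific"; ES 1.x uses "OpenGL ES-CM"/"ES-CL" instead */
    sscanf((const char *)glGetString(GL_VERSION), "OpenGL ES %d.%d", &es_major, &es_minor);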

  5. #5 Aleksandar (Senior Member, OpenGL Pro)
    Quote Originally Posted by arekkusu
    This is incorrect. By default OSX creates a Legacy 2.1 context.
    Thanks for the clarification! I have never programmed on a Mac; my assumption relied on other people's claims.

    Quote Originally Posted by arekkusu
    Circa Mavericks, the returned Core Profile version will either be 3.3 or 4.1, depending on the driver's capabilities.
    I also thought it was based on GPU capabilities, not driver ones. OK, it can be thought of that way, since the OS delivers the driver bundle. When I said 4.1, I meant for GF6xx.

    Quote Originally Posted by arekkusu
    The unfortunately-named gl3.h was shipped in Lion, before the ARB had renamed glcorearb.h. This header includes only Core Profile API (for everything in GL4.1, circa Mavericks) but unlike glcorearb.h does not include extensions.
    Also good to know. Thanks!
