
Thread: GLEW and the core profile

  1. #1
    Intern Contributor
    Join Date
    Mar 2014
    Posts
    65

    GLEW and the core profile

    So today I was finally ready to switch over my application to a core profile - and I got a crash.

    Upon investigation I noticed that GLEW doesn't seem to retrieve any functions for vertex array objects.

    The crashes happened because both glGenVertexArrays and glBindVertexArray are NULL.

    Unfortunately GLEW's source code is quite messy and I couldn't fully comprehend what it does. All I noticed is that it still uses glGetString(GL_EXTENSIONS) to retrieve the extension string - which of course has been deprecated for the core profile. So my question now is: Is GLEW even capable of running on a core profile context or not? Do I have to switch to another GL loader library to make it work?

    This happened on a GeForce 550 Ti with the latest drivers, while requesting a GL 3.3 core profile context, with GLEW version 1.10.0.

  2. #2
    Intern Contributor
    Join Date
    Mar 2014
    Posts
    65
    Nevermind, I already found something myself.

    Still, are those GLEW guys complete idiots or what, still insisting on making their library depend on a deprecated feature? I don't get it.

  3. #3
    Intern Contributor
    Join Date
    Oct 2011
    Posts
    73
    I'm sorry, glGetString is deprecated? Since when?

    http://www.opengl.org/wiki/GLAPI/glGetString

    Core in version 4.4
    Core since version 1.0
    I don't use GLEW anymore for various reasons, the main one being that I switched to glLoadGen because of the lovely gl:: scoped headers it can create. I simply can't get enough of that. However, using glGetString is neither deprecated nor wrong, and the problem you are having is, as far as I can tell, a problem with the recent nVidia drivers. I have not spent much time trying to diagnose it, but I can tell you that this (or at least something similar) happens to me only when running a debug build of my current application on nVidia hardware (compiled with full optimizations it does not crash), and it can be "solved" by deactivating "Threaded Optimization" in the nVidia Control Panel.

    I'm not claiming that it necessarily is a bug in the drivers, but it's certainly something that has surfaced in the more recent releases. I still have to sit down and analyze my code to see if I find anything strange, but I've isolated it enough that I can make it crash by simply creating a core profile and calling glGetString right after (that is, skipping glLoadGen's initialization completely).

    You may want to try out the workaround I mentioned and see if that fixes your problem.

    My money is on a driver bug, but for now I'm giving them the benefit of the doubt till I have time to address this.

    EDIT: Disregard what I said about glGetString crashing. It's wglGetProcAddress that causes the crash, as I explain in the follow-up.

    Quote Originally Posted by Nikki_k View Post
    Nevermind, I already found something myself.
    So what did you find? It would be nice to share your findings, in case someone down the line searches for information on this and finds your post.
    Last edited by Ed Daenar; 07-16-2014 at 06:08 AM.

  4. #4
    Junior Member Regular Contributor
    Join Date
    Dec 2009
    Posts
    236
    Quote Originally Posted by Ed Daenar View Post
    glGetString is deprecated? Since when?
    It's not just deprecated; glGetString(GL_EXTENSIONS) is not available at all in core profiles. You have to use glGetStringi() instead.

  5. #5
    Senior Member OpenGL Pro
    Join Date
    Jan 2007
    Posts
    1,267
    Quote Originally Posted by Ed Daenar View Post
    I'm sorry, glGetString is deprecated? Since when?

    http://www.opengl.org/wiki/GLAPI/glGetString
    Use of GL_EXTENSIONS as a parameter to glGetString is deprecated. In other words:


    • glGetString is not deprecated...
    • ...but GL_EXTENSIONS is no longer a valid parameter.


    Please see the documentation at https://www.opengl.org/sdk/docs/man3...lGetString.xml -

    Specifies a symbolic constant, one of GL_VENDOR, GL_RENDERER, GL_VERSION, or GL_SHADING_LANGUAGE_VERSION. Additionally, glGetStringi accepts the GL_EXTENSIONS token.

  6. #6
    Intern Contributor
    Join Date
    May 2013
    Posts
    94
    GLEW is designed around some very wrong assumptions, and it doesn't look like that will be fixed anytime soon.

    The problem you encountered is that GLEW only looks at the extension string. But in core profiles some features are always present, and therefore OpenGL does not report any extension for them. GLEW then won't even try to get the function pointers...

    Also, GLEW checks for the existence of all function pointers of an extension and tells you the extension does not exist if even a single function pointer is NULL.
    This, for example, causes it to always report that GL_EXT_direct_state_access is missing.

    For my own use I just changed the script that creates the .c/.h files and removed some of these nonsense "checks" that break stuff.
    The relevant file is: glew-1.10.0/auto/bin/make_list.pl
    Here is my dirty fix (v 1.10.0): http://pastebin.com/3F7kLEnQ

    To build GLEW on Linux, go into the glew-1.10.0 directory and run:
    cd auto
    make clean;make
    cd ..
    make clean;make

  7. #7
    Intern Contributor
    Join Date
    Oct 2011
    Posts
    73
    Quote Originally Posted by mhagain View Post
    Use of GL_EXTENSIONS as a parameter to glGetString is deprecated. In other words:


    • glGetString is not deprecated...
    • ...but GL_EXTENSIONS is no longer a valid parameter.


    Please see the documentation at https://www.opengl.org/sdk/docs/man3...lGetString.xml -
    Ah, indeed, even the wiki page http://www.opengl.org/wiki/GLAPI/glGetString says this about the token:

    GL_EXTENSIONS
    For glGetStringi only: returns the extension string supported by the implementation at index. The index is in the range [0, glGetIntegerv(GL_NUM_EXTENSIONS) - 1].
    However, since we are also talking about nVidia drivers here, I have to point out something that I had wrong in my initial post: the nVidia drivers crash when calling wglGetProcAddress() after a core profile context is created under the circumstances I outlined, not glGetString(). So the OP's problem may not be the same as what I'm encountering.

  8. #8
    Intern Contributor
    Join Date
    Mar 2014
    Posts
    65
    Quote Originally Posted by Ed Daenar View Post
    So what did you find? It would be nice to share your findings, shall someone down the line tries to search for information on this and finds your post.

    Putting

    glewExperimental = GL_TRUE;


    before the call to glewInit will make it ignore the extension string and try to initialize every entry point it knows about. The entire library is a cruel joke that operates under completely obsolete assumptions which JUST happen to be right for compatibility profiles, but only if all extensions get reported. GLEW doesn't classify functions by version, just by extension, it seems.

    I would have dumped it for another loader library, too, if the project weren't so utterly dependent on it; changing this would amount to a multi-day undertaking.
