View Full Version : Using GLEW and Core Profile without glError

03-01-2016, 07:21 AM

I'm calling glewInit() with an OpenGL context in core profile (3.3 or 3.1), and after this call glGetError() returns error 1280.
I tried with glewExperimental = GL_TRUE; but the result is the same.

How can I use GLEW with a core profile context?


03-01-2016, 07:29 AM
Don't use glGetError to check the result of glewInit; GLEW is just a convenience wrapper for loading OpenGL function pointers and has nothing to do with actual OpenGL execution.

Read the GLEW documentation for the correct way to test initializing GLEW: http://glew.sourceforge.net/basic.html - check that glewInit returns GLEW_OK.
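The check the documentation describes looks roughly like this (a minimal sketch; the function name initGlew and the surrounding setup are made up for illustration, and a valid OpenGL context must already be current):

```cpp
#include <GL/glew.h>   // must be included before other GL headers
#include <iostream>

bool initGlew()
{
    // glewInit() must be called after an OpenGL context has been created
    // and made current; its return value is a GLEW status, not a GL error.
    GLenum status = glewInit();
    if (status != GLEW_OK) {
        // glewGetErrorString() describes the GLEW (not OpenGL) error code.
        std::cerr << "glewInit failed: "
                  << glewGetErrorString(status) << std::endl;
        return false;
    }
    return true;
}
```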

03-01-2016, 07:37 AM
In my application, glewInit returns GLEW_OK.
But the initialization of GLEW creates an OpenGL error (1280).
I read that "GLEW has a problem with core contexts" here (https://www.opengl.org/wiki/OpenGL_Loading_Library).
So how do I use GLEW without an OpenGL error?

For the moment my app works, but it isn't clean in my opinion.

carsten neumann
03-01-2016, 08:46 AM
Last time I checked, GLEW was using glGetString(GL_EXTENSIONS) to obtain the supported extensions. That is not supported on core profile contexts (and causes an OpenGL error); there you must use glGetStringi(GL_EXTENSIONS, i) to obtain the supported extensions individually. You can set glewExperimental = GL_TRUE before calling glewInit() and do a glGetError() call afterwards to swallow the error caused by glewInit() - that works for me. As mhagain says, use the return value of glewInit to check whether it succeeded.
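The workaround described above can be sketched like this (assuming a core profile context is already current; initGlewCore is a hypothetical helper name):

```cpp
#include <GL/glew.h>

bool initGlewCore()
{
    glewExperimental = GL_TRUE;   // ask GLEW to load everything it can
    if (glewInit() != GLEW_OK)
        return false;

    // glewInit() calls glGetString(GL_EXTENSIONS), which raises
    // GL_INVALID_ENUM (0x0500 == 1280) on core profile contexts.
    // Read the error once to clear the flag, so that later
    // glGetError() calls only report errors from our own code.
    glGetError();
    return true;
}
```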

03-01-2016, 08:51 AM
That doesn't work for me:

GLenum err = glGetError(); // 0
glewExperimental = GL_TRUE;
if (glewInit() != GLEW_OK) {
    std::cout << "ERROR: GLEW failed to init." << std::endl;
    return false;
}
err = glGetError(); // 1280

The question is: do I need to modify GLEW to force it to use glGetStringi(GL_EXTENSIONS, i)?

Thanks for your reply

carsten neumann
03-01-2016, 08:57 AM
What exactly is not working? Do you exit early after printing "GLEW Failed to init"?
The GL error after glewInit is expected; you can simply ignore it (it is not a problem in practice). Clearing it right after glewInit lets you distinguish "real" GL errors caused by your application from this one.

To avoid the error, GLEW would need to be modified to use glGetStringi on core profile contexts. On OpenGL 3.1 and earlier contexts (or whenever glGetStringi was introduced - I don't remember off-hand) it would have to continue to use glGetString. I looked into it once and sent a patch (https://sourceforge.net/p/glew/bugs/174/#1ba8), but it was causing problems on the author's system, so it was not ready to be accepted.
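For reference, querying extensions the core-profile way looks like this - a sketch of what a fixed GLEW would have to do internally (the helper name getExtensionsCore is made up; requires a context where glGetStringi is available):

```cpp
#include <GL/glew.h>
#include <string>
#include <vector>

std::vector<std::string> getExtensionsCore()
{
    std::vector<std::string> extensions;

    // Core profile: query the number of extensions, then fetch
    // each extension string individually with glGetStringi.
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const GLubyte* name = glGetStringi(GL_EXTENSIONS,
                                           static_cast<GLuint>(i));
        if (name)
            extensions.push_back(reinterpret_cast<const char*>(name));
    }
    return extensions;
}
```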

03-01-2016, 09:16 AM
Ok, so I'll simply use glGetError() to reset the OpenGL error flag.
I'm going to wait for GLEW 2.0 ^^

Thanks a lot !

03-01-2016, 09:36 AM
You're quite correct, by the way - this is not clean.

However it's the way GLEW currently works so, unless we each make our own custom versions of GLEW, it's what we have to live with.

03-01-2016, 09:46 AM
Use GLAD instead; it lets you choose which extensions you need.

03-02-2016, 12:03 AM
I didn't know GLAD. It seems good :)