View Full Version : Why is glewIsSupported() wrong?

04-19-2011, 11:59 AM
I have the following code:

GLenum err = glewInit();
cerr << glGetString(GL_EXTENSIONS) << "\n";
if (err != GLEW_OK ||
    !glewIsSupported("GL_ARB_color_buffer_float GL_EXT_framebuffer_object GL_EXT_packed_depth_stencil"))
    capable = false;

The printout of the extensions clearly indicates that all three of those extensions are present, yet the glewIsSupported() call is returning false. If I break it down to three separate checks, the first two work but GL_EXT_packed_depth_stencil does not register as present (despite being in the GL_EXTENSIONS printout).

What could possibly be causing this?

04-19-2011, 11:40 PM
GL_EXT_packed_depth_stencil is part of core OpenGL 3.0. Are you sure that you have set up an OpenGL 3 context? How are you initializing the context? Are you using freeglut? Let us know the details.

04-20-2011, 02:33 AM
No need for a GL 3 context, given the extension's date (2005); cf. http://www.opengl.org/registry/specs/EXT/packed_depth_stencil.txt

This extension defines no new functions, so just try using its tokens and see whether it works.
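One way to exercise those tokens is together with EXT_framebuffer_object (which the original check also requires): create a renderbuffer with the packed GL_DEPTH24_STENCIL8_EXT format and attach it to both the depth and stencil attachment points. A minimal sketch, assuming a current GL context and an already-initialized GLEW; error handling is omitted:

```cpp
#include <GL/glew.h>

// Attach a packed depth-stencil renderbuffer to an existing FBO.
// Function and token names come from the EXT_framebuffer_object and
// EXT_packed_depth_stencil specs, as exposed by GLEW.
void attachPackedDepthStencil(GLuint fbo, int width, int height)
{
    GLuint rb;
    glGenRenderbuffersEXT(1, &rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rb);
    // The packed format token from EXT_packed_depth_stencil:
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH24_STENCIL8_EXT,
                             width, height);

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    // The same renderbuffer serves both attachment points.
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                 GL_RENDERBUFFER_EXT, rb);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT,
                                 GL_RENDERBUFFER_EXT, rb);
}
```

If the driver really supports the extension, glCheckFramebufferStatusEXT should report GL_FRAMEBUFFER_COMPLETE_EXT afterwards, regardless of what glewIsSupported claimed.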

As for GLEW, I have never used it, so I can't say. But try checking glewExperimental (http://glew.sourceforge.net/basic.html), or pass fewer extensions to glewIsSupported().
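For reference, the linked GLEW page describes the init order: glewExperimental must be set before glewInit() so that GLEW loads entry points even when the driver's extension reporting is incomplete. A small sketch, assuming a GL context is already current (e.g. created via freeglut):

```cpp
#include <GL/glew.h>
#include <iostream>

// Initialize GLEW with experimental extension loading enabled.
// Must be called after a GL context has been made current.
bool initGlew()
{
    glewExperimental = GL_TRUE;   // set BEFORE glewInit()
    GLenum err = glewInit();
    if (err != GLEW_OK) {
        std::cerr << "glewInit failed: " << glewGetErrorString(err) << "\n";
        return false;
    }
    return true;
}
```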

04-29-2011, 11:53 AM
Simply removing the check "solved" the problem, since the extension works without incident. I still find it extremely strange that this was happening though and I'm curious what could have caused it.

I always figured that glewIsSupported() was just parsing the string returned by glGetString(), but maybe not?
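For anyone hitting the same problem, the check can be done by hand over the GL_EXTENSIONS string as a cross-check against glewIsSupported(). A sketch of that approach (an assumption about how such a parser works, not a copy of GLEW's internals); the key subtlety is matching whole space-delimited names, so that one extension name being a prefix of another cannot cause a false positive or negative:

```cpp
#include <cstring>
#include <string>

// Return true if `name` appears as a whole, space-delimited token in the
// space-separated `extensions` list (the GL_EXTENSIONS format).
bool hasExtension(const char* extensions, const char* name)
{
    const std::size_t len = std::strlen(name);
    const char* p = extensions;
    while ((p = std::strstr(p, name)) != nullptr) {
        // Accept only matches bounded by start/end of string or spaces.
        const bool startsOk = (p == extensions) || (p[-1] == ' ');
        const bool endsOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startsOk && endsOk)
            return true;
        p += len;  // substring match only; keep scanning
    }
    return false;
}
```

If this returns true for "GL_EXT_packed_depth_stencil" while glewIsSupported() returns false, the discrepancy is on GLEW's side rather than the driver's.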

04-30-2011, 01:52 AM
Look at the code if you want to know :)