Can't run OpenGL 3.3 programs, possibly due to hybrid graphics card setup

I can't run OpenGL 3.3 programs even though my graphics card supports it; for example, the tutorials from arcsynthesis.org/gltut crash on startup. I believe it may have something to do with my hybrid graphics setup (Intel HD 4000 + NVIDIA GTX 660M). I am running the program with optirun, so it uses the dedicated card:


$ make
$ optirun ./tutorial01
freeglut (./Tut 01 MainD): glXCreateContextAttribsARB not found
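
The tutorials create their window through freeglut and request a versioned core context; a stripped-down test that makes the same kind of request (just a sketch of what the gltut framework does, not its exact code) would be:

#include <stdio.h>
#include <GL/freeglut.h>

int main(int argc, char** argv)
{
  glutInit(&argc, argv);
  /* Asking for a versioned core profile is what makes freeglut go through
     glXCreateContextAttribsARB; if that entry point is missing, window
     creation fails with the error above. */
  glutInitContextVersion(3, 3);
  glutInitContextProfile(GLUT_CORE_PROFILE);
  glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
  glutCreateWindow("context test");
  printf("GL_VERSION: %s\n", (const char*)glGetString(GL_VERSION));
  return 0;
}

Building it with something like gcc test.c -o test -lglut -lGL and running it under optirun shows whether plain freeglut context creation fails the same way.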

Output of optirun glxhead (i.e. using the dedicated card):


GL_VERSION:  4.2.0 NVIDIA 304.88
GL_VENDOR:   NVIDIA Corporation
GL_RENDERER: GeForce GTX 660M/PCIe/SSE2

Output of glxhead (i.e. using the integrated card):


GL_VERSION:  3.0 Mesa 9.2.1
GL_VENDOR:   Intel Open Source Technology Center
GL_RENDERER: Mesa DRI Intel(R) Ivybridge Mobile

I'm using Ubuntu 13.10.

The autosave feature destroyed my response; I had half a page typed until the forum deleted it. Sorry.

If you print the result of glGetString(GL_RENDERER) in your app, do you get the NVIDIA or the Mesa driver?
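
Something like this, right after the context has been created and made current (a minimal sketch):

#include <stdio.h>
#include <GL/gl.h>

/* Call once the GL context is current. */
void print_gl_info(void)
{
  printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
  printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
}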

mcjohnalds, I had the same problems on Ubuntu 13.10 with Bumblebee and the NVIDIA drivers. I don't know why, but it always fails to create OpenGL 3.3 core profile contexts. I didn't have time to resolve the problem, so I installed Ubuntu 12.04.4 instead.
On Ubuntu 12.04.4 it works without problems.

If you print the result of glGetString(GL_RENDERER) in your app, do you get the NVIDIA or the Mesa driver?

He can't; the context is not created.

Recent versions of Bumblebee removed support for GLX_ARB_create_context. However, you can still create an old-style context, and it will return a compatibility-profile context for the maximum version the driver supports. I use something like this:

#include <X11/Xlib.h>
#include <GL/glx.h>  /* also provides the GLX_ARB_create_context tokens and PFN typedef */

/* Swallow X errors while probing, so a rejected context request doesn't kill the app. */
static int ignore_x_errors(Display* display, XErrorEvent* event) { return 0; }
static int ignore_x_io_errors(Display* display) { return 0; }

/* ... later, in the setup code; display, fbc (from glXChooseFBConfig) and
   vi (from glXGetVisualFromFBConfig) are assumed to exist already ... */
XSetErrorHandler(ignore_x_errors);
XSetIOErrorHandler(ignore_x_io_errors);

int context_attribs[] =
{
  GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
  GLX_CONTEXT_MINOR_VERSION_ARB, 1,
  GLX_CONTEXT_PROFILE_MASK_ARB,  GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
  0
};

GLXContext context = 0;
PFNGLXCREATECONTEXTATTRIBSARBPROC glXCreateContextAttribsARB =
  (PFNGLXCREATECONTEXTATTRIBSARBPROC)glXGetProcAddressARB((const GLubyte*)"glXCreateContextAttribsARB");

/* Try the versioned core-profile path first. */
if (glXCreateContextAttribsARB)
  context = glXCreateContextAttribsARB(display, fbc[0], 0, 1, context_attribs);

/* Fall back to an old-style context, which gives the compatibility profile. */
if (!context)
  context = glXCreateContext(display, vi, 0, 1);

/* Restore the default X error handlers. */
XSetErrorHandler(0);
XSetIOErrorHandler(0);

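To see whether GLX_ARB_create_context is advertised at all on the Bumblebee/optirun display, you can also check the GLX extension string; a quick sketch (display and screen assumed to be the already-open connection and its default screen):

#include <string.h>
#include <GL/glx.h>

/* Returns nonzero if GLX_ARB_create_context is listed for the given screen. */
int has_create_context_arb(Display* display, int screen)
{
  const char* extensions = glXQueryExtensionsString(display, screen);
  return extensions && strstr(extensions, "GLX_ARB_create_context") != NULL;
}
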
Is there any source for this? I can't find any.
I can't believe this is true; many programs will not work without core profiles.
