ARB_uniform_buffer_object on OpenGL 2.1

Hi there

I’d like to play around with uniform buffer objects, but several problems have come up. I cannot create an OpenGL 3.2 context for unrelated reasons, so I am stuck with 2.1, but as I interpret the specification, the extension should work there just fine.

The problem is, when I write

#extension GL_ARB_uniform_buffer_object : enable

into the shader, the nVidia driver (191.xx) complains that the extension is not supported (the OpenGL Extension Viewer tells me otherwise).

Do I need to do anything else? I cannot use #version 140/150, but the specification says the extension is written against GLSL 1.20 anyway.
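For reference, this is roughly the shader I would expect to compile under a 2.1 context once the extension is enabled (block and variable names here are just placeholders, not from my actual code):

```glsl
#version 120
#extension GL_ARB_uniform_buffer_object : enable

// Hypothetical uniform block; ARB_uniform_buffer_object adds this
// block syntax and the layout qualifier on top of GLSL 1.20.
layout(std140) uniform TransformBlock {
    mat4 modelViewProjection;
};

attribute vec3 position;

void main()
{
    gl_Position = modelViewProjection * vec4(position, 1.0);
}
```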

Any ideas?

And apart from the shader that won’t compile, I also don’t have the function pointers initialized yet, because GLEW and GLee haven’t been updated in ages and don’t know that extension at all so far. I really don’t like initializing extensions manually, so IF I find out how to convince the GLSL compiler to support that extension, is there also some quick and easy way to initialize the function pointers?

Thanks,
Jan.

Is the extension advertised in the extension string?
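Worth checking in code rather than relying on a viewer tool. A small helper like this (pass it the string returned by glGetString(GL_EXTENSIONS)) also avoids the classic substring pitfall where a plain strstr() match would hit a prefix of a longer extension name:

```c
#include <string.h>

/* Returns 1 if 'name' appears as a complete, space-delimited token in
 * 'ext_string' (the string returned by glGetString(GL_EXTENSIONS)).
 * A bare strstr() would also match prefixes of longer extension names. */
static int has_extension(const char *ext_string, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_string;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == ext_string || p[-1] == ' ');
        int ends_token   = (p[len] == '\0' || p[len] == ' ');
        if (starts_token && ends_token)
            return 1;
        p += len;
    }
    return 0;
}
```

In a 2.1 context, glGetString(GL_EXTENSIONS) is the supported way to get this string, so the check is cheap to do once at startup.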

Well, the OpenGL Extension Viewer claims the extension is supported. However, it also says this:

“GL_ARB_uniform_buffer_object has the entry point GetIntegeri_v missing!
GL_ARB_uniform_buffer_object has the entry point UniformBlockBinding missing!”

I am a bit confused about what that’s supposed to mean, because at least glUniformBlockBinding seems to be quite important.

Did anyone ever use this extension with a 2.1 context?

Jan.

I think I went straight for the gusto with GL3.2. Last extension I used was bindable uniform circa 3.0 or thereabouts. IIRC you may need to enable EXT_gpu_shader4.

Pity you can’t do the 3.2 thing but once you do there’s no going back so be warned!

P.S. Yep, GLEW seems to be in a holding pattern over GL3.0.

Yes, I already enable EXT_gpu_shader4.

The problem with a 3.0 context is that it is a major pain to get it working with Qt, and since this is a bigger university project I am working on, I can’t just hack it in (IF I knew how); other people would be angry. I would also need to rewrite every shader and some libraries, and it all piles up to a lot of work that I don’t have time for.

Jan.

“GL_ARB_uniform_buffer_object has the entry point GetIntegeri_v missing!
GL_ARB_uniform_buffer_object has the entry point UniformBlockBinding missing!”

I’m not involved in any driver development, but it looks to be a common issue to advertise an extension without providing some of its function pointers.
I faced this problem while trying to get instancing to work in GL 3.1: GL_ARB_draw_instanced is supported by Catalyst 9.10, but there is no glDrawElementsInstanced or glDrawArraysInstanced…
I guess the ARB-suffixed equivalent function names might give you the required pointers - try it.
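If you want to try that programmatically, a fallback loader along these lines would do it. This is only a sketch: `GetProcFn` stands in for wglGetProcAddress/glXGetProcAddress, and the mock table below exists purely so the helper can be demonstrated without a GL context (the entry-point names in the mock are illustrative, not a claim about what any driver exports):

```c
#include <stdio.h>
#include <string.h>

typedef void (*ProcPtr)(void);
typedef ProcPtr (*GetProcFn)(const char *name);

/* Try the plain (core-extension) name first; if the driver does not
 * export it, fall back to the same name with an "ARB" suffix. */
static ProcPtr load_with_fallback(GetProcFn get_proc, const char *name)
{
    ProcPtr p = get_proc(name);
    if (p == NULL) {
        char buf[128];
        snprintf(buf, sizeof buf, "%sARB", name);
        p = get_proc(buf);
    }
    return p;
}

/* --- mock "driver" table, for illustration only --- */
static void dummy_core(void) {}
static void dummy_arb(void)  {}

static ProcPtr mock_get_proc(const char *name)
{
    if (strcmp(name, "glUniformBlockBinding") == 0)      return dummy_core;
    if (strcmp(name, "glDrawElementsInstancedARB") == 0) return dummy_arb;
    return NULL;
}
```

In real code you would pass wglGetProcAddress (suitably cast) instead of the mock and cast the result to the proper function-pointer type.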

You can get the latest GLEW from the SVN and then just run `make destroy` in /auto from a Linux shell to create the headers from the extension specs in the registry. I did that just yesterday and now I have 3.2 support as well as the latest extensions. =)

I guess the ARB-suffixed equivalent function names might give you the required pointers

GL_ARB_draw_instanced is a core OpenGL extension; there are no ARB equivalent functions. Both the extension and the core version should use “glDrawElementsInstanced.” And if they do advertise an ARB equivalent, then that’s even more of a driver bug.

Although all this information is helpful, it doesn’t answer my main question: does the uniform_buffer_object extension work AT ALL with a 2.1 context and current nVidia drivers?

I have currently not even gotten to the point of using any functions of the extension; I simply can’t get a GLSL shader to compile, because the compiler rejects the syntax and the #extension : enable stuff.

Jan.

Does the uniform_buffer_object extension work AT ALL with a 2.1 context with current nVidia drivers?

Should it?

In OpenGL 3.2, a new extension was added: WGL_ARB_create_context_profile. Before 3.2, an OpenGL implementation could only give you a 3.0-or-greater context if you specifically asked for it with WGL_ARB_create_context.

The new “_profile” extension allows it to give you a 3.2 compatibility context even if you don’t use WGL_ARB_create_context. Indeed, this extension allows the implementation to give you any higher version, so long as it retains backwards compatibility with what you asked for.

Which NVIDIA’s 3.2 implementation does. In short, you can no longer get just a 2.1 implementation on any NVIDIA hardware that supports GL 3.0. Because of that, and because any hardware that supports UBOs supports GL 3.0, there’s no reason to limit yourself to GLSL 1.2. You’re getting version 3.2, whether you like it or not. So you may as well use it :wink:

God damn it!

I have no influence on the context creation! I have a 2.1 context, and I cannot do anything about it! If I could, I would have created a 3.x context weeks ago. The context creation is the part of the software that needs to stay as it is right now; if I tried to fiddle with it, people might get angry with me. So I am stuck with a 2.1 context.

But the uniform_buffer_object extension is clearly written against OpenGL 2.1 (1.5, actually), so it should work just fine with a 2.1 context. However, it seems not to. So I ask whether someone has done it. If I wanted a discussion about OpenGL 3.x vs. 2.1, I would have created a separate thread for that. I’m not a noob, I know what I’m doing. And I clearly explained that I cannot use a 3.x context in the very first sentence of my first post, so please read that stuff before being a wise-guy next time.

So I assume no one has done this with a 2.1 context, and from what I’m seeing it seems not to work yet on nVidia cards.

Jan.

So I am stuck with a 2.1 context.

Did you fully read what I said? NVIDIA drivers will give you a 3.2 compatibility context whether you ask for it or not. That is, standard context creation using wglCreateContext will give you 3.2 compatibility unless you use wglCreateContextAttribsARB and specifically tell it not to.

I actually ran into an issue with this, because my GL extension loading code will throw an exception if a function required by an extension or core version isn’t actually available. I was using regular wglCreateContext, and I was getting back 3.2. The problem was that NVIDIA implemented it wrong: they didn’t set the core vs. compatibility flag, which is supposed to be set, and they didn’t actually expose all of the 3.2 core functions. So my loading code kept throwing exceptions.

Update your drivers and check what version you get from context creation.
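The version check itself is just string parsing of what glGetString(GL_VERSION) returns (a string beginning with "major.minor", e.g. "3.2.0 NVIDIA 195.62"; the sample strings below are illustrative):

```c
#include <stdio.h>

/* Parse the leading "major.minor" out of a GL_VERSION string such as
 * "3.2.0 NVIDIA 195.62" or "2.1.2". Returns 1 on success, 0 otherwise. */
static int parse_gl_version(const char *version, int *major, int *minor)
{
    return sscanf(version, "%d.%d", major, minor) == 2;
}
```

Call it once right after context creation with the result of glGetString(GL_VERSION); if it reports 3.x even though you asked for nothing special, you are seeing the behavior described above.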

I believe that’s the same issue I have at the moment.

See http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=268095#Post268095

Is there any workaround possible, or would a bug report to the driver vendor be the right place to give feedback about this?