GLSL and ATI Linux driver

I just acquired a notebook with a Radeon Xpress 200M card and installed Linux and the ATI Linux driver 8.26.18 on it. The OpenGL version reported by the fglrxinfo utility is 2.0.5879, so I thought I would finally be able to start writing some OpenGL 2.0 code. However:

  1. There is no OpenGL 2.0 header file installed.
  2. When listing symbols from the installed libGL.so, it seems the OpenGL 2.0 API is not there - for example, glCreateProgramObject() is not listed, while glCreateProgramObjectARB() is.

So - any hint as to whether this card/driver combination actually has OpenGL 2.0 support, and if so, how to use it from code?

Thanks.

Use GLEW (http://glew.sf.net) or a similar tool, and then use the ARB extensions.

I know I could do that, but that is the OpenGL 1.5 way, not the OpenGL 2.0 way, right?

Originally posted by Crni Gorac:
2. When listing symbols from the installed libGL.so, it seems the OpenGL 2.0 API is not there - for example, glCreateProgramObject() is not listed, while glCreateProgramObjectARB() is.

That might be because there is no glCreateProgramObject() in OpenGL 2.0, it’s called glCreateProgram() there. :rolleyes:

Sorry, it should have been glCreateProgram() in the first place - that one is not there either, only glCreateProgramObjectARB(), as mentioned above…

You have to load everything that’s above OpenGL 1.2 through the extension loading mechanism.

There is no need to check the extension string - just check the version string; then you can get your functions with glXGetProcAddressARB.

As nrg pointed out earlier, you can use GLEW to automate this process.

OK, got it: so it's just not possible to write “#include <GL/gl.h>” and then use the OpenGL 2.0 API directly with this particular driver/card combination. But then again - is something like that possible with the ATI driver and some other ATI cards, and how does it go for Nvidia drivers/cards? I must confess I'm pretty confused by having the manufacturer state that the card/driver supports OpenGL 2.0 and then still having to use extensions to access this functionality from my code…

Thanks.

Originally posted by Overmind:
You have to load everything that’s above OpenGL 1.2 through the extension loading mechanism.

I thought they were keeping libGL.so up to date.

Originally posted by Crni Gorac:
I must confess I'm pretty confused by having the manufacturer state that the card/driver supports OpenGL 2.0 and then still having to use extensions to access this functionality from my code…

They are saying you need to get the function pointers because libGL.so doesn't export them, which means libGL.so is at version 1.2 while the ATI driver is at 2.0.
You could in fact get pointers to all GL functions. That doesn't mean they are extensions.

I understand all of this; what I don't understand is exactly what the “ATI driver is at 2.0” qualification means. I thought that this qualification would mean precisely what I mentioned above - that a developer can just include the appropriate “GL/gl.h” and then use glCreateProgram() and similar OpenGL 2.0 functions.

The problem is that there's an OS component to this too. While I'm no expert on the Linux side, I believe this is called the ABI, and it is still at 1.3. That's still better than 1.1 on Windows, but it means you'll have to dynamically load anything above 1.3, just as you have to dynamically load everything above 1.1 on Windows.

The Linux ABI is 1.2.

I think you guys misunderstand the whole extension/version mechanism of OpenGL.

The libGL.so is an operating system component. It always exports only the GL 1.2 functions. That's really important, because you don't know the OpenGL version at compile time.

If you need core features of a higher OpenGL version, you need to load them with glXGetProcAddressARB. This does not mean you are using an extension: even though it's the same loading mechanism as for extensions, it's still a core feature, not an extension.

Now why can’t the display driver just update the libGL.so to export the other symbols?

Assume you had a libGL.so that exported GL 2.0 functions. What happens when your program is started on a system that supports only GL 1.5?

Your program won’t start on that system. You don’t even get a chance to fall back to a different render path that does not use GL 2.0 functions.

That’s the reason why the Linux ABI supports only GL 1.2 and the Windows ABI only GL 1.1. Not because someone forgot to update them, but because the ABI specifies a minimum version that everyone HAS to support; otherwise applications would break. Suppose the ABI specified GL 2.0 - then graphics cards that are only capable of GL 1.5 would not work on that operating system.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.