ARB vertex/fragment shader programming on Linux?

I’m trying to play around with vertex and fragment programming on Linux and I’ve run into a wall.

I have an ATI Radeon 9700 and have installed the ATI Linux drivers for XFree86 version 4.2.1. I also have the DRI version of Mesa installed so that when I query the supported extensions using glGetString(GL_EXTENSIONS) in my program, I can see GL_ARB_vertex_program and GL_ARB_fragment_program in the list.

However, because the Mesa headers and libraries installed on my system don’t support the function calls and constants necessary to load vertex and fragment programs, I can’t compile my program.

I tried compiling and installing the latest version of Mesa (5.0.1), and while it has the right functions for loading vertex/fragment programs onto the graphics card, it doesn’t support the ARB vertex and fragment extensions (only the NV ones). Also, it’s not hardware accelerated since it doesn’t use DRI.

So I was wondering if it’s currently possible to do shader programming on Linux using OpenGL with the ARB extensions?

NVidia provide the appropriate headers and libraries with their Linux drivers. Do ATI not do the same?

If not, use the extension specifications to find the constants and function signatures and create your own header. Use glXGetProcAddress (or glXGetProcAddressARB) to load the functions at runtime.

You should forget about Mesa as long as you want to benefit from hardware acceleration.
First of all, uninstall Mesa and go back to your ATI implementation.

Secondly, if you want to use extensions, you may need to get the "glext.h" header, which is available at the OpenGL Extension Registry, be it for fancy extensions like fragment programs or for widespread extensions like multitexturing.

And in order to load extensions you may also need the GLX headers, for the glXGetProcAddress function, at least if you're running X.

Here’s what ATI provide with their Linux driver:

  • source for compiling their kernel module
  • libGL.so.1.2
  • configuration and test programs
  • X driver, DRI and DRM support in X
  • sample code

So no OpenGL headers… not a big deal though, I just grabbed the ones from the extension registry site.

However, I checked the list of symbols in ATI's libGL.so.1.2 using nm and couldn't find glGenProgramsARB(), glBindProgramARB(), glProgramStringARB(), or glGetProgramivARB(), so I'm pretty sure that using glXGetProcAddress() to load those functions at runtime isn't going to work.

I did see a link to some other ATI driver in Germany posted up in a few places. Anyone know if that driver provides the necessary files for shader development?

Yes. I have a Radeon 9800pro running on RedHat 9. I downloaded the glx1_linux_X4.3.zip driver from http://www.schneider-digital.de/html/download_ati.html and it works fine. I’ve been playing around with fragment programs for a while now.

Oh, and I’m using an extension loading library. I think it’s extgl.

[This message has been edited by mogumbo (edited 07-23-2003).]

Thanks for the link. I downloaded those drivers and converted the .rpm file they contain to a .tgz file (I use Debian).

However, it’s really not my day… I unpacked the files to a temporary directory just to see what the package contained, and then typed the following command to start removing the unpacked files from the temp directory:

rm -rf /usr

oops…

I stopped it, but not before I wiped out all of /usr/bin (and some other stuff). And yes, I dunno why I didn’t just remove the entire temp directory.

I’m now learning everything I didn’t want to know about ext3 filesystem undeletion.

Yup, really not my day…

[This message has been edited by luxo (edited 07-23-2003).]

The very golden rule: never log in as root.

At least, try to keep root sessions minimal. For instance, don't keep a console open as the root user; prefer the 'su -c' command instead.

Yeah… technically, I wasn’t logged in as root but I was doing a bunch of rooty stuff and was too lazy to type “sudo” before each command so I just used “su”.

I’m in the process of upgrading my computer and wanted to move all my files to a newer, bigger drive anyways so I guess this is as good an excuse as any to reinstall Linux. It’s been about 4 years with the same install now and I’ve built up a lot of cruft on my system. And though I’ve discovered that ext3 undeletion is possible, undeleting that many files is far too painful.

Anyways, this is about as far removed from OpenGL as you can get, so I’ll stop posting and get to reinstalling.

Download the latest extension headers
(glext.h, glxext.h) from:
http://oss.sgi.com/projects/ogl-sample/registry/

Fast and ugly way:

Simply overwrite the existing (Mesa) header files in /usr/include/GL/.

Note: package upgrades are dangerous in this case!

Or

Create your own modified package with new extension header files.

Regards,
Potyi

Yeah, I did that and I was still getting compilation errors about the functions I needed not being defined.

I finally figured out the problem by grepping through glext.h though. I needed to define GL_GLEXT_PROTOTYPES in order to have those functions defined.

Did that and now my program compiles fine and loads up my vertex program. Now that I’ve got past that part, I just need to display something to see if it works.

Hopefully, it won’t lock up my system like all the OpenGL games I’ve tried have (Unreal Tournament, Quake 3, UT2003)…

@mogumbo / @all:

since I'm new to OpenGL development on Linux-based systems, this question came up while reading the thread:
>>Oh, and I'm using an extension loading library. I think it's extgl.

I thought that under Linux, unlike under Windows, I don't need any additional library to activate the usage of GL extensions?
That's also what I've heard several times, not only from reading the Linux threads on this forum but also from threads on other forums.
Could anyone explain this to me, please?

If you happen to use glXGetProcAddress, you don't need to use any extension loading library. glXGetProcAddress is supported on any decent X windowing system and is available through the GLX_ARB_get_proc_address extension (the second ARB extension ever written, just after GL_ARB_multitexture).

@vincoof:

>>…GLX_ARB_get_proc_address…

Mmmh, yes; but with that approach the step I have to do is nearly the same as under Windows: I have to load the extension functions at the startup of my program. From reading some of these Linux GL-related threads, it mostly sounded as if I could just use the functions directly, because the extensions are integrated into the core. With the way you describe, I still have to load them manually. Have I understood that correctly?

The good point with the OpenGL implementation under Linux is that you can use the function pointers directly without loading them (well, in this case you assume that the end user has decent hardware and a decent driver).

For instance, when you want to use multitexturing, you don't have to load the ARB_multitexture extension if OpenGL 1.3 is supported. You can simply use glActiveTexture, which is already available, instead of glActiveTextureARB, which you would have to initialize manually.

In your code, you may write :
#ifdef GL_VERSION_1_3
glActiveTexture(GL_TEXTURE1);
#else
if (isArbMultitextureSupported) glActiveTextureARB(GL_TEXTURE1_ARB);
#endif

where glActiveTexture will be initialized automatically, and where glActiveTextureARB has to be initialized manually, like this:
glActiveTextureARB = (PFNGLACTIVETEXTUREARBPROC)glXGetProcAddress((const GLubyte *)"glActiveTextureARB");

Under Windows, the 'automatic' solution is not possible (at least not yet), since only OpenGL 1.1 features are available that way.

I hope I haven't messed things up. Please tell me if I'm not clear.

[This message has been edited by vincoof (edited 08-28-2003).]

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.