glXGetProcAddress relocation error

Hi,

I successfully compiled and linked one of the wonderful demos at: http://esprit.campus.luth.se/~humus/

When I run the demo I get:
gt@Colossus:~/Documenti/Framework/Cloth$ ./Cloth
./Cloth: relocation error: ./Cloth: undefined symbol: glXGetProcAddress

What can I do? I have the NVIDIA drivers installed. Please help!

Thanks,
Colossus

Try changing the call to glXGetProcAddressARB.

I tried that, but then I get an error message saying that glXGetProcAddressARB is not declared.

Any other hint?

Thank you,

Can we please have a FAQ, which is also correct, just for kicks?

Originally posted by Colossus:

When I run the demo I get:
gt@Colossus:~/Documenti/Framework/Cloth$ ./Cloth
./Cloth: relocation error: ./Cloth: undefined symbol: glXGetProcAddress

What can I do? I have the NVIDIA drivers installed. Please help!

The spec for ARB_get_proc_address says:

* There's a recursion problem with this feature. The purpose of GetProcAddressARB is to return pointers to extension functions and GetProcAddressARB is itself such a function! This presents a puzzle to the application developer.
  Implementations must export the glXGetProcAddressARB entry point statically.

In plain English this means you should use glXGetProcAddressARB and glXGetProcAddressARB only. You should not use glXGetProcAddress: if you do, the program will not run on implementations supporting GLX 1.3 or lower. GLX 1.4 adds glXGetProcAddress, but funnily enough there's no goddamned specification available for GLX 1.4. GLX 1.3 supports OpenGL 1.2; GLX 1.4 adds support for OpenGL 1.3. To add insult to injury, some legacy systems support GLX 1.4 and glXGetProcAddress but no glXGetProcAddressARB. To make things even more fun, old-generation SGI systems don't have glXGetProcAddress[ARB] at all, even though they do have a handful of extensions.

Originally posted by m2:
In plain English this means you should use glXGetProcAddressARB and glXGetProcAddressARB only.

I hate to reply to myself, but I thought I should add something to that comment.

#include <dlfcn.h>   /* dlopen, dlsym */
#include <GL/gl.h>   /* GLubyte */

static void *dlGetProcAddress (const GLubyte *name)
{
    static void *h = NULL;
    static void *gpa = NULL;

    if (h == NULL)
    {
        /* dlopen(NULL) returns a handle to the process's own global symbols */
        if ((h = dlopen(NULL, RTLD_LAZY | RTLD_LOCAL)) == NULL)
            return NULL;
        gpa = dlsym(h, "glXGetProcAddress");
        if (gpa == NULL)
            gpa = dlsym(h, "glXGetProcAddressARB");
    }

    if (gpa != NULL)
        return ((void *(*)(const GLubyte *))gpa)(name);
    else
        return dlsym(h, (const char *)name);
}

That's the portable and safe way of doing this. (OK, OK, portable to OSes with dlopen, but if your OS doesn't have that, what are you waiting for to dump it? OS X has something similar; write an abstraction layer if you must.) This assumes your application links to the GL library dynamically. If you need to load the GL library dynamically too, just replace the NULL in dlopen with the path to the GL library.

[This message has been edited by m2 (edited 01-26-2004).]

Thank you so much for your explanation, I really appreciated it

Ehm, I didn't understand how to solve my problem. glXGetProcAddress is referenced in one of the framework's header files, which contains this:

#ifdef LINUX
#include <GL/glx.h>

#define wglxGetProcAddress(a) glXGetProcAddress((const GLubyte *) (a))

#endif

So if I replace glXGetProcAddress with glXGetProcAddressARB, the compiler says it is not declared. If I leave it as it is, compiling and linking go fine, but when I run the executable I get the relocation error and undefined symbol. If you know the solution, please be patient and reply to me, so your post will be indexed by Google and many people will find the solution. If you don't have the solution, thanks anyway for your time.

Colossus

GOT THE DEMO WORKING!! Just edit the file …/OpenGL/glExtensions.h of the framework2 package this way:

#ifdef LINUX
#define GLX_GLXEXT_PROTOTYPES
#include <GL/glx.h>
#define wglxGetProcAddress(a) glXGetProcAddressARB((const GLubyte *) (a))
#endif

And enjoy the wonderful demo! I wrote this so that in the future others will be able to find this, run the demo, and enjoy OpenGL!

Colossus

In other words, change it to the ARB call. I politely direct your attention to my first reply.

This #define crap is a really nasty way of getting code to work. You should rip out that wglx define and replace it in the code, or actually implement the wglx function as a wrapper around the glx one, or as a generic function around both, with a #define in there for each platform's code path. That way it won't drive you nuts.

Nasty nasty example code.

Thank you for your suggestion, but, as I politely replied, it doesn't even compile if I just change the call to ARB; the #define IS NEEDED if you want to compile and get the demo working.

Regards,
Colossus

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.