GLX 1.3 and NVIDIA drivers

OK, I was trying to port my app from SDL to straight Xlib/GLX (I figured sooner would be better than later) using the nice (but short) documentation in the back of the Blue Book. The problem is that while glx.h includes all the GLX 1.3 structures and entry points (GLX 1.3, not GL 1.3), calling glXChooseFBConfig returns NULL (though everything compiles fine).

It doesn't make sense that they would ship headers for 1.3 (yes, I'm sure I'm using NVIDIA's) without actually supporting it. Is this really the case, or is there some way to get GLX 1.3 support (e.g., could my GLX .so files being installed incorrectly cause this)?

I saw a recent post saying there isn't GLX 1.3 support, but the fact that the headers include all the 1.3 material seems odd, so I thought I would ask the question directly.

Thanks.

P.S. If there isn't GLX 1.3 support: (a) where can I find some good documentation on GLX 1.2, and (b) does anybody know how soon there will be GLX 1.3 support?

OK.

It seems that GLX 1.3 is not supported by the NVIDIA drivers because NVIDIA ships its own modified GLX library, so some functions may differ from the standard ones. That may be why.

Concerning GLX documentation, there is a good book named something like OpenGL for the X Window System.
If you want, I can check it at home (if I remember to).

OK, I was thinking about getting this, but all the descriptions I've read say that it just covers how to use GLUT. I don't want to use GLUT, but GLX.

If it has an in-depth description of using GLX and Xlib, I'd probably find some place to buy it, but if it's mostly about how to use GLUT or SDL, I'll look someplace else.

Since you have it, could you say a little about what its emphasis is?

Thanks.

I think you've misunderstood (or maybe I have).
The official OpenGL reference guide is mostly about GLUT, but I don't think OpenGL for the X Window System uses it!
Otherwise, I have some (rather sparse) documentation about GLX, just some routine definitions.
I could give you those.
Tell me more.

OK, here is my problem:

I want to start using GLX and Xlib to create windows and handle events, instead of using SDL or GLUT. I have a good book on Xlib programming, and I have both the OpenGL Red and Blue Books. The Blue Book has a nice description of how to use GLX, but it's for GLX 1.3. Unfortunately, the NVIDIA drivers only support GLX 1.2, so what I need is a good description of how to use GLX 1.2. All I really need to know is how to configure the framebuffer in GLX 1.2.

According to opengl.org, the book "OpenGL for the X Window System" deals primarily with GLUT. Here is a bit of what the description says:

“It uses the OpenGL Utility Toolkit (GLUT) to show how OpenGL programs can be constructed quickly”

But like I said, I know (basically) how to use GLX 1.3, so what I really need to know is how to configure a framebuffer in GLX 1.2.

Thanks.

I have no information about configuring the frame buffer with GLUT.
One way is to read the GLUT source code, or to find a GLX program. There are some in the Mesa libraries.

jidé

PS: do you think it's faster?

OK, I do not want to use GLUT, I want to use GLX. I'd like to use GLX 1.3 because I know how to use it, but I can't, so I need to know how to configure the framebuffer in GLX 1.2. Not GLUT, no GLUT, I do not like GLUT, I do not want to use GLUT, I want to use GLX!

Sorry for all the misunderstanding, but do you know that GLUT uses GLX?
Then maybe what you want is in the sources of GLUT?
OK, I took a look at my book, and there are some GLX functions, but not glXChooseFBConfig().
Surely, if it returns NULL, there are no framebuffers available!

OK, I sort of figured out that you were saying to look at the GLUT sources, but I've been busy with other stuff and haven't been working on this problem lately. Unfortunately, I'm not very good at reading other people's code, but if I can't find any good tutorials on GLX 1.2, I'll give it a shot.

As for glXChooseFBConfig returning NULL, this is a well-known thing about the NVIDIA drivers. Their header files contain all the functions that are new to GLX 1.3 (of which glXChooseFBConfig is one, which is why it isn't in your book), but they aren't implemented. So the function most likely looks like this:

/* hypothetical stub -- not NVIDIA's actual code, just what it presumably does */
GLXFBConfig *glXChooseFBConfig(Display *dpy, int screen,
                               const int *attribs, int *nitems)
{
    return NULL;
}

So your program will compile, but it won't work, which is rather annoying…
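By the way, here is a minimal sketch (untested, assuming the standard Xlib/GLX headers) that asks the driver at runtime which GLX version it actually implements, instead of trusting the headers:

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int major, minor;

    if (dpy == NULL)
        return 1;

    /* reports the version the loaded libGL implements, not what glx.h declares */
    if (glXQueryVersion(dpy, &major, &minor))
        printf("GLX %d.%d\n", major, minor);

    XCloseDisplay(dpy);
    return 0;
}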

Try looking at Mark Kilgard's OpenGL Programming for the X Window System (aka the Green Book). It's probably what you're looking for. All the examples work with the NVIDIA drivers under Linux.

Sorry if these questions are trivial, but:

  1. Is your app currently working with SDL (it seems you're porting from there…), or GLUT?

  2. I doubt you'll make glXChooseFBConfig work with NVIDIA hardware. This function is dedicated to X servers running the FB driver (for those machines without text mode: PPC/Mac/SPARC/etc.). Maybe you meant glXChooseVisual?

  1. Yes, it is currently working using SDL, but I would like to use GLX instead.

  2. OK, but HOW do I use glXChooseVisual? That is what I want to know. I'm not insisting on using glXChooseFBConfig, but it's the only thing I know how to use. If you can point me to someplace that explains how to use glXChooseVisual, I will use that. The only reason I was trying to use glXChooseFBConfig was because that is what the Blue Book for OpenGL 1.2 explained how to use…

If you’re running Linux on an Intel box, you’re not using the FB interface.

See the SDL code itself, or GUT (http://www.379.com/gut/), or GLUT, or [etc.]. Together with the man pages (bundled with the XFree86 4 dev packages or available at http://www.xfree86.org/4.1.0/manindex3.html), you'll be done quickly. However, it is of course much easier if you have a good knowledge of the X Window architecture.

If you've already used wgl under Windows, the path is much the same:

  • use glXChooseVisual() to fetch your desired visual (think of it as a 'pixelformat'). This function expects a list of minimal properties you need; here is the classical setup:

#include <X11/Xlib.h>
#include <GL/glx.h>

#define GLX_ATTR_MAX 128
#define GLX_ATTR_PUSH1(a)   attr[attr_cnt++] = a;
#define GLX_ATTR_PUSH2(a,b) { attr[attr_cnt++] = a; attr[attr_cnt++] = b; }
#define GLX_ATTR_END        attr[attr_cnt++] = None

XVisualInfo* get_visual (Display* display)
{
    int screen;
    int attr[GLX_ATTR_MAX] = {None};
    int attr_cnt = 0;

    screen = DefaultScreen (display);

    GLX_ATTR_PUSH1 (GLX_RGBA);            /* RGBA mode, not color index */
    GLX_ATTR_PUSH1 (GLX_DOUBLEBUFFER);    /* double-buffered visual */
    GLX_ATTR_PUSH2 (GLX_BLUE_SIZE, 5);    /* minimum bits per channel */
    GLX_ATTR_PUSH2 (GLX_RED_SIZE, 5);
    GLX_ATTR_PUSH2 (GLX_GREEN_SIZE, 5);
    GLX_ATTR_PUSH2 (GLX_DEPTH_SIZE, 16);  /* minimum depth buffer size */
    GLX_ATTR_END;

    return glXChooseVisual (display, screen, attr);
}

  • the macros just make it more readable
  • GLX_RGBA: the indexed-mode vs. RGBA hint is mandatory; I suppose you'll want an RGBA framebuffer
  • GLX_DOUBLEBUFFER: life is boring without this one. Actually, expect a triple buffer in most cases, but it's transparent
  • GLX_{RED,GREEN,BLUE}_SIZE: yes, it looks funny, but it works (at least) this way. It ensures that 15-, 16-, 24-packed- and 32-bit modes all work. Note that you can't change display depth without restarting XFree86, so you'll always get the current desktop bit depth
  • see the man pages for other thingies (stencil, alpha, stereo, etc.); a short sketch with a couple of these follows below
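For instance, a minimal sketch (untested; the minimum sizes here are just illustrative) of requesting alpha and stencil on top of the setup above, reusing the same macros and pushed before GLX_ATTR_END:

/* illustrative additions to the attrib list built in get_visual() above */
GLX_ATTR_PUSH2 (GLX_ALPHA_SIZE, 1);    /* at least 1 bit of destination alpha */
GLX_ATTR_PUSH2 (GLX_STENCIL_SIZE, 8);  /* an 8-bit stencil buffer */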

The XVisualInfo is a struct with a wealth of public info (see <X11/Xutil.h>). You can actually list the available visuals with their GL properties with 'glxinfo -t'. I get 48 items with my GeForce3.
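If you'd rather enumerate the visuals from code than read glxinfo output, a rough sketch (untested; list_visuals is just an illustrative name) using XGetVisualInfo() and glXGetConfig() might look like this:

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <GL/glx.h>

void list_visuals (Display *display)
{
    XVisualInfo tmpl, *info;
    int i, n, use_gl, dbl, depth;

    tmpl.screen = DefaultScreen (display);
    info = XGetVisualInfo (display, VisualScreenMask, &tmpl, &n);
    if (info == NULL)
        return;

    for (i = 0; i < n; i++)
    {
        /* glXGetConfig() returns 0 on success */
        if (glXGetConfig (display, &info[i], GLX_USE_GL, &use_gl) || !use_gl)
            continue;  /* this visual has no GL support */
        glXGetConfig (display, &info[i], GLX_DOUBLEBUFFER, &dbl);
        glXGetConfig (display, &info[i], GLX_DEPTH_SIZE, &depth);
        printf ("visual 0x%lx: doublebuffer=%d depth=%d\n",
                (unsigned long) info[i].visualid, dbl, depth);
    }
    XFree (info);
}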

  • create your rendering window (see XCreateWindow) using the visual in the XVisualInfo.visual field. Hint: do create a colormap (even if you're running a DirectColor mode such as 16/24 bpp). Other hint: don't set a background color, otherwise X will redraw exposed parts with the background color, which you don't need if your rendering viewport covers the whole window.

  • create your GL context with glXCreateContext (using the XVisualInfo you obtained previously)

  • attach the context to the window and make it the current one with glXMakeCurrent()

  • don’t forget to call XMapWindow()!

You're done. Now do GL stuff and call glXSwapBuffers() to [guess what]. Maybe there are other caveats; we'll just wait for you to run into them

PS: yuck, what a pain to write text in this small edit box. Has nobody asked the admin to remove this stupid COLS=45 tag in the HTML code???

Hehe, writing code in these things sucks. Thanks a lot though, I'll work on it tomorrow (it's 12:30 am, so I should probably sleep now). I don't know why I didn't think to check the man pages. Oh well, live and learn…

A really old but probably still useful introduction to OpenGL with Xlib is here: http://toolbox.sgi.com/linux/documents/OpenGL/overviews.html#OglXIn

The GLX ports of NeHe's lessons are also good, and if I remember correctly they use the XFree86 extensions.

Since glXChooseVisual is discussed here: did anyone get it to work with multisample (FSAA)? It used to work in previous NVIDIA drivers, but it seems to have stopped working in the 23.13 driver. Exactly the opposite of the advertised GLX extensions…

I've never used this extension; in my case, playing with the __GL_FSAA_MODE environment variable is simpler. However, AA used to crash Quake 3 here, and I just ran a successful match in Quincunx mode (__GL_FSAA_MODE=2) on my GeForce3 with the newest 23.13 drivers. No noticeable frame rate loss and really smooth edges everywhere

I had a quick look: using glXGetConfig() and querying the GLX_SAMPLE_BUFFERS_ARB and GLX_SAMPLES_ARB attributes, I have 12 out of my 42 available visuals that expose '1 buffer and 2 samples' (the others report 0 and 0).

So I expect that if you add the sequence { GLX_SAMPLE_BUFFERS_ARB, 1, GLX_SAMPLES_ARB, 2 } to your glXChooseVisual() attrib list, you're halfway done. But maybe I'm not saying anything you didn't know…

I didn't go further. When you have a visual supporting AA, do you simply glEnable(GL_MULTISAMPLE_ARB) and expect all primitives to be anti-aliased? (It's really a question, I've never used this extension!) I didn't find documentation on glSampleCoverageARB and glSamplePassARB.
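Putting those two halves together, here is a sketch (untested; the #define values are the tokens from the GLX_ARB_multisample spec, in case your glx.h doesn't have them yet) of the additions to the attrib list in the earlier get_visual():

#ifndef GLX_SAMPLE_BUFFERS_ARB
#define GLX_SAMPLE_BUFFERS_ARB 100000  /* per the GLX_ARB_multisample spec */
#define GLX_SAMPLES_ARB        100001
#endif

/* pushed before GLX_ATTR_END in get_visual() above */
GLX_ATTR_PUSH2 (GLX_SAMPLE_BUFFERS_ARB, 1);  /* request a multisample buffer */
GLX_ATTR_PUSH2 (GLX_SAMPLES_ARB, 2);         /* with at least 2 samples per pixel */

And as far as I can tell from the ARB_multisample spec, GL_MULTISAMPLE_ARB is enabled by default on a multisample context, so glEnable(GL_MULTISAMPLE_ARB) should be all there is to it, if it is needed at all.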

OK, I did everything you suggested, and I get a window to pop up, but nothing gets drawn into it. I know I'm creating a context, and I also know that it is the current one. The weird thing is, the color the window is filled with matches my glClearColor, so some GL commands are drawing to the window, but things like vertex calls aren't. Here is my window initialization code (getVisual is just a copy of what you posted earlier):

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

Display *disp;
Window xWin;
XEvent event;
XVisualInfo *vInfo;
XSetWindowAttributes swa;
GLXContext context;
int swaMask;

disp = XOpenDisplay(0);
if (disp == NULL)
{
    fprintf(stderr, "can't open display");
}

vInfo = getVisual(disp);
if (vInfo == NULL)
{
    fprintf(stderr, "can't get visual");
}

swa.border_pixel = 0;
swa.event_mask = StructureNotifyMask | KeyPressMask;
swa.colormap = XCreateColormap(disp, RootWindow(disp, vInfo->screen),
                               vInfo->visual, AllocNone);
swaMask = CWBorderPixel | CWColormap | CWEventMask;

xWin = XCreateWindow(disp, RootWindow(disp, vInfo->screen), 0, 0, 800, 600,
                     0, vInfo->depth, InputOutput, vInfo->visual, swaMask, &swa);

// XSelectInput(disp, xWin, StructureNotifyMask);

context = glXCreateContext(disp, vInfo, NULL, True);

glXMakeCurrent(disp, xWin, context);

XMapWindow(disp, xWin);

/* wait for the MapNotify before entering the draw loop */
XWindowEvent(disp, xWin, StructureNotifyMask, &event);

while (1)
{
    XCheckWindowEvent(disp, xWin, StructureNotifyMask | KeyPressMask, &event);
    DrawGL();
    glXSwapBuffers(disp, xWin);
}

I know that what's in DrawGL() works, because it drew things when I used SDL. Thanks for any help; this is really bugging me.