ATI Radeon and GLX 1.3?

Is it possible to support GLX 1.3 with such cards? It appears not, but then I'm not experienced with ATI cards.

However, as stated in the GLX spec, only GLX 1.3 provides GL 1.2 compliance, so…

No pbuffer possibility? (However, it seems ATI provides drivers and demos, one of which is a pbuffer test?!)
How does one support GLX pbuffers correctly?

In fact, I was astonished that my code didn't compile on such cards, even with the ATI drivers (apparently).
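
To show the kind of thing that fails: the pbuffer path in my code is guarded roughly like this (a simplified sketch, not my real file), and on the ATI machine the GLX 1.3 declarations just don't seem to be there at compile time:

#include <GL/glx.h>

#ifdef GLX_VERSION_1_3
/* GLX 1.3 path: FBConfigs and pbuffers are declared in the headers */
GLXPbuffer make_pbuffer(Display *dpy, GLXFBConfig cfg, int w, int h)
{
  int attribs[]={ GLX_PBUFFER_WIDTH, w, GLX_PBUFFER_HEIGHT, h, None };
  return glXCreatePbuffer(dpy, cfg, attribs);
}
#else
#error "these headers only declare GLX 1.2 or older, so no pbuffers"
#endif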

Thanks for any information about this.

Hello!

Yeah, ATI have a readme file regarding their support for pixel buffers. Apparently SGI didn't release the specification for GLX 1.3, and so ATI can't "officially" support pixelbuffers. I don't know how nVidia manage it, though.

Anyway. Although ATI's GLX doesn't advertise pixelbuffers in its extension string, it still exposes the pixelbuffer function hooks. It's magic, surely. I have found, however, that their support has issues. Seriously, it's quite strange. I've had to add a test in my code to see if I'm using an ATI card and do some things differently if I am (there's a rough sketch of that test, and of the attribute list, after this list). For example:

  • AGLX expects the token None to terminate the GLX pbuffer attribute list; GLX_NONE doesn't work. Some code I've seen, and mine initially, used GLX_NONE. I can't remember exactly what happens… I think AGLX can't find any compatible pixel formats if you use it. Both None and GLX_NONE work on nGLX.

  • I can't remember the story here, either, but here is a code fragment for creating a new context (compliant == true if using a non-Radeon card):

  if(compliant) {
    context=glXCreateNewContext(display, config, GLX_RGBA_TYPE, share, true);
    assert(context);
  } else {
    /* bugged Radeon drivers =( */
    XVisualInfo *visinfo=glXGetVisualFromFBConfig(display, config);
    assert(visinfo);
    context=glXCreateContext(display, visinfo, share, GL_TRUE);
    assert(context);
    XFree(visinfo);
  }
  • glXMakeCurrent’s return code is crazy. It seems to return false even if the change is made. Code fragment #2:
  bool ok=glXMakeCurrent(display, pbuffer, context);
  assert((compliant && ok) || !compliant);
  • querying pixel buffer dimensions just doesn’t work. It returns nonsense values. I submit code fragment #3:

#ifndef VIEWPORT_SIZE_HACK
Display *currdisplay=glXGetCurrentDisplay();
GLXDrawable currdrawable=glXGetCurrentDrawable();

glXQueryDrawable(currdisplay, currdrawable, GLX_WIDTH, &w);
glXQueryDrawable(currdisplay, currdrawable, GLX_HEIGHT, &h);
#else
/* … glXQueryDrawable returns nonsense on the buggy drivers, so we read back
   the current viewport instead (assuming the viewport covers the whole pbuffer) */
GLint viewport[4];
glGetIntegerv(GL_VIEWPORT, viewport);
w=viewport[2];
h=viewport[3];
#endif
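
For what it's worth, here's a rough sketch of the vendor test and the None-terminated attribute list mentioned above. It isn't my actual class, and matching the substring "ATI" in the client vendor string is an assumption you should check against what your driver actually reports:

#include <string.h>
#include <GL/glx.h>

/* sketch only: decide whether we're running on the ATI GLX */
static int using_ati(Display *display)
{
  const char *vendor=glXGetClientString(display, GLX_VENDOR);
  return vendor && strstr(vendor, "ATI")!=NULL;   /* substring match is an assumption */
}

/* pick an FBConfig for a pbuffer; the list ends with None, not GLX_NONE,
   so AGLX can still find compatible configs */
static GLXFBConfig pick_pbuffer_config(Display *display)
{
  int attribs[]={
    GLX_RENDER_TYPE,   GLX_RGBA_BIT,
    GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,
    GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
    None
  };
  int count=0;
  GLXFBConfig *configs=glXChooseFBConfig(display, DefaultScreen(display),
                                         attribs, &count);
  GLXFBConfig config=(configs && count>0) ? configs[0] : 0;
  if(configs) XFree(configs);
  return config;
}

The result of using_ati() is what ends up driving the compliant flag in the fragments above (compliant = !using_ati(display)).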

Now, I say that ATI, rather than nVidia, have it wrong because I wrote my pixelbuffer class against AIX's GLX specification. The code also works on my nVidia card; it only has issues (and hence needs the checks to work around them) when I run it on my ATI card at Uni.

So, the short story is: yes, ATI support pixelbuffers even though they aren't advertised, but be aware that the support has quirks.

enjoy!

cheers,
John

Well, fine!!

However, you seem to get a GLXFBConfig on ATI cards? My code crashes on those cards, first of all because GLXFBConfig wasn't supported. In fact, it seems to be declared, but there's nothing in it.

So you need to use glXMakeCurrent rather than glXMakeContextCurrent?
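
For reference, here's roughly the check I think I'll add before trusting the FBConfig path at all (just a sketch; attribs would be a None-terminated list like in your sketch, and the fallback path is my assumption):

int major=0, minor=0;
int have_glx13=0;
if(glXQueryVersion(display, &major, &minor))
  have_glx13=(major>1) || (major==1 && minor>=3);

int count=0;
GLXFBConfig *configs=NULL;
if(have_glx13)
  configs=glXChooseFBConfig(display, DefaultScreen(display), attribs, &count);

if(!configs || count==0) {
  /* GLXFBConfig is declared but unusable here: fall back to the old
     glXChooseVisual / glXCreateContext / glXMakeCurrent path */
}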

Thanks again!!

best regards,

JiDe.

Originally posted by john:
[b]Hello!

Yeah, ATI have a readme file regarding their support for pixel buffers. Apparently SGI didn't release the specification for GLX 1.3, and so ATI can't "officially" support pixelbuffers. I don't know how nVidia manage it, though.
…[/b]

Well, I have the GLX spec (it's about a year old, even), which could be downloaded from these pages.
Only GLX 1.3 is said to provide GL 1.2 compliance.

On nVidia cards, this works as the official paper spec describes, even if the spec doesn't seem to be freely available.
