Wanted: aux buffer stories

I was thinking of using the aux buffer plus CopyTex(Sub)Image as a fallback plan for some image resampling and pseudo-R2T, which would allow me to avoid both the pbuffer cruft and saving/restoring the back buffer contents that would otherwise be destroyed.

It also occurred to me that this would effectively allow “render to (color) texture” while using the default depth/stencil buffers, or subregions thereof. In this respect it’s even more flexible than FBO :stuck_out_tongue:

Unfortunately Tom’s otherwise brilliant GLinfo doesn’t query aux buffer support … so I currently have a solution with a largely unknown level of support.

Specifically, I’m looking for cards/drivers that
a) accelerate contexts with at least one aux buffer (duh!)
b) can properly CopyTexImage2D and CopyTexSubImage2D from the aux buffer to a TEXTURE_RECTANGLE target
c) can properly ReadPixels from the aux buffer

If everything works, take “d) since when” as a bonus task :stuck_out_tongue:

My absolute minimum target hardware is Radeon 7xxx, or Geforce 1, or Intel 865G. Part of the problem is that I sold my Geforce 2MX and my Radeon 7200 long ago, so I won’t be able to check those out.

So, if you have some hands-on experience with aux buffers, I’d appreciate any information you can give.

At the delphi3d.net hw registry, if you look under Implementation specifics -> Framebuffer properties -> Number of auxiliary buffers, you’ll see that it is supported (4 buffers) on GeforceFX and later NVIDIA cards, in the >=6X.XX drivers.
Earlier ones just expose 0.

As I have no working experience with aux buffers, I can’t help you with b) and c) :wink:

With the newer Radeon >= 9500 cards, I think they should handle it (I never had an ATI card, so no personal experience), but the hw registry doesn’t say so. At least, they support the ARB_draw_buffers extension, where you can specify the AUX buffers for gl_FragData[] outputs of the fragment shaders, so this could be a hint that it runs. Maybe GLinfo doesn’t enumerate all contexts?

Originally posted by ScottManDeath:
At the delphi3d.net hw registry, if you look under Implementation specifics -> Framebuffer properties -> Number of auxiliary buffers, you’ll see that it is supported (4 buffers) on GeforceFX and later NVIDIA cards, in the >=6X.XX drivers.
Earlier ones just expose 0.

Yikes! Now I see it, too. No idea what I was looking at earlier … sorry :eek:

With the newer Radeon >= 9500 cards, I think they should handle it (I never had an ATI card, so no personal experience), but the hw registry doesn’t say so. At least, they support the ARB_draw_buffers extension, where you can specify the AUX buffers for gl_FragData[] outputs of the fragment shaders, so this could be a hint that it runs. Maybe GLinfo doesn’t enumerate all contexts?
Not necessarily. It’s perfectly legal for an implementation to not support aux buffers. That’s as true for ARB_draw_buffers as it is for core OpenGL.

Anyway, GLinfo2 displays zero aux buffers. Aux buffer count is a pixel format descriptor member, and ATI’s GL driver doesn’t expose aux buffers unless they are requested – apparently NVIDIA gives you aux buffers all the time. I think GLinfo2 simply doesn’t ask for aux buffers in its pfd, because when I do, on a Radeon 9800Pro with Cat5.1, I get two.

But this is probably going nowhere. I’ve just checked my Radeon 9200 and it doesn’t give me any aux buffers. I can’t imagine a Radeon 7xxx would if a Radeon 9200 doesn’t, and the lower-end stuff is where I’d expect the biggest benefits. I’d happily trade the bandwidth expense of two framebuffer copies for the storage space of one aux buffer.

I’m going to try MRT in the near future, so I extended my pbuffer wrapper to support AUX buffers. I found that nVidia (GF 6800GT, 70.41 “beta”) always seems to report the existence of 4 AUX buffers, whether queried with glGet(GL_AUX_BUFFERS) or with wglGetPixelFormatAttribivARB(). But if you don’t explicitly ask for AUX buffers in your pixel format, glReadBuffer()/glDrawBuffer() on GL_AUXn will fail with INVALID_OPERATION.
This is rather confusing, to say the least :slight_smile:

AFAIK, ATI only exposes AUX buffers via pbuffer. Only on R3xx cores (I think).

Originally posted by NitroGL:
AFAIK, ATI only exposes AUX buffers via pbuffer. Only on R3xx cores (I think).
Hmm? :confused:
I just used aux buffer rendering on a Radeon 9800Pro/Cat5.1 a few hours ago. There were definitely no pbuffers involved. Pseudo-code:

while (kicking)
{
 glDrawBuffer(GL_BACK);
 render_scene();
 glReadBuffer(GL_BACK);
 glCopyTexSubImage2D(GL_TEXTURE_RECTANGLE,...);
 glEnable(GL_TEXTURE_RECTANGLE);
 glDrawBuffer(GL_AUX0);
 draw_quad();        //2x2 box-filter down to aux0
 glReadBuffer(GL_AUX0);
 glReadPixels();     //read downfiltered scene
 glDisable(GL_TEXTURE_RECTANGLE);
 swap();
}

If the driver played any tricks on me and rendered to/read from the back buffer, some subregion would have been clobbered. But it wasn’t, everything was working as expected.

Well, if it works, then that’s awesome. I’ve only used aux buffers for a deferred renderer, never tried them outside of a render-to-texture pbuffer.

I desperately need 4 AUX buffers, but neither of my machines supports AUX buffers at all. My PC has an NVIDIA Quadro4 that reports 0 AUX buffers when I query it. Our target computer has an Intel Brookdale-G, also with 0 AUX buffers. So I created 4 buffers in memory and am doing glReadPixels and glDrawPixels. Works great on the NVIDIA but not on the Intel.

I’m extremely new to OpenGL, so could somebody suggest an efficient, speedy, reliable alternative to AUX buffers? Something better than glReadPixels and glDrawPixels.

Thanks
-bicycleBrain