how to do fast 2d graphics?

thanks for the answers. i found another alternative: use aux buffers to store the background and then glCopyPixels() from the aux to the back buffer. will try these now…

[This message has been edited by petschy (edited 06-10-2003).]
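For reference, the aux-buffer idea mentioned above looks roughly like this in GL. This is only a minimal sketch: it assumes the pixel format actually exposes an AUX buffer (which, as discussed further down the thread, is far from guaranteed) and an orthographic projection where (0,0) is the lower-left corner of the window.

[code]
/* Sketch: cache the background in AUX0 once, then restore it each frame
   with glCopyPixels(). Assumes the context's pixel format has at least
   one AUX buffer -- check with glGetIntegerv(GL_AUX_BUFFERS, ...). */
#include <GL/gl.h>

void cache_background(int w, int h)
{
    glReadBuffer(GL_BACK);   /* source: the back buffer we just drew into */
    glDrawBuffer(GL_AUX0);   /* destination: first aux buffer             */
    glRasterPos2i(0, 0);     /* assumes ortho projection, lower-left = (0,0) */
    glCopyPixels(0, 0, w, h, GL_COLOR);
    glDrawBuffer(GL_BACK);   /* back to normal rendering                  */
}

void restore_background(int w, int h)
{
    glReadBuffer(GL_AUX0);   /* source: the cached background             */
    glDrawBuffer(GL_BACK);
    glRasterPos2i(0, 0);
    glCopyPixels(0, 0, w, h, GL_COLOR);
    glReadBuffer(GL_BACK);
}
[/code]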

Originally posted by petschy:
2. use a texture instead. for a 1024x768 image, a 1024x1024 texture would be fine, but it might / might not be supported on all cards. using smaller textures and splitting up the image is a bit inconvenient, but it should be ok. for arbitrary sized images (non-square, non-power-of-two) though, memory will be wasted this way.

This is most likely going to be your best bet. As for splitting your images, you only need to write a class to do this once; if you do it properly, it should be able to handle any image size.

I don’t know what your system requirements for this product are, but video cards have been able to handle 1024x1024 textures for a long time (I know for sure the ATI Rage 128 copes, and I’m pretty sure the NVidia TNT2 copes, so it’d be up to the Voodoos and the Matroxes…)
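To make the texture route concrete, here is a rough sketch of the 1024x768-image-in-a-1024x1024-texture case. The names and the RGB layout are placeholders, and it assumes an ortho projection in window coordinates (e.g. gluOrtho2D(0, 1024, 0, 768)) and a card that accepts 1024x1024 textures:

[code]
/* Sketch: put a 1024x768 image into a 1024x1024 texture (wasting the top
   256 rows) and draw it as a single textured quad. 'pixels' is assumed to
   point at the 1024x768 RGB image data. */
#include <GL/gl.h>

GLuint background_tex;

void upload_background(const unsigned char *pixels)   /* 1024x768, RGB */
{
    glGenTextures(1, &background_tex);
    glBindTexture(GL_TEXTURE_2D, background_tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* allocate the full power-of-two texture... */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1024, 1024, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, NULL);
    /* ...and fill only the 1024x768 part that is actually used */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1024, 768,
                    GL_RGB, GL_UNSIGNED_BYTE, pixels);
}

void draw_background(void)
{
    const float t_max = 768.0f / 1024.0f;    /* only 3/4 of the texture is used */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, background_tex);
    glBegin(GL_QUADS);                       /* assumes gluOrtho2D(0,1024,0,768) */
        glTexCoord2f(0.0f, 0.0f);  glVertex2i(0,    0);
        glTexCoord2f(1.0f, 0.0f);  glVertex2i(1024, 0);
        glTexCoord2f(1.0f, t_max); glVertex2i(1024, 768);
        glTexCoord2f(0.0f, t_max); glVertex2i(0,    768);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
[/code]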

The ATI Rage Pro copes as well, as do the Intel i810 and the Matrox G200, though you wouldn’t really want to target those.

Voodoos are dead.

The WGL_ARB_buffer_region and KTX_buffer_region extensions are designed for exactly this kind of background save and restore.
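For the curious, WGL_ARB_buffer_region usage looks roughly like the sketch below. This is Windows-only and assumes <GL/wglext.h> provides the usual PFNWGL... typedefs, that the driver actually exports the extension (check the WGL extension string first), and a 1024x768 region as in this thread:

[code]
/* Sketch: save/restore the back-buffer contents via WGL_ARB_buffer_region.
   Assumes the extension has been verified elsewhere. */
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

static PFNWGLCREATEBUFFERREGIONARBPROC  pwglCreateBufferRegionARB;
static PFNWGLSAVEBUFFERREGIONARBPROC    pwglSaveBufferRegionARB;
static PFNWGLRESTOREBUFFERREGIONARBPROC pwglRestoreBufferRegionARB;
static PFNWGLDELETEBUFFERREGIONARBPROC  pwglDeleteBufferRegionARB;
static HANDLE region;

void init_buffer_region(HDC hdc)
{
    pwglCreateBufferRegionARB  = (PFNWGLCREATEBUFFERREGIONARBPROC)
                                 wglGetProcAddress("wglCreateBufferRegionARB");
    pwglSaveBufferRegionARB    = (PFNWGLSAVEBUFFERREGIONARBPROC)
                                 wglGetProcAddress("wglSaveBufferRegionARB");
    pwglRestoreBufferRegionARB = (PFNWGLRESTOREBUFFERREGIONARBPROC)
                                 wglGetProcAddress("wglRestoreBufferRegionARB");
    pwglDeleteBufferRegionARB  = (PFNWGLDELETEBUFFERREGIONARBPROC)
                                 wglGetProcAddress("wglDeleteBufferRegionARB");
    /* a region covering the back color buffer only */
    region = pwglCreateBufferRegionARB(hdc, 0, WGL_BACK_COLOR_BUFFER_BIT_ARB);
}

void save_background(void)     /* call once, after the background is drawn */
{
    pwglSaveBufferRegionARB(region, 0, 0, 1024, 768);
}

void restore_background(void)  /* call at the start of every frame */
{
    pwglRestoreBufferRegionARB(region, 0, 0, 1024, 768, 0, 0);
}
[/code]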

oops. i accidentally edited the original message instead of posting a new one… guess there’s no way of fixing that

Originally posted by petschy:
[b]thanks for the answers. i found another alternative: use aux buffers to store the background and then glCopyPixels() from the aux to the back buffer. will try these now…

[This message has been edited by petschy (edited 06-10-2003).][/b]

just in case anyone is interested:

config: linux, AMD Athlon 1700+, NVIDIA GeForce2. drawing a 1024x768 image in a GLUT window, on a 1600x1200x32 desktop.

glDrawPixels() did 87 fps

GL_QUADS w/ texture mapping did 270 fps

glCopyPixels() from AUX0 to BACK did 263 fps, but the image was a bit crappy and i couldn’t fix that.

i guess that a full-screen 1024x768 version should be a bit faster, but i haven’t yet found how to change the resolution with glut.

so it seems that texture mapping is the way to go (as you suggested). thanks for the replies again.

cheers, p
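On the resolution question above: GLUT can change the display mode through its “game mode” API. A minimal sketch, with the mode string (resolution, depth, refresh rate) as an example only; whether that exact mode exists depends on the driver:

[code]
/* Sketch: switch to full-screen 1024x768 via GLUT game mode instead of a
   regular window. The mode string format is "WxH:bpp@hz". */
#include <GL/glut.h>

int enter_fullscreen_1024x768(void)
{
    glutGameModeString("1024x768:32@60");            /* requested mode       */
    if (!glutGameModeGet(GLUT_GAME_MODE_POSSIBLE))   /* is it available?     */
        return 0;                                    /* fall back to a window */
    glutEnterGameMode();  /* opens a new full-screen window and context, so
                             re-register display/keyboard callbacks after this */
    return 1;
}

/* call glutLeaveGameMode() on exit to restore the desktop mode */
[/code]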

What?? AUX buffers are supported on nVidia cards? Since when?
Can anyone confirm this? Is this linux-specific? Which driver version do you have?

AUX buffers are always optional.

I think not even the FX5800 has AUX buffers. Don’t know about the details of the 5900.

Yes, I realize the minimum number of AUX buffers a GL implementation is required to support is zero, and indeed I was never able to get a pixel format with >0 AUX buffers on any of my GeForces. That’s the cause of my surprise.

However, I don’t think this is tied to any particular HW generation. I believe any HW flexible enough to support RTT in DirectX should be technically able to handle AUX buffers in GL.

This is all sad, as AUX buffers could be useful with WGL render-to-texture (WGL_ARB_render_texture).

Can’t wait for the Süper Büffers…

[This message has been edited by MZ (edited 06-12-2003).]

well, i didn’t check for any aux support, i just did a glDrawPixels() to the back buffer, copied that to AUX0, and then on each frame copied AUX0 back to the back buffer. the edge of the image was massively garbled, but it was there… and sometimes a weird effect popped in (randomly), flipping the image vertically while rendering.
how can i check for aux support?
yes, aux could be supported, since the hw supports 2d bitmaps in video ram (dx surfaces for sure). anyway, this can be ‘emulated’ if needed: just write a manager that packs all the small images into textures and uses the proper (s,t) coordinates when drawing.

cheers, p
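On the “how can i check for aux support” question: the number of AUX buffers in the current context can be queried directly. A minimal sketch:

[code]
/* Sketch: query how many AUX buffers the current context actually has.
   Zero is a perfectly legal (and, on consumer cards, common) answer.
   Requires a valid, current GL context. */
#include <GL/gl.h>
#include <stdio.h>

void report_aux_buffers(void)
{
    GLint aux_count = 0;
    glGetIntegerv(GL_AUX_BUFFERS, &aux_count);
    printf("AUX buffers in this pixel format: %d\n", (int)aux_count);
}
[/code]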