Degraded image with glReadPixels from back buffer

I’m working on a 2D 16-bit game. I’ve got the OpenGL context set to 16-bit and everything runs fine, except when I read pixels from the back buffer. When the game runs in a window, the image read from the back buffer with glReadPixels() looks exactly like the back buffer. But when the game runs in fullscreen mode, I get this:

Back buffer (swapped to front buffer):

Image made by glReadPixels() from back buffer:

If you look closely, you’ll notice that the second image doesn’t look exactly like the first. I’m at a loss as to why. My guess is that it has something to do with the OpenGL context actually being 32-bit when running in a window (since the desktop is 32-bit) but 16-bit in fullscreen mode. Yet the back buffer always looks correct when swapped to the front buffer; it’s just the pixel values read from the back buffer that are off for some reason.

I’m reading the back buffer with:



// GL's window origin is bottom-left, hence the flipped Y coordinate
glReadPixels(XRayCoordinateX, (ResolutionHeight - XRayCoordinateY) - 128, 128, 128, GL_RGBA, GL_UNSIGNED_BYTE, @Pixels);
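
(Pixels itself isn’t shown here, but for a 128×128 GL_RGBA / GL_UNSIGNED_BYTE read it has to hold at least 128 × 128 × 4 bytes, so presumably something like the sketch below; the declaration is assumed, not taken from the original code. The glReadBuffer call is optional, since GL_BACK is already the default read source on a double-buffered context.)

var
  Pixels: array[0..128 * 128 * 4 - 1] of Byte; // 128 x 128 RGBA pixels, 4 bytes each (64 KB)

glReadBuffer(GL_BACK); // make the source buffer explicit before the read (this is already the default)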


And then binding the data to a texture with this:



glBindTexture(GL_TEXTURE_2D, TextureID[TextureReference[PlayerXRay]]);
// internal format "4" is the legacy "number of components" form, so the driver chooses the actual bit depth
glTexImage2D(GL_TEXTURE_2D, 0, 4, TextureDimX[TextureReference[PlayerXRay]], TextureDimY[TextureReference[PlayerXRay]], 0, GL_RGBA, GL_UNSIGNED_BYTE, @Pixels);


I’ve checked all the settings with glPixelTransfer and glPixelStore, and they’re all at their default values (see the check below). Does anybody have any idea what’s going on?
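
For reference, the pack state that matters for a read like this can be double-checked with something along these lines (a sketch; PackAlignment and PackRowLength are just local names for illustration). At the defaults, GL_PACK_ALIGNMENT is 4 and GL_PACK_ROW_LENGTH is 0, which is harmless here because each 128-pixel RGBA row is 512 bytes and already 4-byte aligned.

var
  PackAlignment, PackRowLength: GLint;

glGetIntegerv(GL_PACK_ALIGNMENT, @PackAlignment);  // default 4; 128 RGBA pixels = 512 bytes per row, so no padding is added
glGetIntegerv(GL_PACK_ROW_LENGTH, @PackRowLength); // default 0 = use the width passed to glReadPixels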

EDIT: It’s definitely a 16-bit issue. If I set my desktop to 16-bit, it also looks wrong when running in a window.

If I force my graphics card to treat all textures as 32-bit, the problem goes away. But I can’t figure out how to make just one texture 32-bit in a 16-bit OpenGL context. When I try to force it with GL_RGBA8, the texture holding the data read from the back buffer just turns white…?

Seems like I just had to set the internal format of the texture to GL_RGBA8 and leave the pixel format and type as GL_RGBA / GL_UNSIGNED_BYTE. It’s certainly working now.
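
In other words, the earlier upload becomes something like this: only the internal-format parameter changes, so the driver keeps an 8-bits-per-channel copy of the texture even though the framebuffer is 16-bit.

glBindTexture(GL_TEXTURE_2D, TextureID[TextureReference[PlayerXRay]]);
// GL_RGBA8 asks for 8 bits per channel of internal storage instead of letting the driver pick
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, TextureDimX[TextureReference[PlayerXRay]], TextureDimY[TextureReference[PlayerXRay]], 0, GL_RGBA, GL_UNSIGNED_BYTE, @Pixels);

(GL_RGBA8 as the internal format together with GL_RGBA / GL_UNSIGNED_BYTE client data is a standard combination, so it should behave the same on any driver.)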

Funny how just asking a question sometimes presents the solution. I’ve been struggling with this for a couple of months now, but only a few hours after asking about it, I find the solution. :smiley:

Glad you figured it out.

By the way, why do you have to use 16-bit mode?

I don’t have to, but I like the dithered aesthetic, and the game is isometric like the good ol’ Fallouts and Syndicate, so it’s already somewhat retro. It also means it’ll run on quite low-end computers, since using 16-bit instead of 32-bit pretty much halves the amount of data transferred to the GPU and stored in video RAM.

I understand the artistic aspect, but on the low-end performance side, don’t expect too much of a difference between 32-bit and 16-bit modes.

If you can run a benchmark, I would be interested to know whether you could measure anything significant.

Well, I got quite a speed improvement when going from 32-bit to 16-bit graphics, so there is a definite difference on low-end computers. I was starting to get slowdowns on my target machine (an EeePC 901 set to power-saving mode) before the switch, but with 16-bit I still have a bit of wiggle room. I don’t have any numbers to show you, but rendering a full frame is definitely faster in 16-bit (i.e. my program has more sleep time while waiting to draw the next frame, since I’m using a fixed framerate).

thanks