I’m working on a 2D 16-bit game. I’ve got the OpenGL context set to 16-bit and everything runs fine, except when I read pixels from the back buffer. When the game runs in a window, the image read from the back buffer with glReadPixels() looks exactly like the back buffer. But when the game runs in fullscreen mode, I get this:
Back buffer (swapped to the front buffer):
[image]
Image read back from the back buffer with glReadPixels():
[image]
If you look closely, you’ll notice that the second image doesn’t look exactly like the first. I’m at a loss as to why. My guess is that it has something to do with the OpenGL context actually being 32-bit when running in a window (since the desktop is 32-bit) but 16-bit in fullscreen mode. Yet the back buffer always looks correct when swapped to the front buffer; it’s only the pixel values read back from it that are off.
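One way to confirm that guess is to query the channel depths of the framebuffer the context actually ended up with. Something like this (the variable names are just for illustration):

    var
      RedBits, GreenBits, BlueBits, AlphaBits: GLint;
    begin
      { 5-6-5-0 in fullscreen versus 8-8-8-8 in a window would confirm
        the 16-bit/32-bit difference }
      glGetIntegerv(GL_RED_BITS, @RedBits);
      glGetIntegerv(GL_GREEN_BITS, @GreenBits);
      glGetIntegerv(GL_BLUE_BITS, @BlueBits);
      glGetIntegerv(GL_ALPHA_BITS, @AlphaBits);
    end;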
I’m reading the back buffer with:
glReadPixels(XRayCoordinateX, (ResolutionHeight - XRayCoordinateY) - 128, 128, 128, GL_RGBA, GL_UNSIGNED_BYTE, @Pixels);
And then I upload the data into a texture with:
glBindTexture(GL_TEXTURE_2D, TextureID[TextureReference[PlayerXRay]]);
glTexImage2D(GL_TEXTURE_2D, 0, 4, TextureDimX[TextureReference[PlayerXRay]], TextureDimY[TextureReference[PlayerXRay]], 0, GL_RGBA, GL_UNSIGNED_BYTE, @Pixels);
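One thing I might try (just an idea, not something I’ve verified): passing the explicit sized internal format GL_RGBA8 instead of the legacy component count 4, so the driver isn’t free to pick a 16-bit internal texture format in the 16-bit context and requantize the data:

    { GL_RGBA8 is still only a request, but it’s more explicit than "4" }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, TextureDimX[TextureReference[PlayerXRay]], TextureDimY[TextureReference[PlayerXRay]], 0, GL_RGBA, GL_UNSIGNED_BYTE, @Pixels);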
I’ve checked all the glPixelTransfer and glPixelStore settings, and they’re all at their default values. Anybody have any idea what’s going on?
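For the record, here’s how I checked those defaults programmatically (red-channel state shown; the other channels are analogous):

    var
      PackAlignment: GLint;
      RedScale, RedBias: GLfloat;
    begin
      glGetIntegerv(GL_PACK_ALIGNMENT, @PackAlignment); { default: 4 }
      glGetFloatv(GL_RED_SCALE, @RedScale);             { default: 1.0 }
      glGetFloatv(GL_RED_BIAS, @RedBias);               { default: 0.0 }
    end;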
EDIT: It’s definitely a 16-bit issue. If I set my desktop to 16-bit, it also looks wrong when running in a window.
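One experiment I haven’t ruled out yet (this is speculation on my part): dithering. GL_DITHER is enabled by default, and in a 16-bit framebuffer the driver dithers on write, so glReadPixels() returns the dithered values; re-uploading those and drawing them again would quantize a second time. Disabling it before rendering the frame that gets read back would isolate that:

    glDisable(GL_DITHER); { on by default; mainly affects low-depth framebuffers }
    { ... render the frame, then do the glReadPixels() call from above ... }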