Object Selection / glReadPixels in 16 bit format

Alright, I’m using the back-buffer method to select objects (assigning each object a color and using glReadPixels to get the color back, and thus the ID of the object). To get around the problem of “non-exact” colors (e.g., 1.0009 instead of 1.0) I am using unsigned bytes. This code works perfectly when the color depth is set to 32 bit. When I set it to 16 bit, however, I have problems. Here is the code… then I’ll explain:

//Set up scene
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
glDisable(GL_TEXTURE_2D);
glDisable(GL_CULL_FACE);

//“1” is the ID of the button
glColor3ub(1, 0, 0);

//Draw the button
glCallList(buttonList);

//Get the red color, thus finding the ID
GLubyte tempColor;

glReadPixels(X, Y, 1, 1, GL_RED, GL_UNSIGNED_BYTE, &tempColor);

Right now, I’m printing the tempColor to the screen for debugging purposes. In 32 bit depth, it’ll print out the correct values anywhere from 0 to 255 … exactly what it’s supposed to do.

In 16 bit depth, however, it will only print the value correctly if it is a multiple of 8. I.e., if I use glColor3ub(16, 0, 0)… tempColor will be 16. If I use any other numbers, it doesn’t work.

The funny thing is, I think this was working at one point. But I can’t say for sure… it’s been a while since I’ve messed with this part of the code. Is there something special I should be doing for 16 bit depth? Is there a setting somewhere that may be affecting this? I’m at wit’s end here. Any help would be much appreciated.

[This message has been edited by Cebu (edited 04-10-2001).]

Just a thought, but 16-bit color means you get about 5 bits per channel. That amounts to 32 different levels per channel. I am pretty sure the 5-bit field is multiplied by 8 to get back to the 0-255 range, which would give you 0, 8, 16, 24, … you get the idea. And if you are using alpha, you most likely get only 4 bits per channel, which would be 0, 16, 32, …
This is probably what is stored, and why you are not getting back the colors you think you are.

I could be wrong though, it’s late and it’s all I could come up with.

Also there might be dithering issues with 16 bpp. I think that may account for some values working and others not.

Yeah Sheepie… I think you are right. That’s what I figured, but I just wanted to hear it from other people I guess… hehe. 32 colors may or may not be enough for my purposes… hmm, not really sure. It’s easy enough to just use red, green and blue though. If I used combinations of all three, that is what… 32 × 32 × 32? It’s been a while since I’ve had statistics. If I’m right and I can get 32768 possible IDs, then that’s definitely plenty. And this process would be interchangeable between 16 and 32 bit depth. This is really what I’m striving for - letting the user choose the visual settings for what their system can handle.

Yes, it would be 32 × 32 × 32 = 32768.

You might want to do yourself a really big favor and disable lighting and any blending functions, dithering - anything that will slow down the draw. I am going to be implementing a similar feature in a program I am working on currently, and have been giving thought to how to do it in 16-bit mode as well. You might also want to severely limit your glReadPixels call to as small an area around the mouse as possible, as that can be a sloooooow function.
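The state changes being suggested might look like this before the picking pass (a sketch - which calls actually matter depends on what your scene enables, but dithering in particular will corrupt exact ID colors in 16-bit mode):

```c
/* Disable anything that alters the written color or slows the pick pass.
 * Requires a current GL context; adjust to match your scene's state. */
glDisable(GL_LIGHTING);
glDisable(GL_BLEND);
glDisable(GL_DITHER);      /* dithering changes the exact ID colors */
glDisable(GL_FOG);
glDisable(GL_TEXTURE_2D);
```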

Also, you may wish to query for the format when you're in 16-bit, as different display drivers will use different formats (which color gets the extra bit, and so on). I'm not sure if this will make a difference, but it couldn't hurt to check.
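For reference, that query can be done with glGetIntegerv once a context is current. A 5-6-5 framebuffer would report 5/6/5 here, and you could adapt how many bits of the ID go into each channel accordingly:

```c
/* Ask the current framebuffer how many bits each channel really has.
 * Requires a current GL context. */
GLint redBits, greenBits, blueBits;
glGetIntegerv(GL_RED_BITS,   &redBits);
glGetIntegerv(GL_GREEN_BITS, &greenBits);
glGetIntegerv(GL_BLUE_BITS,  &blueBits);
```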

That's not quite true: using 32-bit color depth gives you 8 red bits, 8 green bits, 8 blue bits and 8 alpha bits. Without the alpha bits you have 256 × 256 × 256 possible IDs, which is 16777216.

Viper, you may want to note that we are discussing the number of colors in 16-bit mode.