I am using glReadPixels to gather pixel data. I want to be able to walk the pixel map and test certain pixels against the background color, so that parts of the image show the background through them.
The problem I am having is that if I get the clear color using glGetIntegerv, it returns the value as a signed integer.
But if I use GL_INT with glReadPixels, the performance is too slow; GL_UNSIGNED_BYTE works best. So I need to be able to convert the ints returned by glGetIntegerv to GL_UNSIGNED_BYTE.
Unfortunately the application I am working with is pretty big (the color value is used in a lot of other places that require the integer type), so it is not possible to store the color as an unsigned byte in the first place.
Anyone have any ideas?
[This message has been edited by bumby (edited 06-03-2003).]
glGetIntegerv is used for querying far more than the clear color, and some of those other state values can legitimately be negative. C++ is statically typed, so the return type cannot change at runtime depending on what you query; it has to be a signed integer for everything. Much to learn, have you.
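That said, you don't need to change the stored type, just rescale when you compare. The GL spec's query conversion maps a color component c in [0,1] to roughly c * (2^31 - 1) when returned through glGetIntegerv, so you can scale the stored int back down to the 0..255 range that glReadPixels gives you with GL_UNSIGNED_BYTE. A rough sketch (clear_int_to_ubyte is my own helper name, and the 2^31 - 1 mapping is my reading of the spec's conversion table):

```c
/* Hypothetical helper: convert one clear-color component, as returned by
 * glGetIntegerv(GL_COLOR_CLEAR_VALUE, ...), to the GL_UNSIGNED_BYTE scale
 * used by glReadPixels. Assumes the query maps c in [0,1] to c * (2^31 - 1). */
unsigned char clear_int_to_ubyte(int component)
{
    if (component < 0)   /* components near 0.0 can round to a small negative */
        component = 0;
    /* scale 0..2147483647 down to 0..255, rounding to nearest */
    return (unsigned char)(((long long)component * 255 + 1073741823) / 2147483647);
}
```

Then test each byte from the glReadPixels buffer against clear_int_to_ubyte of the stored color. You may want to allow a tolerance of one, since the two round-trips through the pipeline don't always land on exactly the same byte. (Simpler still, if you can afford one extra query: glGetFloatv(GL_COLOR_CLEAR_VALUE, ...) and multiply by 255 yourself.)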