Converting GLint to GLubyte

Hello,

I am using glReadPixels to gather pixel data. I want to be able to go through the pixel map and test certain pixels against the background (clear) color so that I can make parts of the image show the background through them.

The problem I am having is that if I get the clear color using glGetIntegerv, it comes back as signed integers.

But if I use GL_INT in glReadPixels, the performance is too slow; GL_UNSIGNED_BYTE works best. So I need to convert the ints returned by glGetIntegerv to GL_UNSIGNED_BYTE.

Unfortunately the application I am working with is pretty big (meaning the color value is used in a lot of other areas that require the integer type), so it is not possible to store the color as an unsigned byte in the first place.

Anyone have any ideas?
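
To make the setup concrete, here is roughly what the loop looks like (a simplified sketch; the function and variable names are just placeholders):

```c
#include <stdlib.h>
#include <GL/gl.h>

/* Simplified sketch of the readback-and-compare loop.  "width" and
 * "height" stand in for the real window size. */
static void mark_background_pixels(int width, int height)
{
    GLint    clearInt[4];   /* comes back as signed ints from glGetIntegerv */
    GLubyte *pixels = malloc((size_t) width * height * 4);
    int      i;

    glGetIntegerv(GL_COLOR_CLEAR_VALUE, clearInt);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    for (i = 0; i < width * height; ++i) {
        GLubyte *p = &pixels[i * 4];
        /* Goal: treat the pixel as background when p[0..3] matches the
         * clear color -- but clearInt[] holds GLints, not GLubytes,
         * which is exactly the sticking point. */
        (void) p;
    }

    free(pixels);
}
```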


Never mind…

I was making it more complicated than it needed to be. I just converted it myself…
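
In case anyone else hits this, the conversion I ended up with looks roughly like the sketch below. It assumes the implementation follows the spec's rule for returning clamped [0, 1] colors through glGetIntegerv, namely i = ((2^32 - 1) * f - 1) / 2, so the float can be recovered as f = (2i + 1) / (2^32 - 1); the helper names are mine:

```c
#include <GL/gl.h>

/* Undo the Get conversion for one clamped color component:
 *   i = ((2^32 - 1) * f - 1) / 2   =>   f = (2*i + 1) / (2^32 - 1)
 * then rescale the recovered [0, 1] float to a GLubyte. */
static GLubyte clear_int_to_ubyte(GLint i)
{
    double f = (2.0 * (double) i + 1.0) / 4294967295.0;
    if (f < 0.0) f = 0.0;
    if (f > 1.0) f = 1.0;
    return (GLubyte) (f * 255.0 + 0.5);
}

static void get_clear_color_bytes(GLubyte out[4])
{
    GLint clearInt[4];
    int   c;

    glGetIntegerv(GL_COLOR_CLEAR_VALUE, clearInt);
    for (c = 0; c < 4; ++c)
        out[c] = clear_int_to_ubyte(clearInt[c]);
}
```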

But it makes me wonder: why would glGetIntegerv(GL_COLOR_CLEAR_VALUE) return a signed number?

It does not return a single value. It returns an array (or vector, hence the ‘v’) containing the red, green, blue, and alpha clear color values.

ok,

then why would it return an ARRAY of signed values?

glGetIntegerv is used for far more than returning the clear color, and some of the other state it can query is legitimately negative. In C and C++ the type of its output parameter is fixed at compile time (GLint *, since these are not dynamically typed languages), so it always hands back signed integers. Much to learn, have you.
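
To make that concrete (the particular queries below are just illustrations): the same entry point can be asked for state that is legitimately negative, so its output has to stay a plain signed GLint.

```c
#include <GL/gl.h>

/* One entry point, many kinds of state.  Modelview matrix entries, for
 * example, are rounded to the nearest integer and can easily be negative
 * (rotations, translations), so GLint is the only type that fits all. */
void query_examples(void)
{
    GLint clear[4];
    GLint modelview[16];

    glGetIntegerv(GL_COLOR_CLEAR_VALUE, clear);      /* the value in question   */
    glGetIntegerv(GL_MODELVIEW_MATRIX, modelview);   /* entries can be negative */
}
```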

I guess I should have been more clear…

I was merely wondering why there isn't something that returns a value in the same format it was specified in.

I mean, if GL takes floats clamped to [0, 1] for something like the clear color, why isn't there something that hands an unsigned value back…

I know that glGetIntegerv returns a signed int.

What I was trying (and failing) to ask is why there is no glGetUnsignedIntv or something like that.
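
For what it is worth, the closest thing I have found is to query the state in the type it was specified in and convert from there; a minimal sketch (the helper name is made up):

```c
#include <GL/gl.h>

/* glClearColor takes clamped floats, and glGetFloatv hands them back in
 * that same [0, 1] form, so getting GLubytes is just a scale from there. */
static void get_clear_color_as_bytes(GLubyte out[4])
{
    GLfloat clear[4];
    int     c;

    glGetFloatv(GL_COLOR_CLEAR_VALUE, clear);             /* r, g, b, a in [0, 1] */
    for (c = 0; c < 4; ++c)
        out[c] = (GLubyte) (clear[c] * 255.0f + 0.5f);    /* match GL_UNSIGNED_BYTE pixels */
}
```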