VBO colour data

Hi folks,

I think I’m not quite understanding the bits-and-bytes side of storing colour data. I’m trying to store raw data from an image file in a VBO; I’m deliberately avoiding a texture object.

I’m collecting the image data in a std::vector<unsigned char> container. The colour data itself is good: if I print values at random they come back correctly in the 0-255 range. I upload the container with:


// note: "container" here is a pointer to the std::vector<unsigned char> -- the data is collated elsewhere and passed into my "make buffer" function
// ..glGenBuffers() etc here..
glBufferData(GL_ARRAY_BUFFER, container->size() * sizeof(unsigned char), container->data(), GL_STATIC_DRAW);
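
For completeness, the whole "make buffer" function looks roughly like the following sketch (the function and variable names here are illustrative, not my exact code; the usual GL headers are assumed):


#include <vector>

// sketch: upload raw image bytes into a VBO --
// "makeColourBuffer" and "colourVbo" are placeholder names
GLuint makeColourBuffer(const std::vector<unsigned char>* container)
{
    GLuint colourVbo = 0;
    glGenBuffers(1, &colourVbo);
    glBindBuffer(GL_ARRAY_BUFFER, colourVbo);
    glBufferData(GL_ARRAY_BUFFER,
                 container->size() * sizeof(unsigned char),
                 container->data(),
                 GL_STATIC_DRAW);
    return colourVbo;
}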

My vertex attribute pointer in the draw function looks like this:


// id is good
glEnableVertexAttribArray(id);
glVertexAttribPointer(id, 3, GL_UNSIGNED_BYTE, GL_FALSE, sizeof(unsigned char) * 3, 0);

I send the info to the shader using an ivec3:


// in my shader
in ivec3 ColourIn_int;
out vec4 colour;
...

void main()
{
    ...
    colour.x = ColourIn_int.x / 255.0;
    ...etc...
}

The issue is that the colours drawn by my shader are not correct. Dividing by 255.0 doesn’t give the desired result; I have to fiddle around with huge divisors (4+ billion) before I get anything that remotely resembles the source image.

glGetError() isn’t reporting any errors anywhere. Where am I tripping up here? Is there anything else I can provide?

WP.

I’ve just noticed that glVertexAttribPointer should probably be glVertexAttribIPointer. Hadn’t realised there was more than one of them!
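
For reference, the I variant drops the normalized parameter entirely; its signature is:


// same parameters as glVertexAttribPointer, minus "normalized" --
// the data is handed to the shader as integers, not converted to floats
void glVertexAttribIPointer(GLuint index, GLint size, GLenum type,
                            GLsizei stride, const void* pointer);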

Part way there…

WP.

EDIT: this has made a significant difference; I can now see the image straight away, but the colours still aren’t correct.

OK, solved. After many days of not being able to see the forest for the trees, apparently talking to myself here helped me see the light! My apologies.

The issue was twofold: I needed glVertexAttribIPointer, and I needed GL_UNSIGNED_BYTE as the type. I had accidentally left GL_BYTE in place after testing other flags with the original glVertexAttribPointer, which is why the colours were still wrong in my second post above (with GL_BYTE, any channel value of 128 or above is read back as a negative number).

For others in the same learning boat, there are two kinds of attribute pointer function:
glVertexAttribPointer <-- for float data
glVertexAttribIPointer <-- for integer data: chars/ints/longs (my interpretation, correct me if I’m wrong, someone!)
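
Putting the two fixes together, the working draw-function setup looks roughly like this (and in hindsight the 4-billion divisors from my first post presumably came from the shader reading float bit patterns as integers when the types didn’t match):


// working setup: integer pointer variant + unsigned byte type
glEnableVertexAttribArray(id);
glVertexAttribIPointer(id, 3, GL_UNSIGNED_BYTE, sizeof(unsigned char) * 3, 0);
// shader side stays: in ivec3 ColourIn_int;  with the /255.0 divide as before
// (strictly, uvec3 matches GL_UNSIGNED_BYTE, but 0-255 fits in an ivec3 too)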

Maybe of some help to someone else in the future.

WP.

There are actually three types (sketch after the list):

  • glVertexAttribPointer() is for attributes which are accessed via a single-precision floating-point variable (scalar, vector or matrix) in the shader. The source data can be floating point, integers which are normalised (converted to floats in the range 0…1 or -1…1) or integers which are converted directly to floats.

  • glVertexAttribIPointer() (OpenGL 3.0 and later) is for attributes which are accessed via a (signed or unsigned) integer variable in the shader. The source data must be integers.

  • glVertexAttribLPointer() (OpenGL 4.1 and later) is for attributes which are accessed via a double-precision floating-point variable in the shader. The source data must be double-precision floating point.
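
A side-by-side sketch (attribute indices, source types and shader names here are arbitrary examples; stride 0 means tightly packed):


// float path: shader declares  in vec3 aColourF;
// bytes are normalised to 0...1 because normalized == GL_TRUE
glVertexAttribPointer(0, 3, GL_UNSIGNED_BYTE, GL_TRUE, 0, (void*)0);

// integer path (OpenGL 3.0+): shader declares  in ivec3 aColourI;
glVertexAttribIPointer(1, 3, GL_INT, 0, (void*)0);

// double path (OpenGL 4.1+): shader declares  in dvec3 aColourL;
glVertexAttribLPointer(2, 3, GL_DOUBLE, 0, (void*)0);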

Well there you go, there’s three! Thanks GClements!

WP.

Actually, for your data set you can use plain old glVertexAttribPointer with the normalized parameter set to GL_TRUE: the source 0…255 data is then converted to the 0…1 floating-point range when it is accessed, and your shader declares the colour attribute as a plain vec3 (no division by 255.0 needed).

This is effectively how the old fixed-pipeline glColorPointer worked.

The only thing to be aware of is that GPUs really don’t like 3-byte vertex attributes (it’s an alignment issue), and using 3 components may even drop you to software emulation on some platforms. You’d be strongly advised to burn an extra byte per vertex, padding out to RGBA, in exchange for the performance. Further reading: Common Mistakes: Deprecated - OpenGL Wiki
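
For example, a minimal sketch of that approach, assuming the buffer is repacked as 4 bytes per vertex (RGBA):


// 4 unsigned bytes per vertex, normalised to 0...1 floats on access
glEnableVertexAttribArray(id);
glVertexAttribPointer(id, 4, GL_UNSIGNED_BYTE, GL_TRUE,
                      4 * sizeof(unsigned char), 0);
// shader side becomes: in vec4 ColourIn;  -- no /255.0 needed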