Hi/Lo part of unsigned int

Hi,
I see there’s no support for unsigned short vectors in GLSL… so I am packing two of them into a single unsigned int. But how do I get the high and low parts of it once in the shader? Any hint?
Thanks
a.

Might be a silly question but why do you need to use unsigned shorts here? Couldn’t you just use 2 unsigned ints instead? You seem to be making things difficult for yourself, not to mention slowing down your shader by adding more instructions.

Bitwise operators could do it, but I understand that they’re a shader model 4 feature. I’d just use unsigned ints all the way.
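For what it’s worth, on SM4-class hardware (GLSL 1.30 or later) the unpacking itself is just a shift and a mask. A minimal vertex shader sketch, assuming the packed value is fed as an integer attribute (e.g. via glVertexAttribIPointer) and named xz as in the original post:

```glsl
#version 130

// Hypothetical packed input: high 16 bits = X, low 16 bits = Z.
// Assumes the attribute is supplied as an integer (glVertexAttribIPointer).
in uint xz;

void main()
{
    uint x = xz >> 16;       // high 16 bits
    uint z = xz & 0xFFFFu;   // low 16 bits

    // From here on, use the unpacked coordinates as floats.
    gl_Position = vec4(float(x), 0.0, float(z), 1.0);
}
```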

I understand that they’re a shader model 4 feature.

So is using unsigned ints in a shader at all.

I am making a VBO to hold a HUGE number of vertices: 4 million, but maybe even more than that.
Each vertex only needs an X and Z coord, and those values range from 0 to 65535, so there’s no need for a full integer or float. I can pack both coords into a single unsigned 32-bit int. Let’s call it “XZ”.

With 4 million vertices to store, the difference between 4 and 8 bytes per vertex is substantial.
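For completeness, the CPU-side packing described above is just a shift and an OR. A minimal sketch in plain C (the function name is made up):

```c
#include <stdint.h>

/* Pack two 16-bit coordinates (0..65535) into one 32-bit value:
   X in the high 16 bits, Z in the low 16 bits. */
static uint32_t pack_xz(uint16_t x, uint16_t z)
{
    return ((uint32_t)x << 16) | (uint32_t)z;
}
```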

Each vertex only needs an X and Z coord, and those values range from 0 to 65535, so there’s no need for a full integer or float. I can pack both coords into a single unsigned 32-bit int. Let’s call it “XZ”.

Then just use glVertexAttribPointer. Give it a 2D attribute (2 values), with each value being an unsigned short.
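Something along these lines, assuming the VBO is already created and bound, and posAttrib holds the location of a vec2 input in the shader (both names are placeholders):

```c
/* Two GLushort values per vertex, tightly packed.
   The shader sees them as a vec2; with normalized = GL_FALSE the
   values arrive as floats in the range 0.0 .. 65535.0. */
glBindBuffer(GL_ARRAY_BUFFER, vbo);            /* vbo: your VBO handle (assumed)  */
glVertexAttribPointer(posAttrib,               /* attribute location (assumed)    */
                      2,                       /* X and Z                         */
                      GL_UNSIGNED_SHORT,
                      GL_FALSE,                /* don't normalize to 0..1         */
                      2 * sizeof(GLushort),    /* stride: one vertex = 4 bytes    */
                      (const void *)0);        /* offset into the VBO             */
glEnableVertexAttribArray(posAttrib);
```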

I was already doing that but guess what? I was so sure that the corresponding IN attribute in the shader had to be exactly the same type that I made all this mess just because an unsigned short type doesn’t exist in GLSL. It simply didn’t occur to me that I could feed it a vertex attrib array of GL_UNSIGNED_SHORT and have it automatically converted to a vec2 :\
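So shader-side the attribute can simply be declared as a plain vec2. A minimal sketch (the attribute name is made up):

```glsl
#version 130

// The two GL_UNSIGNED_SHORT components arrive already converted to floats.
in vec2 xz;

void main()
{
    gl_Position = vec4(xz.x, 0.0, xz.y, 1.0);
}
```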

Thanks :)
