Binding integer vertex attributes. Question.

Hello.

Everybody knows that OpenGL allows us to bind integer per-vertex attribs and that it will treat them as floats in the vertex shader.

That is very useful for packing: for example, everybody stores per-vertex colors as unsigned bytes, and some people even pack normals into bytes.

There is a parameter in the description of glVertexAttribPointer() called ‘normalized’. It controls how the integer is treated: whether it should be mapped to the real [0…1] interval or left as is.
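
For reference, here is a minimal sketch of the kind of binding I mean (the vertex struct, attribute indices and buffer name are just made up for the example; it assumes a GL 2.0+ context with the usual extension-loader setup):

#include <stddef.h>   /* offsetof */
#include <GL/glew.h>  /* or whatever loader exposes the GL 2.0 entry points */

typedef struct {
    GLfloat position[3];  /* 12 bytes of plain floats */
    GLubyte color[4];     /* packed RGBA, 4 bytes */
} PackedVertex;

void bind_packed_attribs(GLuint vbo)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    /* attribute 0: position as plain floats, no normalization needed */
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE,
                          sizeof(PackedVertex),
                          (const GLvoid *)offsetof(PackedVertex, position));
    glEnableVertexAttribArray(0);

    /* attribute 1: color as unsigned bytes; normalized = GL_TRUE maps
       0..255 to 0.0..1.0 in the shader, GL_FALSE would pass 0.0..255.0 */
    glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE,
                          sizeof(PackedVertex),
                          (const GLvoid *)offsetof(PackedVertex, color));
    glEnableVertexAttribArray(1);
}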

But when I want to bind a short integer as an attribute, performance drops drastically. So it is clear that the NVIDIA driver does this remapping itself on the CPU, not on the GPU.
Okay, I said, let's look into this. And I actually found that GL_UNSIGNED_SHORT with normalization on or off is done in software, GL_SHORT (signed) with normalization on is also done in software, and ONLY GL_SHORT with normalization off is done on the GPU without any slowdown!
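
In other words, something like this rough sketch of what I tried (attribute index and layout are made up; the slow/fast notes are only what I measured on my NVIDIA driver, not a general rule):

static const struct {
    GLenum      type;
    GLboolean   normalized;
    const char *result;           /* what I observed, driver-specific */
} combos[] = {
    { GL_UNSIGNED_SHORT, GL_TRUE,  "software path, slow" },
    { GL_UNSIGNED_SHORT, GL_FALSE, "software path, slow" },
    { GL_SHORT,          GL_TRUE,  "software path, slow" },
    { GL_SHORT,          GL_FALSE, "hardware path, fast" },
};

void bind_short_attrib(int i)
{
    /* same tightly packed 3-component short attribute each time,
       only the type/normalized combination changes */
    glVertexAttribPointer(1, 3, combos[i].type, combos[i].normalized,
                          0, (const GLvoid *)0);
    glEnableVertexAttribArray(1);
}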

So, does anybody have advice: can I believe in a happy future here? I don't think it is that hard to implement, because UNSIGNED_BYTEs are mapped fast.

Thanks in advance!

!! UP !!

Sorry, but I really need to know exactly what types of integer attributes I can use.

This is implementation dependent. Consult the various repositories of information for the various IHVs.

Thanks, Korval!