View Full Version : Using attributes as indices into a matrix array

08-04-2004, 09:56 AM
Hi there

Certainly this is a simple question, but I don't know how to proceed. All the documents I found were so vague about attributes and arrays that I am not very happy.

Actually, I am trying to do matrix skinning. I have my array of matrices, and now I need an index to select one of them. I tried glVertexAttrib1s, which takes shorts, and of course I used a short in my shader. But then I got a "'short' is a reserved word" error. So I used an int in my shader, but got an "int and bool are not allowed for attributes" error. So, desperately, I used a float and glVertexAttrib1f, but of course I got a "floats are not allowed for indexing" error.
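For reference, the three declarations in question look roughly like this (boneIndex and boneMatrices are made-up names for illustration):

```glsl
attribute short boneIndex;  // rejected: 'short' is a reserved word in GLSL
attribute int   boneIndex;  // rejected: int and bool are not allowed as attributes
attribute float boneIndex;  // accepted, but then...

uniform mat4 boneMatrices[32];
// ... boneMatrices[boneIndex] ...  // rejected: floats may not index an array
```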

Very nice. I have to use shorts, because glVertexAttrib1s is the only function that can send an integer type to my card, but ints and shorts are not allowed as attributes in a shader.

Is it really that stupid, or did I miss something (hopefully)?

BTW: I use Catalyst 4.7 on a Radeon 9600XT.


08-04-2004, 10:10 AM
You can always cast the float to an int in a temporary variable. This will likely compile to nothing, as most cards don't really support ints directly.
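A minimal sketch of how that looks in a GLSL 1.x vertex shader (the names boneIndex and boneMatrices, and the palette size of 32, are assumptions for illustration):

```glsl
// The bone index arrives as a float attribute, set from the app
// with glVertexAttrib1f or a GL_FLOAT vertex attribute array.
attribute float boneIndex;

// Palette of skinning matrices; the size is an arbitrary example.
uniform mat4 boneMatrices[32];

void main()
{
    // Cast the float into an int temporary; only integer
    // expressions may be used to index the matrix array.
    int idx = int(boneIndex);

    gl_Position = gl_ModelViewProjectionMatrix
                * (boneMatrices[idx] * gl_Vertex);
}
```

On the application side, the index would then be supplied as a float, e.g. glVertexAttrib1f(loc, (float)boneIndex), rather than through glVertexAttrib1s.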

08-04-2004, 11:01 AM
Cool, it works!

Thanks man.

Oh, and sorry, I thought I had posted this in the shading language forum, where it belongs.