hey folks,
I’ve got a problem getting the array size of a GLSL attribute using glGetActiveAttrib.
#version 410
// Vertex shader

in vec4 Position;
in vec4 DummyStuff[5];
in vec4 vSuperDummy[3];
in vec2 vOther[2];
in float fParam0[2];

out vec4 vDummyStuff1;
out vec4 vDummyStuff2;

void main()
{
    vec3 vSomeStuff = vec3( vOther[ 0 ].xy, 1.0 ) + vec3( vOther[ 1 ].xy, 2.0 );
    gl_Position  = vec4( Position.xyz + vSomeStuff, 1.0 );
    vDummyStuff1 = DummyStuff[0];
    vDummyStuff2 = DummyStuff[1];
}
When I query the size of any of those array attributes, glGetActiveAttrib gives me names like “DummyStuff[0]”, “DummyStuff[1]”, etc., and the size parameter always returns 1.
I’m not saying the result is wrong. Strictly speaking it’s correct, since the size is given in units of the GLenum type the function returns… but it’s kind of weird.
Is there any way to find out that DummyStuff has 5 elements, or do I have to parse the names and accumulate the entries to work out the “real” array size? (Or am I doing something wrong, and somebody can show me a counterexample I didn’t think of?)
I’m running an ATI Radeon HD 5870 on Windows 7.
Thanks for any hints
Arrays know the number of elements they contain. This can be obtained with the length() method:
DummyStuff.length(); // returns 5 for the declaration above
The length() method cannot be called on an array that has not been explicitly sized.
You can also write loops using .length(), so your shader code adapts to the actual length of the arrays.
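For example, a minimal sketch in the same style as the shader above:

```glsl
#version 410
in vec4 DummyStuff[5];

void main()
{
    vec4 sum = vec4( 0.0 );
    // .length() of an explicitly sized array is a compile-time constant (5 here),
    // so the loop follows the declaration if you change the array size later
    for ( int i = 0; i < DummyStuff.length(); ++i )
        sum += DummyStuff[ i ];
    gl_Position = sum;
}
```

Note this only helps inside the shader; it doesn’t change what glGetActiveAttrib reports on the application side.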
OK, I tested it on a GeForce GTX 480, and NVIDIA doesn’t make life much easier.
The output for the first index gives me the correct size; everything after that is 1:
DummyStuff[0] size is 5
DummyStuff[1] size is 1
… size is 1
Maybe a Catalyst bug? At least NVIDIA reports the full size for the “DummyStuff[0]” entry.