View Full Version : glGetUniformIndices always returns invalid index

05-02-2014, 05:34 AM

I'm once again stumbling over the glGetUniformIndices function, which returns 0xFFFFFFFF (GL_INVALID_INDEX) for every index, no matter what name I pass.
My function call looks like this:

glGetUniformIndices(shader, count, const_cast<const GLchar**>(&names[0]), indices);

The names array contains the names in the following form:

LightProperties.light[0].type

one entry for each member of the block. The block itself in GLSL (version 430) looks like this:

struct lprops {
    uint type;
    vec3 color;
    vec3 position;
    vec3 target;
    float spotlightCosCutoff;
    float spotlightExponent;
    float constAttenuation;
    float linearAttenuation;
    float quadAttenuation;
};

uniform LightProperties {
    lprops light[MAX_LIGHTS];
};

The uniform block is definitely used in the shader code, so it can't have been optimized away. To be completely sure, I changed the MAX_LIGHTS value and checked whether the active uniform count changed, which confirmed that this aspect is fine. The shader compiles without any errors, and glGetError reports no error before or after the glGetUniformIndices call.
The LightProperties uniform block is used in two shader stages - vertex and fragment - and the declarations are identical. As far as I know the default layout for uniform blocks is shared, so that shouldn't be a problem either (correct me if I'm wrong).
I'm completely out of ideas. What could be the problem here?
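For completeness, here is a self-contained sketch of how I build the names array (buildUniformNames is just a helper I made up for this post; the member list mirrors the lprops struct above, and the actual GL query is shown only as a comment since it needs a live context):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Build one name string per block member, for every element of the light
// array: "LightProperties.light[0].type", "LightProperties.light[0].color",
// ..., "LightProperties.light[1].type", and so on.
std::vector<std::string> buildUniformNames(int lightCount) {
    static const char* const members[] = {
        "type", "color", "position", "target",
        "spotlightCosCutoff", "spotlightExponent",
        "constAttenuation", "linearAttenuation", "quadAttenuation"
    };
    std::vector<std::string> names;
    for (int i = 0; i < lightCount; ++i)
        for (const char* m : members)
            names.push_back("LightProperties.light[" + std::to_string(i) +
                            "]." + m);
    return names;
}

// Usage with a live GL context would then look roughly like:
//
//   std::vector<std::string> names = buildUniformNames(MAX_LIGHTS);
//   std::vector<const GLchar*> ptrs;
//   for (const std::string& n : names) ptrs.push_back(n.c_str());
//   std::vector<GLuint> indices(ptrs.size());
//   glGetUniformIndices(shader, (GLsizei)ptrs.size(), ptrs.data(),
//                       indices.data());
//   // any index equal to GL_INVALID_INDEX (0xFFFFFFFF) means the driver
//   // does not know an active uniform by that name
```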

Dan Bartlett
05-03-2014, 04:51 AM
Should it be lights.Light[0].type instead of LightProperties.Light[0].type?

05-04-2014, 02:08 AM
I have read that the name should point to the member (type) rather than the variable itself; in any case I tried that variant as well, but with no success.