I’m converting everything over to use shaders and I’m blocked by what appears to be a driver bug.
If I read gl_MultiTexCoord0 in my shader, instead of getting proper polygons, I get garbage random vertices only on the geometry that has textures.
Removing the read of gl_MultiTexCoord0 fixes the random vertices.
I have a very simple shader generator right now. Here are the shaders:
Vertex shader:
attribute mat4 mat;
attribute mat4 matproj;
varying vec3 lightnormal;
varying vec4 vertpos;
varying vec2 textureCoord;
void main()
{
    mat3 normalmatrix = mat3(vec3(mat[0]), vec3(mat[1]), vec3(mat[2]));
    lightnormal = normalmatrix * gl_Normal;
    vertpos = mat * gl_Vertex;
    textureCoord = gl_MultiTexCoord0.st; // gl_MultiTexCoord0 is a vec4, so take .st
    gl_Position = matproj * gl_Vertex;
}
Fragment shader:
varying vec2 textureCoord;
uniform sampler2D textureMap;
varying vec3 lightnormal;
varying vec4 vertpos;
void main()
{
    vec3 lightnorm = normalize(lightnormal);
    gl_FragColor = vec4(textureCoord.s, textureCoord.t, 0.0, 1.0) * (0.5 + 1.0);
}
They’re “debug” shaders: they simply output the texture coordinate as the color.
Is there a reason it would “go crazy” if I access gl_MultiTexCoord0?
I’m using a VAO + a VBO for the vertices + glInterleavedArrays(GL_T2F_N3F_V3F, …) + a VBO element buffer + glVertexAttribPointer for mat and matproj (for instancing).
Windows 7 Ultimate 64-bit, NVIDIA 285 drivers (same behavior on the previous 280 version).
If I instead use glVertexAttribPointer for the texture coordinate (declaring “attribute vec2 tc” in the shader), the garbage triangles stop, but geometry randomly disappears/appears depending on camera position, and I never get proper texture coordinates.
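For what it’s worth, the generic-attribute variant of the vertex shader I tested looks like this (a sketch; it assumes tc is fed from the texcoord portion of the T2F_N3F_V3F buffer and that the array is enabled with glEnableVertexAttribArray):

```glsl
attribute vec2 tc;      // replaces gl_MultiTexCoord0
attribute mat4 mat;
attribute mat4 matproj;
varying vec3 lightnormal;
varying vec4 vertpos;
varying vec2 textureCoord;

void main()
{
    mat3 normalmatrix = mat3(vec3(mat[0]), vec3(mat[1]), vec3(mat[2]));
    lightnormal = normalmatrix * gl_Normal;
    vertpos = mat * gl_Vertex;
    textureCoord = tc;
    gl_Position = matproj * gl_Vertex;
}
```

One thing I’m unsure of: on NVIDIA hardware, generic attribute 0 is documented to alias gl_Vertex, so if tc happens to get bound to location 0 (or mat/matproj collide with the fixed-function arrays), that could plausibly explain the disappearing geometry.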
I am using GL_ARB_debug_output and no errors are reported (I do get lots of verbose info about some buffers residing in system memory / video memory / etc.).
I’ve wasted a ridiculous amount of time on what really looks like a driver bug. I’m experienced enough to resist that conclusion and to keep assuming the problem is actually my fault =).
Any help or insights would be GREATLY appreciated!
Thanks.