View Full Version : MATRIX_PALETTE - normals

07-08-2008, 07:46 AM
I'm implementing vertex skinning using GL_MATRIX_PALETTE. It is mostly running, but I'm having trouble with my normals.

If I run my code without GL_MATRIX_PALETTE (and the corresponding palette setup) enabled, my object renders correctly. If I enable GL_MATRIX_PALETTE (my setup code loads an identity matrix into palette 0), the object is drawn in the correct location (i.e. the vertices are OK), but part of it is shown as white instead of the grey it should be.

My guess is that the normals are not being renormalized after the skinning transform computes the new normals. My starting normals are not unit length, but I do enable GL_NORMALIZE.

So (finally, a question): is it true that normals transformed by the vertex skinning are not normalized afterwards?

If so, anyone have a suggestion for a fix?

Thanks so much!


07-09-2008, 03:17 AM
Are you talking about http://www.opengl.org/registry/specs/ARB/matrix_palette.txt?

Because there is no GL_MATRIX_PALETTE. There is only GL_MATRIX_PALETTE_ARB.

This extension is not widely supported. You should use GLSL shaders instead.
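If you go the shader route, a minimal skinning vertex shader might look like the sketch below. The uniform/attribute names and the palette size are assumptions, not anything standard; the key point for your symptom is the explicit normalize(), since the blended matrix is generally not orthonormal and the transformed normal loses unit length.

```glsl
// Old-style GLSL (built-in attributes) to match a fixed-function app.
uniform mat4 bones[32];     // bone palette uploaded by the application (assumed name/size)
attribute vec4 weights;     // per-vertex blend weights (assumed attribute)
attribute vec4 indices;     // per-vertex bone indices (assumed attribute)

varying vec3 vNormal;       // pass the skinned normal on to lighting

void main()
{
    // Blend the bone matrices by the vertex weights.
    mat4 skin = bones[int(indices.x)] * weights.x
              + bones[int(indices.y)] * weights.y
              + bones[int(indices.z)] * weights.z
              + bones[int(indices.w)] * weights.w;

    vec4 pos = skin * gl_Vertex;

    // Renormalize explicitly: this is the step GL_NORMALIZE performs
    // in the fixed pipeline, and skinned normals need it.
    vNormal = normalize((skin * vec4(gl_Normal, 0.0)).xyz);

    gl_Position = gl_ModelViewProjectionMatrix * pos;
}
```

(If your bone matrices contain non-uniform scale, you would want the inverse-transpose of the blended matrix for normals rather than the matrix itself; for rotations and uniform scale, normalizing as above is enough.)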