Matrix error

Hi,
Why are these two matrices not equal?

////////////////////////////////
// Matrix1 
///////////////////////////////

glMatrixMode(GL_TEXTURE);
glActiveTexture(GL_TEXTURE4);

// Bias matrix (column-major): maps clip-space [-1,1] to texture-space [0,1]
const GLdouble bias[16] = {
	0.5, 0.0, 0.0, 0.0,
	0.0, 0.5, 0.0, 0.0,
	0.0, 0.0, 0.5, 0.0,
	0.5, 0.5, 0.5, 1.0};

glLoadIdentity();           // redundant: glLoadMatrixd overwrites the current matrix
glLoadMatrixd(bias);
glMultMatrixd(projection);  // post-multiply: current = current * projection
glMultMatrixd(modelView);   // so the stack now holds bias * projection * modelView

glGetDoublev(GL_TEXTURE_MATRIX, Matrix1);


////////////////////////////////
// Matrix2 
///////////////////////////////

//Functions

void multMatrix(double *resMat, double *aMatrix)
{
	double *a, *b, res[16];
	a = resMat;
	b = aMatrix;

	// res = a * b, column-major
	for (int i = 0; i < 4; ++i) {
		for (int j = 0; j < 4; ++j) {
			res[j*4 + i] = 0.0f;
			for (int k = 0; k < 4; ++k) {
				res[j*4 + i] += a[k*4 + i] * b[j*4 + k];
			}
		}
	}
	// copy the result back into resMat
	memcpy(a, res, 16 * sizeof(float));
}

...

const GLdouble bias[16] = {
	0.5, 0.0, 0.0, 0.0,
	0.0, 0.5, 0.0, 0.0,
	0.0, 0.0, 0.5, 0.0,
	0.5, 0.5, 0.5, 1.0};

// start from the bias matrix, then apply the same two multiplications
for (int ts = 0; ts < 16; ts++)
	Matrix2[ts] = bias[ts];

multMatrix(Matrix2, projection);
multMatrix(Matrix2, modelView);

And yet Matrix2 != Matrix1.


Why am I asking?

Because if I use the texture matrix, my shader works perfectly!

ShadowCoord = gl_TextureMatrix[4] * gl_Vertex; // This is OK.

But if I use the uniform variable instead, it does not work.

ShadowCoord = Matrix2 * gl_Vertex; // Bad shadow.
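
For reference, the upload looks roughly like this (a sketch; "shaderProgram" stands in for my actual program handle, and I convert to float because glUniformMatrix4fv expects GLfloat):

GLfloat m[16];
for (int i = 0; i < 16; ++i)
	m[i] = (GLfloat)Matrix2[i];       // glUniformMatrix4fv takes floats

GLint loc = glGetUniformLocation(shaderProgram, "Matrix2");
glUniformMatrix4fv(loc, 1, GL_FALSE, m);  // GL_FALSE: data is already column-major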

Any suggestions?

Thank you, best regards.

I would suggest using a proper math library instead of home-growing your own; see the GLM sketch below.

In any case, you cannot do matrix multiplication in place like that. You are overwriting values that you will be using later.
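
For instance, with GLM (which stores matrices column-major, just like OpenGL) the whole composition collapses to a few lines. A sketch, assuming the same double[16] arrays as in your post:

#include <cstring>
#include <glm/glm.hpp>
#include <glm/gtc/type_ptr.hpp>

// bias * projection * modelView, same order as the glMultMatrixd calls
void composeShadowMatrix(const double *bias, const double *projection,
                         const double *modelView, double *out)
{
	glm::dmat4 result = glm::make_mat4(bias)
	                  * glm::make_mat4(projection)
	                  * glm::make_mat4(modelView);
	std::memcpy(out, glm::value_ptr(result), 16 * sizeof(double));
}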

You copy only half of your result matrix “res” back to the parameter “resMat”: with sizeof(double) = 8 and sizeof(float) = 4, "16 * sizeof(float)" is only 64 bytes, half of a 16-double matrix.
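
For reference, a fixed version (a sketch; the only substantive change is the copy size, plus a double literal for tidiness):

#include <cstring>

void multMatrix(double *resMat, const double *aMatrix)
{
	double res[16]; // temporary, so resMat can safely appear on both sides

	for (int i = 0; i < 4; ++i) {
		for (int j = 0; j < 4; ++j) {
			res[j*4 + i] = 0.0;
			for (int k = 0; k < 4; ++k) {
				res[j*4 + i] += resMat[k*4 + i] * aMatrix[j*4 + k];
			}
		}
	}
	memcpy(resMat, res, 16 * sizeof(double)); // all 128 bytes, not 64
}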

Yes, mbentrup, that was it: copying 16 * sizeof(double) bytes instead solved it. Thank you!!!
Best regards!