
Thread: How to compute the matrix used by gluLookAt

  1. #1 Intern Contributor (joined Jul 2003, 92 posts)

    How to compute the matrix used by gluLookAt

    Hi,

    I am trying to compute the matrix generated by gluLookAt and use it to render my scene.

    I get good results with gluLookAt, but not with the matrix I compute.

    To compute the matrix I use this code:

    Code :
    GLGXMATRIX* GLGXMatrixLookAt( GLGXMATRIX* pResult,GLGXVECTOR3 *pPosition,GLGXVECTOR3 *pTarget,GLGXVECTOR3 *pUp)
    {
    	GLGXVECTOR3 forward = *pTarget - *pPosition;
    	GLGXVec3Normalize(&forward,&forward);
     
    	GLGXVECTOR3  up;
    	GLGXVec3Normalize(&up,pUp);
     
    	GLGXVECTOR3 side;
    	GLGXVec3Cross(&side, &up, &forward);
     
    	pResult->_11 = side.x;
    	pResult->_12 = side.y;
    	pResult->_13 = side.z;
    	pResult->_14 = 0.0f;
     
    	pResult->_21 = up.x;
    	pResult->_22 = up.y;
    	pResult->_23 = up.z;
    	pResult->_24 = 0.0f;
     
    	pResult->_31 = forward.x;
    	pResult->_32 = forward.y;
    	pResult->_33 = forward.z;
    	pResult->_34 = 0.0f;
     
    	pResult->_41 = -pPosition->x;
    	pResult->_42 = -pPosition->y;
    	pResult->_43 = -pPosition->z;
    	pResult->_44 = 1.0f;
     
    	// where matrix._11, _12, _13, _14, _21, ... correspond to matrix[0], [1], [2], [3], [4], ...
     
    	return pResult;
    }
    To render my scene I use:

    Code :
    glMatrixMode( GL_MODELVIEW );
    glLoadIdentity();
     
    /*
    	gluLookAt(this->Scene->Cameras[0].Loc.x,this->Scene->Cameras[0].Loc.y,this->Scene->Cameras[0].Loc.z,	
      	         this->Scene->Cameras[0].Eye.x,this->Scene->Cameras[0].Eye.y,this->Scene->Cameras[0].Eye.z,	
    	         this->Scene->Cameras[0].Up.x, this->Scene->Cameras[0].Up.y, this->Scene->Cameras[0].Up.z);
    */
     
    GLGXMATRIX matrix; // Is Identity
    GLGXMatrixLookAt(&matrix,&this->Scene->Cameras[0].Loc,&this->Scene->Cameras[0].Eye,&this->Scene->Cameras[0].Up);
     
    glPushMatrix(); // Not used with gluLookAt   
     
    		glMultMatrixf(matrix);  // Not used with gluLookAt  
     
    		this->Scene->Lights[0].Set();
     
    		glColor3ub(255,0,255);
    		this->Scene->Render();
     
    glPopMatrix(); // Not used with gluLookAt
    The camera seems to be upside down. I can't find what is wrong. I have tried reversing the vectors, the cross products, ...
    Is the problem in the rendering?

  2. #2 Junior Member Regular Contributor (San Diego, CA, USA; joined Nov 2004, 122 posts)

    Re: How to compute the matrix used by gluLookAt

    The man page for gluLookAt tells you exactly how the matrix is computed; why don't you just compare your result with that?
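    For reference, the man page builds the matrix roughly like this (a paraphrase, with eye, center and up being the arguments passed to gluLookAt):

    Code :
    f = normalize(center - eye)
    s = normalize(cross(f, normalize(up)))
    u = cross(s, f)
     
    M = (  s.x   s.y   s.z  0 )
        (  u.x   u.y   u.z  0 )
        ( -f.x  -f.y  -f.z  0 )
        (   0     0     0   1 )
    gluLookAt(eye, center, up) is then equivalent to glMultMatrixf(M) followed by glTranslated(-eye.x, -eye.y, -eye.z).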

  3. #3 Intern Contributor (joined Jul 2003, 92 posts)

    Re: How to compute the matrix used by gluLookAt

    I don't know where to find this man page. Is it under Linux? Is it on a web page?

    I have tried to use the information from this page:

    http://publib.boulder.ibm.com/infocenter...f/gluLookAt.htm

    They recompute the up vector. I don't know why, but I have tried to do what they describe, and I get the same result.

  4. #4 Junior Member Regular Contributor (Berlin, Germany; joined Jul 2005, 188 posts)

    Re: How to compute the matrix used by gluLookAt

    The up vector is recomputed to guarantee an orthonormal coordinate system. The up vector provided to gluLookAt isn't necessarily orthogonal to the view vector, but the one used to build the lookat matrix has to be, so recomputing the up vector is mandatory.

    Here's the code (LsgMatrix_setLookAt) I use to compute a lookat matrix:

    Code :
    e: eye position
    c: lookat center
    u: up vector
     
    Z = normalize(e - c)
    X = normalize(u x Z)
    Y = Z x X
     
    M1 is a matrix whose rows are (X, 0), (Y, 0), (Z, 0), (0, 0, 0, 1)
    M2 is a matrix whose columns are (1,0,0,0), (0,1,0,0), (0,0,1,0), (-e.x, -e.y, -e.z, 1)
     
    The lookat matrix is the product of M1 and M2
     
    L = M1 * M2
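    A minimal sketch of the same construction in plain C, for comparison (this is not the original poster's GLGX code nor LsgMatrix_setLookAt; the vec3 type and helper functions are made up for the example, and the result is a column-major float[16] suitable for glMultMatrixf):

    Code :
    #include <math.h>
     
    typedef struct { float x, y, z; } vec3;
     
    static vec3  v3_sub(vec3 a, vec3 b)   { vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
    static vec3  v3_cross(vec3 a, vec3 b) { vec3 r = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; return r; }
    static float v3_dot(vec3 a, vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static vec3  v3_norm(vec3 a)          { float l = sqrtf(v3_dot(a, a)); vec3 r = { a.x/l, a.y/l, a.z/l }; return r; }
     
    /* Column-major, OpenGL style: m[column*4 + row]. */
    void lookAt(float m[16], vec3 eye, vec3 center, vec3 up)
    {
        vec3 f = v3_norm(v3_sub(center, eye));   /* forward (maps to -Z in view space) */
        vec3 s = v3_norm(v3_cross(f, up));       /* side = forward x up                */
        vec3 u = v3_cross(s, f);                 /* recomputed, orthogonal up          */
     
        /* Rotation part: the ROWS of the matrix are s, u, -f. */
        m[0] =  s.x;  m[4] =  s.y;  m[8]  =  s.z;
        m[1] =  u.x;  m[5] =  u.y;  m[9]  =  u.z;
        m[2] = -f.x;  m[6] = -f.y;  m[10] = -f.z;
        m[3] = 0.0f;  m[7] = 0.0f;  m[11] = 0.0f;
     
        /* Translation part: equivalent to multiplying by a translation of -eye
           on the right, i.e. the last column is -eye rotated into view space. */
        m[12] = -v3_dot(s, eye);
        m[13] = -v3_dot(u, eye);
        m[14] =  v3_dot(f, eye);
        m[15] = 1.0f;
    }
    The result can then be loaded with glLoadMatrixf(m) or combined with glMultMatrixf(m), as in the first post.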
    355/113 -- Not the famous irrational number PI, but an incredible simulation!

  5. #5 Intern Contributor (joined Jul 2003, 92 posts)

    Re: How to compute the matrix used by gluLookAt

    Thanks.
