Starman – it sounds like you’ve already sorted out your camera function, but I thought I’d tell you about this method so you can keep it in mind for future projects.
I can see what effect you’re after and why you’re analysing the modelview matrix to feed into gluLookAt. I agree that in this context it IS a good idea. (My discussion earlier was based on the assumption that you were moving the camera with gluLookAt + translation + rotation and then trying to reverse-engineer the movements, rather than precomputing them and passing them through to gluLookAt.)
Anyway, the idea I wanted to tell you about was… arguably a “more elegant” way of getting back appropriate values. This is the way I’d do it if I was setting up the system, anyway.
Suppose you set up your geometry like this:
glLoadIdentity();
renderSun();                                   // sun at the origin
glRotatef(earthorbitrotation, 0.0, 1.0, 0.0);  // earth's orbit about the sun
glTranslatef(earthorbitdistance, 0.0, 0.0);
renderEarth();
glRotatef(moonorbitrotation, 0.0, 1.0, 0.0);   // moon's orbit about the earth
glTranslatef(moonorbitdistance, 0.0, 0.0);     // (stacks on top of the earth transform)
renderMoon();
where renderSun() et al. all draw an insanely big sphere around the origin of the local coordinate system. You can see how the transformations stack so that the moon orbits the earth and the earth orbits the sun.
Now, if I wanted to set up the camera system to be at the moon looking at the sun, then one way (which you’ve sorted out) is to grab the OpenGL transformation matrix and extract the world… uh, universe position of the moon from the fourth column. But what you’re REALLY doing there is working out how the origin of the moon’s local coordinate system is mapped by the matrix stack, right? So, what you’re effectively computing is
[ m11 m12 m13 m14 ]   [ 0 ]
[ m21 m22 m23 m24 ] * [ 0 ]
[ m31 m32 m33 m34 ]   [ 0 ]
[ m41 m42 m43 m44 ]   [ 1 ]
where mij is the modelview matrix at the point where we’re rendering the moon. The product turns out to be exactly the fourth column, since the contributions of the first three columns are obliterated by the multiplications by 0. The reason I’m pointing THIS out is that you can then think about DIFFERENT transformations. Suppose you want the view of a person standing at so many degrees latitude and so many degrees longitude on the moon’s surface, looking at the sun… what do you do then? You can’t simply rip the fourth column out of the matrix, because we’re no longer talking about the origin of the moon.
The answer is that we can compute a new point in the moon’s coordinate system, multiply it by the matrix, and plug the answer into gluLookAt. I can’t remember which way longitude/latitude work out, but suppose the first is a rotation about y and the second a rotation about x (for argument’s sake); then you could do this:
pos=M * Ry * Rx * T * [ 0 0 0 1]'
where M is the moon modelview matrix from above, Ry is the rotate-about-y-by-longitude matrix, Rx is the rotate-about-x-by-latitude matrix, T is the translation matrix for the vector <0, 0, sealevel> (to shift the observer to some point at the moon’s sea level, if it had a sea), and [0 0 0 1]' is the origin. Actually, we could just axe the T and replace the 0, 0, 0, 1 by 0, 0, sealevel, 1… but I’m confident you can see that.
Oh, and something else I didn’t mention… all this matrix multiplication is why people like keeping track of their own modelview matrices in software: you can precompute this kind of stuff without using OpenGL and reading it back from the pipe (which can be slow). So that multiplication above would be written in my program as just
Point3D pos=M*Geometry::RotateY(y)*Geometry::RotateX(x)*Vector3D(0.0, 0.0, z);
where M is built in a similar way but for the moon transform.
Does that make sense?
cheers
John