Hello everybody. I’m currently learning the mathematical basics of perspective projection and camera positioning. I want to write my own gluLookAt function, which I did, but it is not doing what it should. So, a few questions about this.
Firstly, OpenGL has the model-view matrix, known as the current vertex transformation: CT = V*M. To apply an affine transformation to the vertices I do CT = CT * MNew, which works fine in my program. But how do I add a new camera transformation? Is it done this way: CT = VNew * CT? How do I need to apply a new camera transformation, given by a matrix V? When I do it that way, my program behaves somewhat strangely…
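To check my understanding of the composition order, I wrote a tiny standalone test (the Mat4/mul/translation names are mine, just for this sketch, assuming row-major float m[4][4] storage and the column-vector convention like my CT):

```cpp
#include <cassert>
#include <cmath>

struct Mat4 { float m[4][4]; };

// C = A * B, written into a separate result matrix so that no row
// of A or B is read after it has already been overwritten
Mat4 mul(const Mat4 &a, const Mat4 &b) {
    Mat4 c{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            c.m[i][j] = 0.0f;
            for (int k = 0; k < 4; ++k)
                c.m[i][j] += a.m[i][k] * b.m[k][j];
        }
    return c;
}

Mat4 identity() {
    Mat4 r{};
    for (int i = 0; i < 4; ++i) r.m[i][i] = 1.0f;
    return r;
}

// pure translation; translation sits in the last column
// (column-vector convention)
Mat4 translation(float x, float y, float z) {
    Mat4 r = identity();
    r.m[0][3] = x;
    r.m[1][3] = y;
    r.m[2][3] = z;
    return r;
}

// transform the point (p[0], p[1], p[2], 1) by m
void transform(const Mat4 &m, const float p[3], float out[3]) {
    for (int i = 0; i < 3; ++i)
        out[i] = m.m[i][0]*p[0] + m.m[i][1]*p[1] + m.m[i][2]*p[2] + m.m[i][3];
}
```

With column vectors, CT * p applies M first and V last, so pre-multiplying (CT = VNew * CT) composes a new camera move in eye space, while post-multiplying (CT = CT * MNew) composes a new model transform in object space, which matches the behavior I see for MNew.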
If I apply a new transformation like this:
TVector3D eye = {0, 0, 1};
TVector3D look = {0, 0, 0};
TVector3D up = {0, 1, 0};
LookAt(eye, look, up);
the result is that the picture is simply zoomed.
And if I apply this:
TVector3D eye = {0, 0, -1}; // notice the minus sign
TVector3D look = {0, 0, 0};
TVector3D up = {0, 1, 0};
LookAt(eye, look, up);
the image is transformed in yet another way: it looks mirrored, rotated 90 degrees, and zoomed out a little.
Here is the code for my function LookAt:
void LookAt(TVector3D eye, TVector3D look, TVector3D up)
{
    // Here I compute:
    //   n = eye - look;
    //   u = up x n;
    //   v = n x u;
    TVector3D n, u, v;
    Diff(eye, look, n);
    VectMult(up, n, u);
    Normalize(&n);
    Normalize(&u);
    VectMult(n, u, v);

    float x, y, z, c;
    // Now I compute CT = V * CT:
    x = u.x * CT[0][0] + u.y * CT[1][0] + u.z * CT[2][0];
    y = u.x * CT[0][1] + u.y * CT[1][1] + u.z * CT[2][1];
    z = u.x * CT[0][2] + u.y * CT[1][2] + u.z * CT[2][2];
    c = u.x * CT[0][3] + u.y * CT[1][3] + u.z * CT[2][3] - (eye.x * u.x + eye.y * u.y + eye.z * u.z);
    CT[0][0] = x;
    CT[0][1] = y;
    CT[0][2] = z;
    CT[0][3] = c;

    x = v.x * CT[0][0] + v.y * CT[1][0] + v.z * CT[2][0];
    y = v.x * CT[0][1] + v.y * CT[1][1] + v.z * CT[2][1];
    z = v.x * CT[0][2] + v.y * CT[1][2] + v.z * CT[2][2];
    c = v.x * CT[0][3] + v.y * CT[1][3] + v.z * CT[2][3] - (eye.x * v.x + eye.y * v.y + eye.z * v.z);
    CT[1][0] = x;
    CT[1][1] = y;
    CT[1][2] = z;
    CT[1][3] = c;

    x = n.x * CT[0][0] + n.y * CT[1][0] + n.z * CT[2][0];
    y = n.x * CT[0][1] + n.y * CT[1][1] + n.z * CT[2][1];
    z = n.x * CT[0][2] + n.y * CT[1][2] + n.z * CT[2][2];
    c = n.x * CT[0][3] + n.y * CT[1][3] + n.z * CT[2][3] - (eye.x * n.x + eye.y * n.y + eye.z * n.z);
    CT[2][0] = x;
    CT[2][1] = y;
    CT[2][2] = z;
    CT[2][3] = c;
}
//-------------------------------------------------------------
void Diff(TVector3D &eye, TVector3D &look, TVector3D &result) // vector difference
{
    result.x = eye.x - look.x;
    result.y = eye.y - look.y;
    result.z = eye.z - look.z;
}
//-------------------------------------------------------------
void VectMult(TVector3D &a, TVector3D &b, TVector3D &result) // cross product
{
    // | i    j    k   |
    // | a.x  a.y  a.z |
    // | b.x  b.y  b.z |
    result.x = a.y * b.z - b.y * a.z;
    result.y = -(a.x * b.z - b.x * a.z);
    result.z = a.x * b.y - b.x * a.y;
}
//-------------------------------------------------------------
void Normalize(TVector3D *vect)
{
    float len = Sqrt(vect->x * vect->x + vect->y * vect->y + vect->z * vect->z);
    vect->x /= len;
    vect->y /= len;
    vect->z /= len;
}
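For comparison, here is how I understand the matrix that gluLookAt is documented to build (per the OpenGL reference page): rows u, v, n with the translations -dot(eye, axis), and a (0, 0, 0, 1) bottom row. This is a standalone sketch with my own Vec3/buildLookAt names; it builds V into its own array, separate from any matrix it will later be multiplied with, which gives me known-good values to compare my CT against:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Fill V with the view matrix for a camera at `eye` looking at `look`.
void buildLookAt(Vec3 eye, Vec3 look, Vec3 up, float V[4][4]) {
    Vec3 n = normalize(sub(eye, look)); // camera backward axis
    Vec3 u = normalize(cross(up, n));   // camera right axis
    Vec3 v = cross(n, u);               // camera up axis (already unit length)

    float rows[3][4] = {
        { u.x, u.y, u.z, -dot(eye, u) },
        { v.x, v.y, v.z, -dot(eye, v) },
        { n.x, n.y, n.z, -dot(eye, n) },
    };
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 4; ++j)
            V[i][j] = rows[i][j];
    V[3][0] = V[3][1] = V[3][2] = 0.0f;
    V[3][3] = 1.0f;
}
```

For eye = (0, 0, 1) looking at the origin with up = (0, 1, 0), this produces the identity except for V[2][3] = -1, i.e. a pure shift of one unit along -z, so I can tell at a glance whether my in-place update of CT produces the same numbers.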
Please help! What am I doing wrong?