subkey

04-20-2012, 07:41 PM

I have a cube that I am successfully rotating by mouse drags with the following algorithm:

1) Initialize rotation matrix to identity

2) Multiply world-space y-axis (0, 1, 0) by inverted rotation matrix to get a vector in object space corresponding to the y-axis. Then rotate a bit around this vector given the mouse drags.

3) Same thing for x-axis.

4) Each update, apply rotation matrix to object.

This works fine; I am just having trouble understanding the theory behind why multiplying the world-space y-axis by the inverted rotation matrix gives you the y-axis in object space. Can anyone explain this in very simple terms? I must be missing a key concept.

Thanks!
