neutrino17

05-26-2008, 10:45 AM

In my graphics scene I draw a coordinate system (call it XYZ) that I want to rotate around imaginary axes bound to the camera. XYZ is drawn at the origin and then rotated. So if I pass (1.0, 0.0, 0.0) as the axis vector to glRotatef, the scene rotates around the current X axis of XYZ, not around the screen's X axis. As I understand it, I have to multiply (1.0, 0.0, 0.0) by the inverse projection matrix and then multiply the result by the inverse modelview matrix to get the coordinates of the axis that coincides with the screen's X axis. Is that right?