Hi OpenGL community.
I'm fighting with matrices and just discovered something I don't understand. (Like many, I'm fairly new to matrix manipulation.)
Even if my problem is not directly related to OpenGL, I'm sure some of the advanced people here will have the answer.
I'm trying to project a point through a camera to find its position in screen space.
I have a simple scene:
- A single point at the center of the world (a locator).
- A camera looking at this point.
I create my mvpMatrix this way (pseudo code):
Code :
locatorMatrix = locator.getMatrix(worldSpace=True)  # because my locator is at (0, 0, 0), this is just a default (identity) matrix
camMatrix = myCam.getMatrix(worldSpace=True)
camProjMatrix = myCamShape.projectionMatrix()
mvpMatrix = locatorMatrix * camMatrix * camProjMatrix
result = mvpMatrix * locator.getTranslation(worldSpace=True)  # multiply my "center of the world" point by the matrix
print result
# [0.0, 0.0, 0.0]
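To show the behaviour outside my scene, here is a minimal NumPy sketch of the same multiplication (the matrix values are made up, and `mvp` only stands in for my real mvpMatrix; the point is that a (0, 0, 0) vector stays zero no matter what 3x3 matrix multiplies it):

```python
import numpy as np

# Made-up 4x4 matrix standing in for my mvpMatrix (arbitrary values,
# with a translation-like last column just so it isn't trivial).
mvp = np.array([
    [1.0, 0.0, 0.0, 2.0],
    [0.0, 1.0, 0.0, 3.0],
    [0.0, 0.0, 1.0, 5.0],
    [0.0, 0.0, 0.0, 1.0],
])

# My locator's translation as a plain 3-vector, like in my pseudo-code.
p3 = np.array([0.0, 0.0, 0.0])

# Multiplying only the 3x3 part by a zero vector always gives zero.
print(mvp[:3, :3] @ p3)  # -> [0. 0. 0.]
```

This reproduces exactly what I see in my scene: the zero point comes back as zero.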
This is actually not very surprising, is it? If I multiply any matrix by a (0, 0, 0) point, it will always return zero...
But of course my "center of the world" point is not at (0, 0) in my camera view... it is at about (0.5, 0.5) in screen space. This is the part I don't understand. I've searched the internet for how to do this.
I suppose I've missed something.
I would really be very grateful if someone could point out where my error is...
Have a good day all and thanks in advance!