
View Full Version : Why does a line slant when the camera moves



geohoffman49431
08-10-2004, 11:20 PM
I have a camera set up to move through a scene. This camera can rotate around the x axis (look up and down), or can rotate around the y axis (look left and right), and also move back and forth or strafe left or right.

Let's say I have a line that goes straight up from (0,0,0) to (0,10,0). When the camera is not rotated around the x axis, in other words when it is looking straight ahead and not up or down, the line appears parallel to the sides of the screen. When I look up or down the line begins to slant. Can anyone explain this phenomenon? How can I calculate the amount the line is slanting?

I need to be able to take a point like (0,15,0) and draw a line to the 'ground' and have it appear parallel to the sides of the screen. So for example, maybe my camera has a 45 degree rotation around the y axis (left/right) and a 30 degree rotation around the x axis (up/down), how can I draw a line from (0,15,0) to the ground and have it look parallel to the screen? What would the coordinates be for the bottom point on the line? How do I calculate something like that?

Any insight into this would be appreciated.

08-11-2004, 12:13 PM
geo, I'm thinking of the case when you look straight up or down. In those cases the line could become a point. Maybe I'm just having trouble visualizing your problem. What is it you're trying to do? Why do you need this parallel edge line?

:)

ZbuffeR
08-12-2004, 10:40 AM
With a perspective projection, a vertical line that is not horizontally centered will 'slant' (what does this word actually mean?); it can only be parallel to the edges of the screen when you are looking at the horizon.

That's perspective.
You can reduce the effect a bit by narrowing the camera's field of view.

Can you give more details about what you want to do with this 'parallel' line?

dorbie
08-12-2004, 11:21 AM
This is strange; about the only explanation I can come up with is that you are rendering much faster than the refresh rate of your monitor. Let's say you draw a line at 600 fps and the monitor refreshes at 60Hz. This means that during each vertical sweep of your monitor's raster you render 10 different frames of the line animation, each with a slightly different position for the line. The resulting display will produce a single image on the glowing phosphor where the top of the screen shows the oldest frame, the bottom of the screen shows the newest frame, and the regions in between show a progression of line positions. This will make it appear as if the line is sloped as it moves across the screen.

EDIT: scratch that explanation, I misread you, zbuffer is right, this is just a perspective effect. Parallel vertical lines converge to a vanishing point, it is what happens in perspective projection and is quite correct.

http://en.wikipedia.org/wiki/Linear_perspective

The easiest way to avoid this would be to avoid rotating the eye up or down and instead pitch the eye using an asymmetric frustum, adjusting the top & bottom of the frustum to fake pitch. You'd never be able to look straight up, and it is technically incorrect, but it would match the two-point perspective used by architects etc. with respect to vertical lines.