Rotate Monitor



Atomic_Sheep
10-06-2017, 12:39 AM
Can't seem to find any info on a situation where you rotate the monitor, say, 90 degrees. You would obviously have to rotate the viewport by a corresponding 90 degrees. How do I do this?

Dark Photon
10-06-2017, 06:55 AM
There are a number of ways to deal with this. You could put this rotation in the viewing transform (just change your "up" vector). Or after rasterizing the framebuffer, you could transpose the image.
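
For illustration, here's a minimal sketch of the up-vector approach, assuming a gluLookAt-style camera. The eye/center positions are placeholders, and whether "up" becomes +X or -X depends on which way the monitor was turned:

/* Hypothetical camera setup: eye at (0,0,5), looking at the origin.
   With the monitor rotated 90 degrees, the world X axis now runs
   "up" the physical screen, so pass it as the up vector. */
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(0.0, 0.0, 5.0,    /* eye    */
          0.0, 0.0, 0.0,    /* center */
          1.0, 0.0, 0.0);   /* up: +X instead of the usual +Y */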

Food for thought: Think about whether you want to resize windows when the monitor is rotated (consider the non-square window case) and/or what you're going to do in the case of non-square pixels. Unless you have square windows and square pixels, you'll probably also need to change the portion of the scene you see through the window and/or the resolution of this window (projection and viewport, respectively) to account for the monitor rotation. This is so that you don't see distortion (squeezing/stretching) in the view you see through your window.
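
To make the aspect-ratio point concrete, here's a sketch of a resize handler, assuming gluPerspective and a full-window viewport; the winW/winH parameters and the rotated90 flag are hypothetical:

/* After a 90-degree monitor rotation, the content's horizontal extent
   maps onto the window's vertical extent, so the projection's aspect
   ratio has to use the swapped dimensions to avoid stretching. */
void reshape(int winW, int winH, bool rotated90)
{
    glViewport(0, 0, winW, winH);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    double aspect = rotated90 ? (double)winH / winW
                              : (double)winW / winH;
    gluPerspective(60.0, aspect, 0.1, 100.0); /* fovY, aspect, near, far */
    glMatrixMode(GL_MODELVIEW);
}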

Atomic_Sheep
10-06-2017, 10:10 PM
Thanks. For anyone who is interested, this thread also has a link that talks about "up" vectors and how to play around with views.

https://www.opengl.org/discussion_boards/showthread.php/178047-about-gluLookAt-function-and-how-to-rotate-the-camera

Atomic_Sheep
10-07-2017, 09:46 PM
From the link in the previous post, can someone explain this statement to me?

"Note that these steps correspond to the order in which you specify the desired transformations in your program, not necessarily the order in which the relevant mathematical operations are performed on an object's vertices. The viewing transformations must precede the modeling transformations in your code, but you can specify the projection and viewport transformations at any point before drawing occurs. Figure 3-2 shows the order in which these operations occur on your computer."

This is in relation to figures 3-1 and 3-2.

Dark Photon
10-08-2017, 05:25 PM
From the link in the previous post, can someone explain this statement to me?

Sure. Here's OpenGL's transformation ordering:


P * V * M * v_object = v_clip


where:


P = Projection transform
V = Viewing transform
M = Modeling transform
V*M = MODELVIEW transform


Let's suppose you have several modeling transforms needed to position an object in your scene. That is:


P * V * (M3*M2*M1) * v_object = v_clip


As you can see, the order in which the transforms are conceptually applied to the object is M1, M2, M3, V, P.

However, if you're using legacy OpenGL (which has its own built-in MODELVIEW and PROJECTION transform state) to build the transforms, the order in which you specify the transforms that get multiplied onto the MODELVIEW transform is: V, M3, M2, M1. (NOTE: PROJECTION (P) has its own separate transform state.)

As you can see, the order of the component MODELVIEW transforms is reversed.
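
To illustrate, here's a minimal legacy OpenGL sketch of that specification order; the particular translate/rotate/scale calls and the drawObject() function are placeholders:

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(0.0, 0.0, 10.0,  0.0, 0.0, 0.0,  0.0, 1.0, 0.0); /* V: specified first */
glTranslatef(2.0f, 0.0f, 0.0f);     /* M3 */
glRotatef(45.0f, 0.0f, 0.0f, 1.0f); /* M2 */
glScalef(2.0f, 2.0f, 2.0f);         /* M1: specified last, applied to vertices first */
drawObject();                       /* vertices conceptually see M1, M2, M3, then V */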

Atomic_Sheep
10-24-2017, 11:53 PM
This went a bit over my head, to be honest. I obviously don't have enough pieces of the puzzle to understand what you are saying. Why, for example, did you mention legacy OpenGL? Is the article that I mentioned talking about an old version of OpenGL?

v_object is the four-dimensional homogeneous vector of one of the points in our object? Why is it the last term in your example, when in the article's example everything starts off with it, as per Figure 3-2?

Currently my understanding of that statement is basically that the order in which you specify the various transformation matrices is different from the order in which OpenGL sends commands to the video card. Hopefully at least this last statement is correct and on the right track, although I don't yet understand the importance/implications of this.

GClements
10-25-2017, 03:01 AM
This went a bit over my head, to be honest. I obviously don't have enough pieces of the puzzle to understand what you are saying. Why, for example, did you mention legacy OpenGL? Is the article that I mentioned talking about an old version of OpenGL?
The linked forum post and the link to the red book in that post both use legacy OpenGL (meaning: features that don't exist in 3.2+ core profile). Modern OpenGL doesn't have glMatrixMode() etc; the application constructs the matrices itself and sends them to the shaders (typically as uniform variables).
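
For illustration, a minimal sketch of the modern approach using the GLM library; the program handle prog and the uniform name uMVP are assumptions, and a created GL context plus a compiled shader program are taken as given:

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/type_ptr.hpp>

/* Build P, V, and M on the CPU and upload the product as one uniform. */
glm::mat4 P = glm::perspective(glm::radians(60.0f), 16.0f / 9.0f, 0.1f, 100.0f);
glm::mat4 V = glm::lookAt(glm::vec3(0.0f, 0.0f, 5.0f),   // eye
                          glm::vec3(0.0f),               // center
                          glm::vec3(0.0f, 1.0f, 0.0f));  // up
glm::mat4 M = glm::rotate(glm::mat4(1.0f), glm::radians(45.0f),
                          glm::vec3(0.0f, 0.0f, 1.0f));
glm::mat4 MVP = P * V * M;  // same P*V*M ordering as in Dark Photon's post
glUniformMatrix4fv(glGetUniformLocation(prog, "uMVP"),
                   1, GL_FALSE, glm::value_ptr(MVP));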


Currently my understanding of that statement is basically that the order in which you specify the various transformation matrices is different from the order in which OpenGL sends commands to the video card. Hopefully at least this last statement is correct and on the right track, although I don't yet understand the importance/implications of this.
The legacy matrix functions (glRotate(), glTranslate(), glMultMatrix(), etc.) multiply the current transformation matrix (CTM) by a relative transformation, with the CTM on the left and the relative transformation on the right.

If you think in terms of starting with the vertex coordinates in object space and applying a sequence of transformations to those vertices, the rightmost transformation (corresponding to the last OpenGL command) is applied first, while the leftmost transformation (corresponding to the first OpenGL command) is applied last.
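
A short legacy OpenGL sketch of that reversal; drawObject() is a hypothetical draw call:

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(5.0f, 0.0f, 0.0f);     /* first command -> leftmost in the CTM  */
glRotatef(90.0f, 0.0f, 0.0f, 1.0f); /* last command  -> rightmost in the CTM */
/* CTM = T * R: each vertex is rotated about the object's own origin
   first, then translated 5 units along +X. */
drawObject();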