
Thread: Rotate Monitor

  1. #1
    Intern Newbie
    Join Date
    May 2012
    Posts
    30

    Rotate Monitor

Can't seem to find any info on a situation where you rotate the monitor, say, 90 degrees. You would obviously have to rotate the viewport by a corresponding 90 degrees. How do I do this?

  2. #2
    Senior Member OpenGL Guru Dark Photon's Avatar
    Join Date
    Oct 2004
    Location
    Druidia
    Posts
    4,156
    There are a number of ways to deal with this. You could put this rotation in the viewing transform (just change your "up" vector). Or after rasterizing the framebuffer, you could transpose the image.

Food for thought: Think about whether you want to resize windows when the monitor is rotated (consider the non-square window case) and/or what you're going to do in the case of non-square pixels. Unless you have square windows and square pixels, you'll probably also need to change the portion of the scene you see through the window and/or the resolution of this window (projection and viewport, respectively) to account for this monitor rotation. This is so that you don't see distortion (squeezing/stretching) in the view you see through your window.

  3. #3
    Intern Newbie
    Join Date
    May 2012
    Posts
    30
Thanks. For anyone who is interested, this thread also has a link that talks about "up" vectors and how to play around with views.

    https://www.opengl.org/discussion_bo...ate-the-camera

  4. #4
    Intern Newbie
    Join Date
    May 2012
    Posts
    30
From the link in the previous post, can someone explain this statement to me?

    "Note that these steps correspond to the order in which you specify the desired transformations in your program, not necessarily the order in which the relevant mathematical operations are performed on an object's vertices. The viewing transformations must precede the modeling transformations in your code, but you can specify the projection and viewport transformations at any point before drawing occurs. Figure 3-2 shows the order in which these operations occur on your computer."

    This is in relation to figures 3-1 and 3-2.

  5. #5
    Senior Member OpenGL Guru Dark Photon's Avatar
    Join Date
    Oct 2004
    Location
    Druidia
    Posts
    4,156
    Quote Originally Posted by Atomic_Sheep View Post
From the link in the previous post, can someone explain this statement to me?
    Sure. Here's OpenGL's transformation ordering:

    P*V*M * vobject = vclip

    where:

    P = Projection transform
    V = Viewing transform
    M = Modeling transform
    V*M = MODELVIEW transform

    Let's suppose you have several modeling transforms needed to position an object in your scene. That is:

    P*V*(M3*M2*M1) * vobject = vclip

    As you can see, the order that the transforms are conceptually applied to the object is M1, M2, M3, V, P.

    However, if you're using legacy OpenGL (which has its own built-in MODELVIEW and PROJECTION transform state) to build the transforms, the order that you specify the transforms that get multiplied onto the MODELVIEW transform is: V, M3, M2, M1. (NOTE: PROJECTION (P) has its own separate transform state.)

    As you can see, the order of the component MODELVIEW transforms is reversed.

  6. #6
    Intern Newbie
    Join Date
    May 2012
    Posts
    30
This went a bit over my head, to be honest. I obviously don't have enough pieces of the puzzle to understand what you are saying. Why, for example, did you mention legacy OpenGL? Is the article that I mentioned talking about an old OpenGL?

vobject is the 4-dimensional homogeneous vector of one of the points in our object? Why is it the last operation in your example, when in the article's example everything starts off with it, as per Figure 3-2?

Currently my understanding of that statement is basically that the order in which you specify the various transformation matrices is different from the order in which OpenGL sends commands to the video card. Hopefully at least this last statement is correct and on the right track, although I don't yet understand the importance/implication of this.

  7. #7
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,480
    Quote Originally Posted by Atomic_Sheep View Post
This went a bit over my head, to be honest. I obviously don't have enough pieces of the puzzle to understand what you are saying. Why, for example, did you mention legacy OpenGL? Is the article that I mentioned talking about an old OpenGL?
    The linked forum post and the link to the red book in that post both use legacy OpenGL (meaning: features that don't exist in 3.2+ core profile). Modern OpenGL doesn't have glMatrixMode() etc; the application constructs the matrices itself and sends them to the shaders (typically as uniform variables).

    Quote Originally Posted by Atomic_Sheep View Post
Currently my understanding of that statement is basically that the order in which you specify the various transformation matrices is different from the order in which OpenGL sends commands to the video card. Hopefully at least this last statement is correct and on the right track, although I don't yet understand the importance/implication of this.
The legacy matrix functions (glRotate(), glTranslate(), glMultMatrix() etc.) multiply the current transformation matrix (CTM) with a relative transformation, with the CTM on the left and the relative transformation on the right.

    If you think in terms of starting with the vertex coordinates in object space and applying a sequence of transformations to those vertices, the right-most transformation (corresponding to the last OpenGL command) is applied first while the left-most transformation (corresponding to the first OpenGL command) is applied last.
