Part of the Khronos Group
OpenGL.org



Thread: How do I get my gluLookAt in shaders

  1. #1
    Junior Member Newbie
    Join Date
    Jun 2018
    Posts
    14

    How do I get my gluLookAt in shaders

    Hello,

    As I am learning how to work with shaders, my next question is: how do I get the following into the shaders?
    Code :
    void Init(void)
    {
        .....
        glViewport(0, 0, ClientWidth, ClientHeight);
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        /* cast so the aspect ratio isn't truncated by integer division */
        gluPerspective(45.0, (double)ClientWidth / ClientHeight,
                       0.1, 1700);
        glMatrixMode(GL_MODELVIEW);
        .....
    }

    void Render(void)
    {
        ....
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glLoadIdentity();
        gluLookAt(FCamera.X, FCamera.Y, FCamera.Z,
                  FEye.X, FEye.Y, FEye.Z,
                  FUp.X, FUp.Y, FUp.Z);
        ....
    }

    In the above snippets, the glViewport and gluPerspective calls only change when the window size changes, so they are effectively fixed (rarely updated). The gluLookAt call is in the actual render routine, so it is updated most often.

    The big question is what matrix can I pull to send to the shader? I know that I need a uniform to do this, but I am uncertain how to access the current matrix (composite of the above) so that it can be passed to the shader.

    Thanks

  2. #2
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,959
    Quote Originally Posted by williajl View Post
    The big question is what matrix can I pull to send to the shader? I know that I need a uniform to do this, but I am uncertain how to access the current matrix (composite of the above) so that it can be passed to the shader.
    If you're using fixed-function matrix operations, you can access the matrices in the shader using the compatibility uniforms: gl_ModelViewMatrix, gl_ProjectionMatrix, etc (there are also variables for products, inverse, transpose, etc).

    But if you're using shaders exclusively, it's better to use a client-side library (e.g. GLM) to construct the matrix then upload it using glUniformMatrix*(). Don't use glGet() to retrieve matrices other than for debugging, as it can cause a pipeline stall.

  3. #3
    Junior Member Newbie
    Join Date
    Jun 2018
    Posts
    14
    Quote Originally Posted by GClements View Post
    If you're using fixed-function matrix operations, you can access the matrices in the shader using the compatibility uniforms: gl_ModelViewMatrix, gl_ProjectionMatrix, etc (there are also variables for products, inverse, transpose, etc).

    But if you're using shaders exclusively, it's better to use a client-side library (e.g. GLM) to construct the matrix then upload it using glUniformMatrix*(). Don't use glGet() to retrieve matrices other than for debugging, as it can cause a pipeline stall.
    Unfortunately, when I use gl_ModelViewMatrix, gl_ProjectionMatrix or gl_ModelViewProjectionMatrix, the program refuses to compile because it says those global variables were deprecated after version 140. Is there something I am missing here?

  4. #4
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,959
    Quote Originally Posted by williajl View Post
    Unfortunately, when I use gl_ModelViewMatrix, gl_ProjectionMatrix or gl_ModelViewProjectionMatrix, the program refuses to compile because it says those global variables were deprecated after version 140. Is there something I am missing here?
    You probably need to add a #version directive specifying the compatibility profile, e.g.
    Code :
    #version 330 compatibility

    If a #version directive lacks a profile, the core profile is assumed.
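    With that directive in place the compatibility built-ins resolve again; a minimal vertex shader using them might look like this (a sketch, not tied to the poster's actual shader):

```glsl
#version 330 compatibility

void main()
{
    // gl_ModelViewProjectionMatrix and gl_Vertex are compatibility-profile
    // built-ins that track the fixed-function matrix stacks and vertex data.
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```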

  5. #5
    Junior Member Newbie
    Join Date
    Jun 2018
    Posts
    14
    Quote Originally Posted by GClements View Post
    You probably need to add a #version directive specifying the compatibility profile, e.g.
    Code :
    #version 330 compatibility

    If a #version directive lacks a profile, the core profile is assumed.
    This did the trick for now.

    The next question is: are the algorithms used to calculate the modelview and projection matrices available, so that I can eventually remove the compatibility dependency by creating my own functions and passing in a composite matrix via a uniform?

    Specifically, the algorithms used by gluPerspective and gluLookAt?

    Or, is there a real performance penalty for using compatibility mode in the shaders?

  6. #6
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,959
    Quote Originally Posted by williajl View Post
    The next question is: are the algorithms used to calculate the modelview and projection matrices available, so that I can eventually remove the compatibility dependency by creating my own functions and passing in a composite matrix via a uniform?

    Specifically, the algorithms used by gluPerspective and gluLookAt?
    The gluPerspective and gluLookAt reference pages document the matrices which are used.

    Quote Originally Posted by williajl View Post
    Or, is there a real performance penalty for using compatibility mode in the shaders?
    Probably not. However, the compatibility profile isn't supported on MacOSX (so if you want to use any version after 2.1 there, you're limited to the core profile), and OpenGL ES is based around the core profile.
