
Thread: Getting Worldspace Coordinates from Stored Coordinates

  1. #11
    Senior Member OpenGL Lord
    Join Date
    May 2009
    Posts
    6,003
    Quote Originally Posted by Geklmin View Post
    Bad news

    Turns out projection matrices are non-invertible

    using glm::inverse gave me a bad matrix when I called it on the projection matrix and the viewprojection matrix
    I'm curious as to how you know that. If you get a "bad matrix" from the inverse of a projection matrix, you should be able to easily write a short GLM-based application that proves this.

    For example, when I do this:

    Code :
    #include <glm/glm.hpp>                  // glm::mat4, glm::inverse
    #include <glm/gtc/matrix_transform.hpp> // glm::perspective
    #include <iostream>
     
    void print_matrix(glm::mat4 mat)
    {
    	for (int row = 0; row < 4; ++row)
    	{
    		for (int col = 0; col < 4; ++col)
    		{
    			std::cout << mat[col][row] << "\t";
    		}
     
    		std::cout << "\n";
    	}
    }
     
    int main()
    {
    	auto proj = glm::perspective<float>(glm::radians(90.0f), 1.0f, 1.0f, 1000.0f); //modern GLM expects the FOV in radians
    	auto inv_proj = glm::inverse(proj);
    	auto inv_inv_proj = glm::inverse(inv_proj);
     
    	print_matrix(proj);
    	std::cout << "\n";
    	print_matrix(inv_proj);
    	std::cout << "\n";
    	print_matrix(inv_inv_proj);
    }

    If perspective projection matrices were truly not invertible, then `inv_inv_proj` would not produce the same matrix as `proj`. But it very much does.

    So odds are good that this isn't a GLM or matrix problem; you're doing something wrong.

    Also, do read the wiki article I linked earlier. It gives you the math to do what you're trying to do step-by-step.

    why does it take so long to get replies?!?!
    It has been one hour since your post. It is not reasonable to expect other people to be sitting behind their computers, refreshing the page with bated breath for your next update to this thread. This is a forum, after all; not a chat room.

    Also, spamming the forum with multiple threads on the same topic isn't going to get you an answer any faster.

  2. #12
    Intern Newbie
    Join Date
    Jan 2018
    Posts
    42
    UPDATE: I was wrong about the bad projection matrix... a little. X and Y are processed fine, but depth is totally arsed. It always comes out of the projection matrix as a single value (0).

    The view matrix IS bad. It always comes out as all zeroes.

    Right now, I have a fragment shader that demos what I believe to be correct values for the Eye space.

    I have to calculate the Z value manually from depth since the matrix doesn't like me

    Code :
    #version 330
    out vec4 fColor[2];
     
    in vec2 texcoord;
    in vec2 ScreenPosition;
     
    uniform sampler2D tex1;
    uniform sampler2D tex2;
    uniform sampler2D tex3;
    uniform mat4 inverse_view_projection_matrix; //Does not work, always says every fragment is at the camerapos. In fact, it doesn't actually matter what vec4 you use with this matrix, it will always return the camera position, or at least something really close to it.
    uniform mat4 inverse_projection_matrix; //Works, but not on Z. produces incorrect Z values. X and Y values look approximately correct, but I'm suspicious.
    uniform mat4 inverse_view_matrix; //Does not work, always says every fragment is at the camerapos. In fact, it doesn't actually matter what vec4 you use with this matrix, it will always return the camera position, or at least something really close to it.
    uniform vec3 lightpos;
    uniform vec3 lightcolor;
    uniform vec3 camerapos;
    uniform float range;
    uniform float dropoff;
    //far and near clip planes
    uniform float jafar;
    uniform float janear;
     
    //Another thing that should work
    vec4 WorldPosFromDepth(float depth) {
        float z = depth * 2.0 - 1.0;
     
        vec4 clipSpacePosition = vec4(texcoord * 2.0 - 1.0, z, 1.0);
        vec4 viewSpacePosition = inverse_projection_matrix * clipSpacePosition;
     
     
    	//I don't actually know if this correct, and if the viewMatrix has any scaling, then this is incorrect. I have no idea how to solve this.
    	float depthRange = jafar - janear;
    	float farin = depth * depthRange + janear;
    	viewSpacePosition.z = farin; // Because the STUPID inverse projection matrix won't handle Z correctly! Not sure if supposed to be negative, not sure if supposed to be scaled.
     
     
    	//We now have the world relative to the camera. We have to move the world back to where it's supposed to be.
            //(means I haven't written it yet)
     
         vec4 worldSpacePosition = viewSpacePosition; //if I put in inverse_view_matrix * viewSpacePosition it always comes out as being at the camera position.
     
        return worldSpacePosition;
    }
     
    void main()
    {
    	//Grab the values from the initial opaque pass buffers
    	vec4 tex1_value = texture2D(tex1, texcoord);
    	vec4 tex2_value = texture2D(tex2, texcoord);
    	vec4 tex3_value = texture2D(tex3, texcoord);
     
    	//Get the Clip-Space Position (gl_Position of the fragment)
    	// vec4 clipSpacePos = vec4(ScreenPosition.x,ScreenPosition.y, (tex3_value.x * (jafar + janear))-janear, 1.0);
    	// vec4 clipSpacePos = vec4(ScreenPosition.x,ScreenPosition.y, 2.0 * tex3_value.x - 1.0, 1.0);
     
    	//Find the world position. PROJECTION MATRICES ARE NON-INVERTIBLE!!!
    	vec4 world_pos = WorldPosFromDepth(tex3_value.x);
    	//world_pos -= camerapos;
    	float mask = float(tex3_value.w != 0);
    	// fColor[0] = tex1_value;
    	fColor[0] = (world_pos ) * mask;
    	fColor[1] = tex2_value;
    }

    IF YOU KNOW HOW I SHOULD PROCEED, PLEASE TELL ME. I'M BUMBLING ABOUT AT THIS POINT.

    None of the matrix maths is working like how any tutorial shows it working.

    If you know what changes I should make to use matrices instead of having to bodge, I'd be more than thankful.


    SO TURN ON YOUR BRAINS AND START THINKIN'
    Last edited by Geklmin; 05-13-2018 at 02:45 PM. Reason: communism

  3. #13
    Intern Newbie
    Join Date
    Jan 2018
    Posts
    42
    Also, that wiki article you linked me (which I barely noticed because of the poor link highlighting, and only spotted now because I accidentally hovered over it) doesn't explain how to get WORLD COORDINATES from GL_POSITION SPACE.

    I have recreated (I believe successfully, but please read my code and see if I'm right) the gl_Position value of the fragment in the initial opaque (g-buffer... whatever) pass.

    I have the inverse view matrix and inverse projection matrices, as well as the inverse viewprojection matrix in the shader

    I also have the near and far plane values to use

    I NEED TO TAKE THAT INFORMATION AND RECREATE THE WORLD COORDINATES

    As in

    AFTER MODEL MATRIX

    BEFORE VIEWMATRIX

    That's what I need for my lighting calculations

    I don't know what the heck "window coordinates" are and since I've begun coding this engine my understanding of the OpenGL coordinate systems has faded away as I've found buttloads of stuff in OpenGL that doesn't actually work

    Take gl_FragDepth for instance

    Did you know it doesn't actually work?

    It will always return the same value for every pixel on the screen.

    I verified this myself in a test program

    in fact, NONE of the gl_FragCoord stuff works AT ALL in my testing, and in order to get ANY meaningful results, I have to manually pass in vec2's and vec3's containing clip space coordinates

    Thank you for helping me so far, but I'm far from a solution.

    I really can't believe that I'm the first one to solve this f***ing problem.

    This is, by far, the single hardest part of OpenGL programming... other than actually getting your compiler working.

  4. #14
    Senior Member OpenGL Lord
    Join Date
    May 2009
    Posts
    6,003
    IF YOU KNOW HOW I SHOULD PROCEED, PLEASE TELL ME. I'M BUMBLING ABOUT AT THIS POINT.
    Step 1: CALM DOWN! Shouting is not going to get people to help you. I get that you're frustrated, but stop treating the people who are trying to help you like your own personal stress ball.

    Step 2: Stop blaming your tools:

    I don't know what the heck "window coordinates" are and since I've begun coding this engine my understanding of the OpenGL coordinate systems has faded away as I've found buttloads of stuff in OpenGL that doesn't actually work
    OpenGL actually works, as evidenced by the numerous programs that work in OpenGL, many of which use deferred rendering. It's a poor craftsman who blames his hammer for the way the nails get driven into the wood.

    Step 3: Reduce your code to a minimal, complete, verifiable example.

    It is extremely hard to follow your code as it currently stands. There's just too much other stuff there: all of your complex lighting code with its manifold options, the numerous uniforms, and so forth. It's impossible to separate out the stuff that's important (what values you're writing to your g-buffer, and how you're using them in the lighting passes) from the stuff that's not (literally everything else).

    Try to reduce things down to the bare minimum.

    Also, it would really help to have some kind of guide to your code: what kind of data is being written to `gl_FragData[2]`, and what `tex3` holds, for example. It's difficult to know what image formats you're using for these textures, or whether they're even the same textures at all.

    That being said, I see several things that appear... dubious to me.

    For example:

    Code :
    vec4 big_gay = World2Camera * Model2World * vec4(vPosition,1.0);
    ...
    ourdepth = big_gay.z; //Depth
    ...
    ...
    (ourdepth+janear)/(jafar+janear)

    I do not know what you're trying to do with `ourdepth` here, but it does not make sense. Then again, I'm not sure what lives in `World2Camera` and `Model2World`. However, the fact that you shove "big_gay" (???) into `gl_Position` tells me that it is meant to include the projection matrix.

    If that's the case, then `ourdepth` is the interpolated clip-space depth value. So why are you adjusting it by the camera-space znear/zfar? This makes no sense mathematically, and I don't understand what you intend for this to do.

    What you need to do is just use the depth buffer. Your "opaque" passes shouldn't be writing depth at all. Don't try to write `gl_FragDepth` or anything you compute for the depth. Just let the depth buffer do its job, then read from it in the lighting passes. That is where you should be getting your depth from.

    That's not going to fix your other code, of course.

  5. #15
    Intern Newbie
    Join Date
    Jan 2018
    Posts
    42
    Ah
    I will fix that, but how do I read from the depth buffer in GLSL?

    I apologize. I am beyond frustrated; I have nearly broken my keyboard over this. It got cracked when I tossed it against the wall.


    I thought this was going to be the easiest part of writing my graphics engine, and it's turned out to be the single hardest thing I've had to do.

  6. #16
    Intern Newbie
    Join Date
    Jan 2018
    Posts
    42
    OK wise guy
    How do I take
    ourdepth

    and normalize it for putting to an FBO

    My target OpenGL version doesn't allow depth sampling

    (3.3)

  7. #17
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,790
    Quote Originally Posted by Geklmin View Post
    Turns out projection matrices are non-invertible
    In general, that's incorrect. It's possible to create a non-invertible projection matrix (e.g. if the near and far planes are equal), but any projection matrix you're likely to want to use will be invertible.

    Quote Originally Posted by Geklmin View Post
    using glm::inverse gave me a bad matrix when I called it on the projection matrix and the viewprojection matrix
    Define "bad". If it didn't meet your expectations, that probably means that your expectations are wrong.

    You can check that an inverse is correct by multiplying it by the original matrix (the order doesn't matter) and checking that the result is an identity matrix (to within rounding error).

  8. #18
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,790
    Quote Originally Posted by Geklmin View Post
    how do I read from the depth buffer in GLSL?
    You don't. Specifically, you can't read from the "current" depth buffer, i.e. the one which will be updated by the current drawing operation.

    You can read the depth from previous drawing operations by first rendering to a FBO which has a texture as the depth attachment, detaching the texture from the FBO (or unbinding the FBO itself), then reading from the texture. But you cannot read from a texture (more precisely, a level of a texture) while it is being used as a render target. In fact, even the possibility of the texture being read is enough to trigger undefined behaviour, so textures which are used as FBO attachments shouldn't be accessible via sampler variables in the shader.
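    Incidentally, core OpenGL 3.3 does allow sampling a depth texture through an ordinary `sampler2D`, so there's no need to normalise a hand-computed depth into a colour attachment. A minimal sketch of the setup (my own variable names, error checking omitted; `width`/`height` are your framebuffer size, and a GL 3.3 context is assumed to be current):

```cpp
GLuint depth_tex, fbo;

/* Create a depth texture to use as the FBO's depth attachment. */
glGenTextures(1, &depth_tex);
glBindTexture(GL_TEXTURE_2D, depth_tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depth_tex, 0);
/* ...attach the colour g-buffer textures, render the opaque pass... */

/* Lighting pass: switch render targets first (so the depth texture is no
   longer an attachment of the bound FBO), then bind it as a plain
   sampler2D; reading .x in the shader gives the [0,1] window-space depth. */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, depth_tex);
```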

  9. #19
    Senior Member OpenGL Lord
    Join Date
    May 2009
    Posts
    6,003
    OK wise guy
    How do I take
    ourdepth

    and normalize it for putting to an FBO
    Don't.

    The operation you're trying to do is to reconstruct the position of a fragment based on its depth. You already have its depth; you captured it in the depth buffer when you were rendering. You shouldn't be writing or computing this "ourdepth" value; OpenGL handles this just fine.

    You simply need to use that depth buffer as a texture when you do your lighting pass. That will give you exactly what `gl_FragCoord.z` had for that fragment.

    Once you read that depth value, you can apply the math needed to reverse the transformation and get back the position of the fragment.

  10. #20
    Intern Newbie
    Join Date
    Jan 2018
    Posts
    42
    UPDATE:
    I have found a workaround
    I am writing the world position of the fragment minus the camera position, divided by zFar, and mapped into the range 0 to 1.
