Thread: lat-long/spherical projection using vertex shader

  1. #21
    Junior Member Newbie (Join Date: Feb 2017; Posts: 14)
    Quote Originally Posted by GClements:

    For lighting, the only requirement is that all of the vectors involved (vertex position, normal direction, light position/direction, eye position/direction) are in the same coordinate system, and that the coordinate system is affine to "world" space (i.e. no projection).
    I have encountered an issue where I see a seam across objects (across the normals, to be exact) when an object goes behind the camera (z > 0).

    So, even before doing any lighting, I output the normals directly as the final pixel color for debugging. In this mode, I see a seam across objects when they pass behind the camera. Attaching a screenshot.


    Since the normals do not undergo any kind of transformation (at least in my code, apart from the MV matrix), I would expect the normals at a vertex and its adjacent vertices to be continuous, given that they are all in the same (modelview) space. I could imagine negative values causing this, so I remapped the normalized normals to (0, 1) for debugging, and I still see the seams.

    What could be causing this seam?
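
    Below is a minimal sketch of the debug setup described above (outputting the eye-space normal, remapped to (0, 1), as the fragment color). It is not the original poster's code: the uniform and attribute names (uModelView, uNormalMatrix, aPosition, aNormal) are assumed, and the spherical projection step itself is left out.

    Code :
        // Vertex shader (sketch): pass the eye-space normal through.
        #version 330 core
        uniform mat4 uModelView;    // assumed name for the modelview matrix
        uniform mat3 uNormalMatrix; // inverse-transpose of the upper 3x3 of uModelView
        in vec3 aPosition;
        in vec3 aNormal;
        out vec3 vNormal;
        void main()
        {
            vNormal = uNormalMatrix * aNormal;                // eye-space normal
            gl_Position = uModelView * vec4(aPosition, 1.0);  // spherical mapping omitted here
        }

        // Fragment shader (sketch): remap the normal from [-1, 1] to [0, 1] and show it as color.
        #version 330 core
        in vec3 vNormal;
        out vec4 fragColor;
        void main()
        {
            fragColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);
        }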

  2. #22
    Junior Member Newbie (Join Date: Feb 2017; Posts: 14)
    The same issue is also seen with UV/ST coordinates.

  3. #23
    Junior Member Newbie (Join Date: Feb 2017; Posts: 14)
    I think I found what was causing it. I was setting the z component to v.z/r. Taking the absolute value alleviates it. But I still see "holes" across the poles and meridian.

    Setting z to a constant, such as 1.0, just causes the geometry to disappear or produces mangled vertices in a complex scene.

    Any recommendations as to how to handle z effectively?
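
    For reference, a minimal sketch of a lat-long mapping with the z handling described in this post (z = abs(p.z) / r). This is not the poster's actual shader; the longitude/latitude formulas and the uniform/attribute names are assumptions.

    Code :
        // Vertex shader (sketch): lat-long projection with z = abs(p.z) / r.
        #version 330 core
        uniform mat4 uModelView;  // assumed name for the modelview matrix
        in vec3 aPosition;
        void main()
        {
            vec3 p = (uModelView * vec4(aPosition, 1.0)).xyz;  // eye-space position
            float r = length(p);
            float lon = atan(p.x, -p.z);                       // longitude, 0 straight ahead
            float lat = asin(clamp(p.y / r, -1.0, 1.0));       // latitude
            float x = lon / 3.14159265;                        // [-pi, pi]     -> [-1, 1]
            float y = lat / 1.57079633;                        // [-pi/2, pi/2] -> [-1, 1]
            float z = abs(p.z) / r;                            // abs() alleviates the seam behind the camera
            gl_Position = vec4(x, y, z, 1.0);
        }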

  4. #24
    Senior Member OpenGL Guru (Join Date: Jun 2013; Posts: 2,523)
    Quote Originally Posted by rawpixel:
    Any recommendations as to how to handle z effectively?
    I would just set it to the distance, transformed to the range (-1, +1), i.e. zero maps to -1 and the maximum distance maps to +1. The maximum distance will determine the precision of the depth buffer. W should be set to 1.

    If you have issues with surfaces being obscured by surfaces which should be behind them, that can't entirely be prevented; it's an inevitable consequence of trying to apply a non-linear transformation to the vertices. Tessellation should reduce it.
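
    A minimal sketch of that depth handling, layered onto the same kind of lat-long mapping as above; uMaxDistance is a hypothetical uniform holding the maximum distance that maps to +1.

    Code :
        // Vertex shader (sketch): z = eye-space distance remapped so 0 -> -1 and uMaxDistance -> +1, w = 1.
        #version 330 core
        uniform mat4 uModelView;     // assumed name for the modelview matrix
        uniform float uMaxDistance;  // hypothetical: the maximum distance of interest
        in vec3 aPosition;
        void main()
        {
            vec3 p = (uModelView * vec4(aPosition, 1.0)).xyz;
            float r = length(p);
            float lon = atan(p.x, -p.z);
            float lat = asin(clamp(p.y / r, -1.0, 1.0));
            float x = lon / 3.14159265;
            float y = lat / 1.57079633;
            float z = 2.0 * (r / uMaxDistance) - 1.0;  // linear depth: precision set by uMaxDistance
            gl_Position = vec4(x, y, z, 1.0);          // w = 1, so no perspective divide effect
        }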
