Thread: Artifact with deferred shading on ATI cards

  1. #1
    Member, Regular Contributor (joined Mar 2007, California, 396 posts)

    Artifact with deferred shading on ATI cards

    I get this strange edge-highlight artifact with my deferred renderer on ATI cards. NVidia cards are unaffected.

    It seems to occur at the junction where the unwrapped (fake cubemap) shadowmap texcoords meet, at the junction between CSM stages, and at the edge where a pixel stops being affected by a spotlight, as seen in this image.

    It seems to be a function of the screen-space normal, because normal maps affect the artifact.


  2. #2
    Advanced Member, Frequent Contributor (joined May 2005, Prague, Czech Republic, 913 posts)

    Re: Artifact with deferred shading on ATI cards

    I have seen something similar with the latest ATI drivers in one of my post-processing shaders. In my case the artifact appeared in pixels on the border between two areas. The two areas differed in the result of an if branch placed around an additional texture sampling as a speed optimization (i.e. the calculation would be correct even if the if test were not there). Since the texture in question does not have mipmaps (and thus should not depend on derivative calculations within the quad), I am currently unsure whether there is a legitimate reason for the artifact to appear or whether it is a problem with the shader compiler.
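
    For illustration, a minimal GLSL sketch of the pattern being described; the names (baseTex, extraTex, blend) are invented for the example, not taken from the actual shader:
    Code :
    #version 120
    // Hypothetical sketch: an if branch wrapped around an extra texture fetch
    // purely as a speed optimization. The result would be the same without the
    // branch; it only skips the fetch where blend is zero. The artifact showed
    // up on the border between pixels that take the branch and pixels that do not.
    uniform sampler2D baseTex;
    uniform sampler2D extraTex;   // no mipmaps, so no derivative dependence expected
    varying vec2 texCoord;
    void main()
    {
        vec4 color = texture2D(baseTex, texCoord);
        float blend = color.a;    // zero over most of the image
        if (blend > 0.0) {
            color.rgb += blend * texture2D(extraTex, texCoord).rgb;
        }
        gl_FragColor = color;
    }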

  3. #3
    Senior Member, OpenGL Guru (joined Dec 2000, Reutlingen, Germany, 2,042 posts)

    Re: Artifact with deferred shading on ATI cards

    Do you use anti-aliasing?
    GLIM - Immediate Mode Emulation for GL3

  4. #4
    Member, Regular Contributor (joined Mar 2007, California, 396 posts)

    Re: Artifact with deferred shading on ATI cards

    No anti-aliasing is used.

    However, a logical operation does cause some pixels to skip the lighting calculation because they face away from the light source, and the artifact seems to occur at the border where the result of that test differs between neighbouring pixels.
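
    For example (a hedged sketch with invented names, not the actual lighting shader), the test in question might look something like this in the lighting pass:
    Code :
    #version 120
    // Hypothetical sketch: pixels whose G-buffer normal faces away from the
    // light skip the lighting entirely, and the artifact follows the border
    // where this test flips between neighbouring pixels.
    uniform sampler2D normalTex;   // G-buffer normals (GL_NEAREST, no mipmaps)
    uniform vec3 lightDir;         // direction from the surface towards the light
    varying vec2 texCoord;
    void main()
    {
        vec3 n = normalize(texture2D(normalTex, texCoord).xyz * 2.0 - 1.0);
        float ndotl = dot(n, lightDir);
        vec3 lighting = vec3(0.0);
        if (ndotl > 0.0) {
            // Only pixels facing the light pay for the lighting math;
            // a simple diffuse term stands in for the real shading here.
            lighting = vec3(ndotl);
        }
        gl_FragColor = vec4(lighting, 1.0);
    }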

  5. #5
    Super Moderator, OpenGL Lord (joined Dec 2003, Grenoble - France, 5,580 posts)

    Re: Artifact with deferred shading on ATI cards

    Maybe you can avoid the artifacts by replacing the test with a step or smoothstep operation.
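
    For example, a sketch of what that substitution could look like (invented names, assuming a simple N dot L cutoff):
    Code :
    #version 120
    // Hypothetical sketch: step() returns 0.0 or 1.0, so it applies the same
    // hard mask as "if (ndotl > 0.0)" without a branch; smoothstep() would
    // give a soft transition at the boundary instead.
    uniform sampler2D normalTex;
    uniform vec3 lightDir;
    uniform vec3 lightColor;
    varying vec2 texCoord;
    void main()
    {
        vec3 n = normalize(texture2D(normalTex, texCoord).xyz * 2.0 - 1.0);
        float ndotl = dot(n, lightDir);
        vec3 lighting = step(0.0, ndotl) * ndotl * lightColor;
        // Softer alternative: vec3 lighting = smoothstep(0.0, 0.05, ndotl) * lightColor;
        gl_FragColor = vec4(lighting, 1.0);
    }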

  6. #6
    Advanced Member, Frequent Contributor (joined May 2005, Prague, Czech Republic, 913 posts)

    Re: Artifact with deferred shading on ATI cards

    Quote Originally Posted by ZbuffeR
    Maybe you can avoid the artifacts by replacing the test with a step or smoothstep operation.
    The purpose of the logical operation (i.e. the if branch), at least in my case, is to speed up the rendering of areas which do not need those additional calculations, on cards that have dynamic branching capabilities. A purely mathematical operation will not give that speedup.

    It might be possible to work around the artifact using something like:
    Code :
    result = 0.0;
    if ( some_condition( x ) ) {
        result = do_calculations();
    }
    result *= mathematical_equivalent_of_the_condition( x );
    use( result );
    I will test it on Monday.

  7. #7
    Member, Regular Contributor (joined Mar 2007, California, 396 posts)

    Re: Artifact with deferred shading on ATI cards

    ATI is pretty good about fixing errors if you submit a complete bug report. I'll send something to them soon.

  8. #8
    Advanced Member, Frequent Contributor (joined May 2005, Prague, Czech Republic, 913 posts)

    Re: Artifact with deferred shading on ATI cards

    I tried the workaround; however, simply calculating the result of the if branch into a separate variable that is added after the branch, instead of doing the addition directly inside the branch, already stopped the artifact from appearing in my case, so this is almost certainly a driver problem.
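
    As a guess at what that change looks like in shader terms (invented names again, reusing the pattern from the earlier sketch):
    Code :
    #version 120
    // Hypothetical before/after of the change described above.
    uniform sampler2D baseTex;
    uniform sampler2D extraTex;
    varying vec2 texCoord;
    void main()
    {
        vec4 color = texture2D(baseTex, texCoord);
        float blend = color.a;
        // Before (reportedly showed the artifact): add directly inside the branch.
        //   if (blend > 0.0) color.rgb += blend * texture2D(extraTex, texCoord).rgb;
        // After (reportedly artifact-free): compute into a separate variable
        // inside the branch and add it after the branch.
        vec3 extra = vec3(0.0);
        if (blend > 0.0) {
            extra = blend * texture2D(extraTex, texCoord).rgb;
        }
        color.rgb += extra;
        gl_FragColor = color;
    }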

  9. #9
    Junior Member, Regular Contributor (joined Apr 2001, 180 posts)

    Re: Artifact with deferred shading on ATI cards

    You might already be aware of this (in that case, please ignore it), but perhaps it is related to what's described here: http://www.opengl.org/pipeline/article/vol003_6/ ?

  10. #10
    Member, Regular Contributor (joined Mar 2007, California, 396 posts)

    Re: Artifact with deferred shading on ATI cards

    I am using a GL_NEAREST filter on the depth and normal textures, and no downsampling is used, so I do not think that article describes my problem. But I also didn't read it very carefully or fully understand what it was talking about.
