Why are my UV coordinates being extrapolated without MSAA?

I am writing a project that uses a tile-based rendering system and have encountered the very common ‘lines between supposedly touching tiles’ issue.

First, I checked that all of my tiles were touching when uploading my data -> all okay: definitely the correct UV coordinates and data, with no obvious rounding errors.
Second, I disabled MSAA and checked whether it was the UV coordinates bleeding or the floating-point positions being off -> the UV coordinates appeared to be bleeding in the texture atlas.

I wrote the following simple fragment shader to test whether v_uv stays within the UV range I expect:


#version 330

uniform sampler2D tex;

in vec2 v_uv;
out vec4 f_color;

void main() {
    vec4 color = texture(tex, v_uv);

    float tile_size = 128.0 / 2048.0;
    float top_left_x = 256.0 / 2048.0;
    float top_left_y = 256.0 / 2048.0;

    if (v_uv.x < top_left_x) {
        color = vec4(0.0, 0.0, 0.0, 1.0);
    }

    if (v_uv.y < top_left_y) {
        color = vec4(0.0, 0.0, 0.0, 1.0);
    }

    if (v_uv.x > top_left_x + tile_size) {
        color = vec4(0.0, 0.0, 0.0, 1.0);
    }

    if (v_uv.y > top_left_y + tile_size) {
        color = vec4(0.0, 0.0, 0.0, 1.0);
    }

    f_color = color;
}

To my surprise, the atlas now showed black lines instead of the original bleeding. How can that be, and why is the UV being extrapolated outside my primitive’s edges by a small amount?

This happens even though I uploaded the UV coordinates (0.125, 0.125) to (0.1875, 0.1875) for every tile (the texture is located in the 2nd column and 2nd row: a 128px tile in a 2048px atlas).

TL;DR: Why is the fragment shader given a v_uv out of the primitive’s bounds and why does the vertex shader receive the correct UV, but fragment not?

How are you storing the texture coordinates on the GPU? If you use a normalised type, the denominator will be 2^n - 1, so you’ll get rounding errors.

I am uploading two buffers, position data (world-space) and UV data (normalized between 0.0 and 1.0 for power-of-two texture atlas), but the only numbers being uploaded are: 0.125 and 0.1875.

These two numbers are represented exactly in IEEE-754, so I don’t see any reason why there would be rounding errors - unless my understanding is wrong?
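This can be checked directly; here is a small Python sketch showing that 0.125 and 0.1875 are exact in IEEE-754, but that the same values would *not* survive a normalised-integer attribute format (the decode denominator is 2^n - 1, as pointed out above — the GL_UNSIGNED_SHORT case here is illustrative):

```python
# 0.125 (1/8) and 0.1875 (3/16) are dyadic rationals, so IEEE-754
# binary floats hold them exactly.
from fractions import Fraction

assert Fraction(0.125) == Fraction(1, 8)
assert Fraction(0.1875) == Fraction(3, 16)

# If the same values were uploaded as a normalised 16-bit type
# (e.g. GL_UNSIGNED_SHORT with normalized = GL_TRUE), the GPU decodes
# round(v * 65535) / 65535, and the round trip is no longer exact:
def unorm16_roundtrip(v):
    return round(v * 65535) / 65535

assert unorm16_roundtrip(0.125) != 0.125    # 8192 / 65535, slightly high
assert unorm16_roundtrip(0.1875) != 0.1875  # 12288 / 65535, slightly high
```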

This is done once per frame to test and create the issue:

[ol]
[li]Initial setup, run once … then:[/li]
[li]Update model, view, and projection matrices[/li]
[li]Upload all UV data to the UV buffer[/li]
[li]Bind texture atlas to texture unit 0[/li]
[li]Upload all position data to the position buffer[/li]
[li]Bind vertex array with UV and position buffers[/li]
[li]Draw vertex array with GL_TRIANGLES[/li]
[/ol]

The black lines do, however, flicker and change location depending on where the camera is located - but why should a position change affect the UV coordinates?

Here is a trace of all commands being executed, very simple:

Upon further inspection, I believe that my UV coordinates are being imprecisely interpolated between the vertex and fragment stages.
The vertex shader receives the exact floats, but the fragment shader receives a float just outside my bounds.

Why is this?

[QUOTE=ThomasRue;1292072]The vertex shader receives the exact floats, but the fragment shader receives a float just outside my bounds.

Why is this?[/QUOTE]
It’s impossible to say based upon the limited information provided.

Have you tried using transform feedback mode to capture the vertex shader outputs?

If anti-aliasing is enabled (either MSAA or GL_POLYGON_SMOOTH), it’s possible for vertex attributes to be extrapolated (i.e. the sample location can be outside the triangle).

[QUOTE=GClements;1292074]It’s impossible to say based upon the limited information provided.

Have you tried using transform feedback mode to capture the vertex shader outputs?

If anti-aliasing is enabled (either MSAA or GL_POLYGON_SMOOTH), it’s possible for vertex attributes to be extrapolated (i.e. the sample location can be outside the triangle).[/QUOTE]

No anti-aliasing is enabled - neither MSAA nor GL_POLYGON_SMOOTH. I have checked the output of the vertex shader and it is correct: no errors in the UVs, and they are exactly as I sent them.
It’s definitely happening between the vertex shader and fragment shader - perhaps because a primitive edge lies directly on the centre of a pixel, causing inaccuracy.

Is it possible that the polygon interpolation causes the UV to go out of bounds by a tiny margin?

Really frustrated by this.

Without anti-aliasing, sample locations correspond to pixel centres. If the edges lie exactly on pixel centres, then some sample locations will be on the edge. Even without rounding error, that can result in the texture coordinates being interpolated in a 0:1 ratio, i.e. one set of coordinates will be used directly. So if a coordinate is 0.125 (2/16) on one edge and 0.1875 (3/16) on the other, the “interpolated” coordinate may be exactly 0.1875, which will end up as 0.1875 * 2048.0 = 384 = 128 * 3 after denormalisation (i.e. on the boundary between texels from different tiles). Linear filtering will produce a 50:50 blend, while nearest filtering could select either texel.
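The arithmetic above can be checked directly; a short Python sketch, assuming the 2048px atlas and 128px tiles from the question:

```python
# An "interpolated" coordinate of exactly 0.1875 denormalises onto a
# texel boundary shared with the adjacent tile in the atlas.
atlas_size = 2048.0
tile_size = 128.0

u = 0.1875                      # right/bottom edge of the tile's UV range
texel = u * atlas_size          # denormalised texel coordinate
assert texel == 384.0           # 0.1875 * 2048 = 384
assert texel == 3 * tile_size   # exactly on the tile boundary

# With GL_LINEAR, texel centres sit at i + 0.5, so sampling at texel
# coordinate 384.0 blends texels 383 and 384 in a 50:50 ratio --
# and texel 384 belongs to the next tile over.
left_texel = int(texel - 0.5)   # 383: last texel of this tile
right_texel = int(texel + 0.5)  # 384: first texel of the adjacent tile
assert (left_texel, right_texel) == (383, 384)
```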

With linear filtering, edges which are close to pixel centres may result in blending with the adjacent tile in an atlas if the sample location is less than half a texel from the edge of the triangle. With nearest filtering, the sample point would have to be either exactly on the edge or close enough to the edge for rounding error to push it over; which probably means that it needs to be exactly on the edge.

For a 2D tile map, this is most likely to occur if either of the viewport dimensions is odd and you fit an even number of tiles to that dimension. That would result in the centre of the viewport (and thus a tile boundary) lying exactly on the centre of a pixel.
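To illustrate the odd-dimension case (a Python sketch with hypothetical viewport widths; pixel centres sit at i + 0.5 in window coordinates):

```python
# With an odd viewport width, the viewport centre (width / 2) lands
# exactly on a pixel centre (i + 0.5); with an even width it lands on
# a pixel boundary instead.
def viewport_centre_on_pixel_centre(width):
    centre = width / 2.0             # window-space x of the viewport centre
    return (centre % 1.0) == 0.5     # True iff it coincides with a pixel centre

assert viewport_centre_on_pixel_centre(801)        # 400.5: on a pixel centre
assert not viewport_centre_on_pixel_centre(800)    # 400.0: on a pixel boundary
```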

In my vertex shader, none of the UVs sent out are less than 0.125 in the horizontal direction. This is tested and true.
In my fragment shader, the following condition passes, which should be impossible given the above: if (v_uv.x < 0.125) {

Regardless of whether a texture is used, the attribute is being slightly skewed between the vertex shader and fragment shader, outside of my control.
With no sampling and no textures bound, any attribute I send out of my vertex shader and receive in my fragment shader will still land slightly outside my primitive’s mathematical edges.

From researching, this could be due to the barycentric coordinates used for interpolation being slightly off, causing a rounding error during rasterization.
Nevertheless, I think it is better not to trust the attributes received by the fragment shader, so I will have to architect a solution that tolerates marginal errors.

This implies that you have edges which are almost exactly coincident with pixel centres. For a 2D tile grid, it should be reasonably straightforward to prevent this from occurring. I.e. don’t allow the viewport to have an odd width or height, don’t do things which will result in the tile grid being offset by exactly half a pixel. If you don’t do that, having edges close enough to pixel centres for this to occur should be highly unlikely.
