Biasing cubemap causes faceting

I’m using the texture(gsamplerCube sampler, vec3 P, [float bias]) function to fetch samples for a number of lighting effects. Samples look as expected when there is no bias, but the higher the bias gets, the more faceting I see on the surface. In other words, the triangles on my surface become distinct, as if there were little or no interpolation between the varyings I’m inputting as texture coordinates. I would expect to see a lower-resolution mipmap according to my bias, but no discontinuity between triangles. The problem looks worst when the bias is around 7.0 or 8.0, and it seems to be gone once the bias is up around 20.0.

This problem occurs on Linux on both NVidia and Intel graphics, so I doubt it’s a driver issue. I’ve also tried the deprecated textureCube function and gotten the same results.

Any suggestions?

You might post a picture with one such biased cubemap lookup in play. Also, to eliminate other factors (such as your lighting) as contributors, consider posting the result of this single cubemap lookup written directly as the fragment color. That should help accentuate the effect you’re talking about.

Please also post a GLSL code snippet showing what value you’re feeding into the lookup and how that’s computed.

I assume you’ve defined cubemap mipmaps (sounds like it) with verified-good data and have linear mag + trilinear min filtering enabled.

Also, I doubt this is related to your problem, but does your GPU have seamless cubemap support and have you enabled it?
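For reference, the state worth double-checking looks something like this. This is just a sketch, assuming a GL 3.2+ context, with cubetex standing in for your texture object:

glEnable(GL_TEXTURE_CUBE_MAP_SEAMLESS);   // core since GL 3.2
glBindTexture(GL_TEXTURE_CUBE_MAP, cubetex);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);  // trilinear min
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);                // linear mag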

Thanks for the reply, Dark Photon. Here’s some more info. You’re correct that seamless does not affect the problem. Min filter is GL_LINEAR_MIPMAP_LINEAR and mag filter is GL_LINEAR.

vertex shader:

#version 130

uniform mat4 invViewMatrix;  // note: unused in this cut-down shader
varying vec3 wsNormal;

void main(void){
    wsNormal = normalize(gl_Normal);  // Implicit lib doesn't normalize?
    gl_Position = ftransform();
}

fragment shader:

#version 130

uniform samplerCube cubetex;
varying vec3 wsNormal;

void main(void){
    gl_FragColor = vec4(wsNormal, 1.0);
}

Here’s my input normal data. As you can see, everything is smoothly interpolated across all triangles.
[attachment 1222]

And here is the cubemap lookup happening with no bias:
[attachment 1223]

Here is the cubemap lookup with a bias of 8.0 and some bad faceting, but I expect it to be smooth:
[attachment 1224]

Finally, here is a bias of 20.0 and the faceting is gone:
[attachment 1221]

Please post the fragment shader with the cube map lookup.

Oops. That was sloppy.


#version 130

uniform samplerCube cubetex;
varying vec3 wsNormal;
 
void main(void){
    gl_FragColor = texture(cubetex, wsNormal, 8.0);
}

What’s the size of the cube map and what’s the texture pixel format?

I have tried both 512x512 and 2048x2048. Results are the same each way. The internal format is GL_RGB.

Hmm, interesting. I don’t know for sure what the problem is. I’d say it might be a driver bug, but that’s much less likely given that NVidia and Intel show the same result.

If you post a short, standalone test program that illustrates the problem, lots of folks here could give you fast feedback on the results on many GPUs.

A couple observations about what you posted above:

Your faceted “lod bias 8” picture doesn’t look like it’s a mipmap of the same texture as the full-res, no-bias one. Is it?

Parts of the lod bias 8 picture look like large cubemap texels being stretched over large areas of the surface. However, in other places there is a break in continuity, almost suggesting that the wsNormal being fed into texture() for adjacent points on separate triangles is somehow very different. Are the exact same vertex normals being fed in for all vertices on the surface, regardless of which triangle they’re in? Probably so, given that your input normal data is smooth.

You could try normalizing wsNormal before you do the lookup in the fragment shader, but that shouldn’t make any difference and you shouldn’t have to.

You said that’s LOD bias 8? I haven’t used lod bias on the texture() function, but I think that’d imply it’s looking up around the ~2x2 mipmap level if you had 512x512 faces. Your “lod bias 8” picture seems to have more detail than that, though that might be introduced by the texture filtering. The 2048x2048 cubemap should be looking up into the ~8x8 MIP, I think. It’s definitely puzzling that you say the results are the same regardless of which resolution of cubemap you’re using. Is the native cubemap resolution >= 2048, or did you just upsample a 512x512 cubemap to 2048x2048?
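Back-of-the-envelope, assuming the lookup lands at level ≈ bias when the unbiased LOD is near 0:

// 512x512 faces:   512 / 2^8  = 2  ->  ~2x2 texels at level 8
// 2048x2048 faces: 2048 / 2^8 = 8  ->  ~8x8 texels at level 8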

You might try using an artificial cubemap with a wider range of colors procedurally assigned based on direction. Also, you might change the cubemap texture filtering to NEAREST and NEAREST_MIPMAP_NEAREST so you can see more clearly what’s going on. Using a sphere instead of some strange shape might help simplify this further. Also consider using textureLod() for testing instead, which’ll let you nail down which mipmap you’re doing the lookup from. With this and nearest filtering you should be able to verify that at least the data stored in each mipmap level looks legit when sampled by the GPU.
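A minimal debug fragment shader along those lines might be (the level constant is just whichever mip you want to inspect):

#version 130

uniform samplerCube cubetex;
varying vec3 wsNormal;

void main(void){
    // Explicit-LOD lookup: every fragment samples the same mip level,
    // so any per-triangle discontinuity from the derivatives disappears.
    gl_FragColor = textureLod(cubetex, normalize(wsNormal), 3.0);
}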

I’ll keep thinking. This is definitely a puzzler.

I’m wondering if it’s a combination of bogus mipmaps (i.e. the different levels don’t resemble each other) and the normal’s derivatives being discontinuous across triangle edges (meaning that the fragments on one side of an edge are using a different mipmap level to those on the other side).

The latter is to be expected, and shouldn’t be an issue if the mipmaps are correct. But if the mipmaps aren’t correct (and the fact that the heavily-biased version is dark brown while the unbiased version is a light blue-grey seems to indicate this), then discontinuities in the LoD would translate into discontinuities in the resulting colour.

[QUOTE=GClements;1279008]I’m wondering if it’s a combination of bogus mipmaps (i.e. the different levels don’t resemble each other) and the normal’s derivatives being discontinuous across triangle edges (meaning that the fragments on one side of an edge are using a different mipmap level to those on the other side).

The latter is to be expected, and shouldn’t be an issue if the mipmaps are correct. But if the mipmaps aren’t correct (and the fact that the heavily-biased version is dark brown while the unbiased version is a light blue-grey seems to indicate this), then discontinuities in the LoD would translate into discontinuities in the resulting colour.[/QUOTE]
That would also explain why bias 20 gets rid of the discontinuities: with a 2048x2048 base, the chain only has levels 0 through 11, so a bias of 20 clamps to the smallest level, and at that point you’re definitely sampling from the lowest mip all the time.

Thanks for the suggestions everyone. I played around with your ideas last night, but still no luck. The original cubemap resolution is 2048. I have played around with it at 2048, 512, and now 256. The change in resolution is observable, but the faceting does not improve.

Tried normalizing wsNormal in the fragment shader. No difference.

LINEAR_MIPMAP_LINEAR and NEAREST_MIPMAP_LINEAR both appear to have this extra faceting that I do not expect. I tried other filters and believe they are behaving as expected.

The problem persists if I use a plain old lat-lon sphere instead of my blobby geometry.

The mipmaps appear to be correct. I can observe them more clearly when using them as background imagery. I believe the difference in intensity in my screenshots is because I was inconsistent about which side of the object I was recording. If I build a simpler test program I’ll also build simpler navigation that I can control consistently. Anyway, maybe the biasing happens per-triangle instead of per-vertex and that’s causing the problem. Not sure if that’s what GClements is suggesting. Is it?

I forgot to try Dark Photon’s suggestion about textureLod() last night. I’ll see what I can learn from that, maybe tonight.

One extra thing I tried was using a 2D texture and sampling it with wsNormal.xy. Doing that exhibits the same faceting problem. Also tried stripping every extra bit of OpenGL state setting out of my program with no effect.

I searched the Internet a bit more and still can’t find any other complaints about this problem. I’ve used the bias on texture() in the past with good results, but that was for assembling pretty multitextured starfield backgrounds; I don’t think I’ve ever tried it on curvy geometry. I guess the next big thing to try is to build a simple test program.

LoD is calculated per fragment from the approximated derivatives dFdx() and dFdy(). As the normal is linearly interpolated for each triangle, the derivatives (and thus the LoD) will be discontinuous across triangle edges.

Normally this shouldn’t matter, but if the different mipmap levels aren’t consistent, it may be noticeable.
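Roughly, for a 2D texture the hardware’s estimate is along these lines (a simplification of the spec’s formula; the cubemap case does the equivalent on the coordinates projected onto the selected face):

// Sketch of the isotropic LOD estimate for a 2D texture of size texSize.
// dFdx/dFdy are constant across each triangle, so lod jumps at every edge.
float approxLod(vec2 uv, vec2 texSize){
    vec2 dx = dFdx(uv) * texSize;
    vec2 dy = dFdy(uv) * texSize;
    float rho = max(length(dx), length(dy));
    return log2(rho);  // any bias is then added on top of this
}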

How are the mipmaps being generated? With glGenerateMipmap(), or manually?

Okay, I now have at least some understanding of what GClements is saying about dFdx() and dFdy(). I read up on that stuff a bit more. When you call texture(sampler, texcoord), it’s the same as calling textureGrad(sampler, texcoord, dFdx(texcoord), dFdy(texcoord)). If you compute the output of dFdx() and scale it, you can see the discontinuities that GClements is talking about:
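Something like the following (the x50 scale is arbitrary, just to make the values visible):

#version 130

varying vec3 wsNormal;

void main(void){
    // The derivative is constant per triangle, which makes the
    // faceting obvious when written out as a color.
    gl_FragColor = vec4(abs(dFdx(wsNormal)) * 50.0, 1.0);
}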

[attachment 1225]

I could be off on this next part. The relationship between mipmap level and derivative is non-linear (see the “Mipmap level calculation using dFdx/dFdy” thread on the Khronos Forums), so if you try to bias it linearly you might exaggerate any discontinuities. That might be the problem I’ve been seeing, but I would need to dig into it more to really understand it.

If that really is the problem, you would probably need to generate your own per-vertex derivatives and interpolate them for use with textureGrad() in the fragment shader. However, textureGrad() doesn’t even work on my Intel HD 4000 graphics under Linux, so unless there’s a newer driver that fixes that, I’m not going to go any further.
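For the record, the idea would have been something like this, with hypothetical gradX/gradY varyings computed on the CPU per vertex so they interpolate smoothly across edges:

#version 130

uniform samplerCube cubetex;
varying vec3 wsNormal;
varying vec3 gradX;  // hypothetical per-vertex gradient, interpolated
varying vec3 gradY;  // hypothetical per-vertex gradient, interpolated

void main(void){
    gl_FragColor = textureGrad(cubetex, wsNormal, gradX, gradY);
}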

Fortunately for me, Dark Photon’s suggestion about textureLod() works fine for what I’m doing. You might get some shimmering on edges with textureLod() if you use high enough mipmap levels, but I’m not seeing anything like that in practice.
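That is, the lookup in my fragment shader is now essentially:

gl_FragColor = textureLod(cubetex, wsNormal, 8.0);  // was: texture(cubetex, wsNormal, 8.0)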

Anyway, thanks for the help, everyone. Here’s the final lighting with some nice, warm ambient and subtle Fresnel reflections. The part I never mentioned in my original post was that this texture problem looked really bad when combined with environment-mapped bump mapping, so it’s good to have that fixed too.

[attachments 1226 and 1227]

Oh, and to answer an earlier question, the mipmaps are generated with glGenerateMipmap(). You can see how that looks pretty bad in the background imagery. Something I’ll need to work on later is blurring the mipmaps a little.
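That is, roughly (assuming cubetex is the texture id and all six level-0 faces have been uploaded):

glBindTexture(GL_TEXTURE_CUBE_MAP, cubetex);
glGenerateMipmap(GL_TEXTURE_CUBE_MAP);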
