Normal Mapped Sphere Looks Flat Shaded

Hi All,

I’ve tried googling around for this, but all I can turn up are (functioning) tutorials. I’ve got a normal map for a sphere that I’m rendering in OpenGL 3.0 (context created with SDL2, using my own GLSL 1.20 shaders).

The normal map itself looks OK (if a bit crude), and even in my render I can see that the normal map values come out pretty smooth (if rather ugly).

[Attached image: screenshot of the rendered normal-mapped sphere]

However, between faces I get these noticeable creases. I tried turning glShadeModel(GL_SMOOTH) on, but I don’t know whether that even has an effect when you’re using shaders. There is no actual texture map on the sphere, just a solid diffuse color. For the normal mapping, the light direction and half vectors are computed per vertex (using uploaded tangent and normal vectors) and passed as varyings to the fragment shader, but I thought that would be OK.

Is there a setting I need to flip on? Or is it my data that’s janked up?

For the record, that sphere (and its UVs, normals, and tangents) was exported from Blender using the IQM format (though I don’t think that’s the issue, since the IQM SDK comes with an example where things look pretty smooth).

The reason is almost certainly that the vectors forming your tangent space (tangent, bitangent, normal) are discontinuous at the edges of faces.

Either you aren’t interpolating them correctly (or at all), or the mesh was exported with per-face vectors rather than per-vertex vectors.
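
A quick way to check is to bypass the lighting entirely and visualize the interpolated per-vertex normal as a color. A minimal debugging sketch (v_Normal is a hypothetical varying you’d fill with the transformed normal in your vertex shader):

// Debug fragment shader: draw the interpolated vertex normal as a color.
// Per-face normals show up as solid-colored facets; per-vertex normals
// give a smooth gradient across the sphere.
varying vec3 v_Normal;

void main(){
	vec3 n = normalize(v_Normal);
	gl_FragColor = vec4(0.5 * n + 0.5, 1.0); // map [-1,1] to [0,1]
}

Hard color breaks at the face edges would confirm that the exported normals (and almost certainly the tangents) are per-face.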

Interesting. I’ll post some code when I get home showing how I interpolate the values. (I basically copied this tutorial’s logic verbatim; the only difference is that I don’t use glNormal, and instead upload a vec3 attribute manually.)

It’s possible that the normals I’ve got are face normals, but I find it suspicious that the normal map itself looks smooth. My lighting equation uses that normal map along with varying Half and Light vec3s computed in the vertex shader, and all light contribution is calculated in the fragment shader using those interpolated values.

Face normals and improper interpolation would explain the creases, but wouldn’t that also result in discontinuities along the normal map? Well… looking at the picture I suppose there are discontinuities… hmmm.

Hi Guys,

I hate to dump code on you, but here’s a summary of my vertex and fragment shaders. Disclaimer: to keep things clear I edited the shaders down and removed some calculations, like distance attenuation (the vertex shader’s still pretty long!), so there may be some syntax errors here and there.

Here’s the vertex shader:


// Light Struct, just owns position and color (intensity)
struct Light{
	vec3 P;  // Position in world space
	vec3 I;  // Intensity (light color as RGB float)
};
uniform Light u_Light;

uniform mat4 MV_w;  // World space transform of geometry
uniform mat4 MV_e;  // Camera transform, brings world space => eye space
uniform mat3 N;     // Normal matrix, N = mat3( inverse( transpose( MV_e * MV_w ) ) )
uniform mat4 P;     // Projection matrix, takes eye space to screen space

//Vertex Attributes
attribute vec2 a_TexCoord; // Texture coordinate
attribute vec3 a_Position; // World position
attribute vec3 a_Normal;   // Vertex normal, normalized
attribute vec4 a_Tangent;  // Vertex tangent (normalized? IQM gives me a vec4...)

// To be interpolated for the fragment shader
varying vec2 v_TexCoord; // the interpolated texture coordinate

// These two are sent to the fragment shader in tangent space; x=>tangent, y=>bitangent, z=>vertex normal
varying vec3 v_HalfVec;  // the half vector between the eye vector and the light vector
varying vec3 v_LightVec; // The light direction ( light position - vertex position )

// Given three (unit) basis vectors, take v to a new space
vec3 changeBasis(vec3 v, vec3 nX, vec3 nY, vec3 nZ){
	vec3 v_T = vec3(0.0);
	v_T.x = dot(nX, v);
	v_T.y = dot(nY, v);
	v_T.z = dot(nZ, v);
	return v_T;
}

void main(){
	// Get world and eye position
	vec4 w_Pos = MV_w * vec4(a_Position, 1.0); // World space vertex position
	vec4 e_Pos = MV_e * w_Pos; // Eye space position
	
	// Interpolate texture coordinate
	v_TexCoord = a_TexCoord;
	
	// new basis (note that N is the normal matrix for the eye space transform; is that bad?)
	vec3 n = normalize(N * a_Normal); 
	vec3 t = normalize(N * a_Tangent.xyz); // What about w?
	vec3 b = cross(n, t);
	
	// Transform light direction to tangent space
	vec3 L = u_Light.P - w_Pos.xyz;
	v_LightVec = normalize(changeBasis(L, t, b, n));
	
	// Transform eye vector (- eye position) to tangent space
	vec3 E = -e_Pos.xyz;
	vec3 E_t = normalize(changeBasis(E, t, b, n));
	
	// Find half vector, in tangent space
	v_HalfVec = 0.5 * (v_LightVec + E_t); // These are both unit, so is this OK?
	
	// Find screen space position
	gl_Position = P * e_Pos;
}

and the fragment shader:


// We still need the uniform light
struct Light{
	vec3 P;  // Position in world space
	vec3 I;  // Intensity (light color as RGB float)
};
uniform Light u_Light;

// Uniform material color, passed in from host
uniform vec3 u_Color;

// Normal Map
uniform sampler2D u_NormalMap;

// Same varying vectors from before
varying vec2 v_TexCoord; 
varying vec3 v_HalfVec;  
varying vec3 v_LightVec;

void main(){
	// Grab RGB color from normal map, take from [0,1] to [-1,1]
	vec3 nrm = 2.0 * texture2D(u_NormalMap, v_TexCoord).rgb - 1.0;
	
	// Use cosine law to find light contribution
	float nDotL = max(0.0, dot(nrm, v_LightVec));
	float nDotH = max(0.0, dot(nrm, v_HalfVec));
	float powerFactor = nDotH == 0.0 ? 0.0 : pow(nDotH, 100.0); // arbitrary shininess
	
	// Assume light color, specular color are both white
	vec3 lightColor = u_Light.I * (nDotL + powerFactor);
	
	gl_FragColor = vec4(u_Color * lightColor, 1.0);
}

Long, but nothing too crazy. I’m wondering if I’m misunderstanding what my tangent vector is and/or misusing it in the shader. For example, IQM gives me a vec4 tangent, but the first three components seem to form a unit vector; I assume the fourth is a length. I’m also not sure whether the basis vector formed from the tangent is supposed to be N * a_Tangent.xyz rather than MV * a_Tangent.
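
If that fourth component is actually a handedness sign rather than a length (that seems to be a common convention for exported tangents, though I haven’t confirmed it’s what IQM does), then I suppose the basis construction in the vertex shader would look something like this:

	// Guessing here: if a_Tangent.w is a +/-1 handedness sign, it flips
	// the bitangent for mirrored UVs rather than scaling anything.
	vec3 n = normalize(N * a_Normal);
	vec3 t = normalize(N * a_Tangent.xyz);
	vec3 b = cross(n, t) * a_Tangent.w;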

Anyway, I can’t really expect any solutions from posting the above; the error is likely in the data I’m uploading or in my use of the tangent vector. I just thought I’d post it in case anyone spots a bug and wants to yell at me…

You need to normalise v_LightVec and v_HalfVec in the fragment shader. The result of interpolating between two unit-length vectors is a vector of less than unit length.
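
Something along these lines at the top of main() in your fragment shader, before computing the lighting terms:

	// Re-normalize the interpolated varyings; linearly interpolating
	// between two unit vectors produces a shorter-than-unit vector.
	vec3 lightVec = normalize(v_LightVec);
	vec3 halfVec  = normalize(v_HalfVec);
	float nDotL = max(0.0, dot(nrm, lightVec));
	float nDotH = max(0.0, dot(nrm, halfVec));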

True, you’re right. I made that change, but it didn’t solve the crease issue. It must be something in my data…

EDIT:
Yup, that was it. When I exported the sphere from Blender I didn’t have smooth shading turned on. I didn’t think it would matter, since my normals were being interpolated in the shader, but maybe Blender exports face normals unless that setting is turned on?

Anyway, thanks for the responses and for looking at my code.

[QUOTE=mynameisjohn;1272390]
Yup, that was it. When I exported the sphere from Blender I didn’t have smooth shading turned on. I didn’t think it would matter, since my normals were being interpolated in the shader, but maybe Blender exports face normals unless that setting is turned on?[/QUOTE]
I would assume that exporting with flat shading will give each face a distinct set of vertices using the face normal, rather than vertices being shared between adjacent faces.