gl_FogFragCoord with no vertex shader

card: nvidia 6800GT
driver: 81.95

I have a simple shader program that consists of just a fragment shader.
The gl_FogFragCoord built-in varying variable seems to always be set to zero, even though I have exp2 fog enabled in the application.
I’ve tried “glHint(GL_FOG_HINT, GL_NICEST)” but get the same result.

vec4 applyFog(vec4 fragColour)
{
	const float LOG2E = 1.442695;	// = 1/ln(2) = log2(e)
	float fog = exp2(-gl_Fog.density * gl_FogFragCoord * LOG2E);
	fog = clamp(fog, 0.0, 1.0);
	return mix(gl_Fog.color, fragColour, fog);
}

void main()
{
	gl_FragColor = applyFog(gl_Color);
}

Any ideas?

don’t really have an answer, but i thought i’d keep your thread alive.

i can’t help but wonder if having all that built-in opengl state is really a good thing. seems like a lot of grunt work for the driver guys, and for what? so we don’t have to set some table entry ourselves? i guess it’s the whole opengl experience thing, or something. wish i could be of more use, but as i’m learning glsl myself, and being simultaneously perplexed and irritated by some of the conventions i’m seeing, i’ve not much to offer in the way of positive comment.

I think the problem comes from the fact that nVidia cards don’t use the fog coordinate when you use depth-based fog.

If you use:
glFogi(GL_FOG_COORDINATE_SOURCE, GL_FOG_COORDINATE);
then gl_FogFragCoord has a value.

But if you use the default:
glFogi(GL_FOG_COORDINATE_SOURCE, GL_FRAGMENT_DEPTH);
then gl_FogFragCoord will be zero…

Maybe the problem is that fog isn’t defined precisely enough in the GLSL spec. Look at issue number 81 in the GLSL spec; I don’t know what to make of it.

Also, on ATI cards I think fog isn’t supported in GLSL at all…

This is a bug with the calculation of gl_FogFragCoord that will be fixed in a future driver release. You can work around this bug like so:

float fog = exp2(-gl_Fog.density * abs(gl_FogFragCoord) * LOG2E);
The abs built-in function is free on NVIDIA hardware, so this won’t cost you any performance and will continue to work correctly once the issue is fixed in the driver.

Originally posted by Groovounet:
[b]I think the problem comes from the fact that nVidia cards don’t use the fog coordinate when you use depth-based fog.

If you use:
glFogi(GL_FOG_COORDINATE_SOURCE, GL_FOG_COORDINATE);
then gl_FogFragCoord has a value.

But if you use the default:
glFogi(GL_FOG_COORDINATE_SOURCE, GL_FRAGMENT_DEPTH);
then gl_FogFragCoord will be zero…

Maybe the problem is that fog isn’t defined precisely enough in the GLSL spec. Look at issue number 81 in the GLSL spec; I don’t know what to make of it.

Also, on ATI cards I think fog isn’t supported in GLSL at all…[/b]
The GL_EXT_fog_coord extension allows the application to provide its own fog coordinate (per-vertex). If FOG_COORDINATE_SOURCE is set to FOG_COORDINATE, applications are responsible for specifying the fog coordinate via glFogCoord/glFogCoordPointer. If FOG_COORDINATE_SOURCE is set to FRAGMENT_DEPTH, the fixed function pipeline or vertex shader is responsible for calculating the fog coordinate.
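An application-side sketch of those two cases might look like the following (hypothetical helper names; assumes a current GL context with the GL 1.4 / EXT_fog_coord entry points — state setup only, not runnable on its own):

```c
#include <GL/gl.h>

/* Case 1: the application supplies the fog coordinate per vertex
   via a fog-coordinate array (could also use glFogCoordf per vertex). */
void useAppFogCoords(const GLfloat *fogCoords)
{
    glFogi(GL_FOG_COORDINATE_SOURCE, GL_FOG_COORDINATE);
    glEnableClientState(GL_FOG_COORDINATE_ARRAY);
    glFogCoordPointer(GL_FLOAT, 0, fogCoords);
}

/* Case 2 (the default): the fog coordinate comes from fragment depth,
   i.e. the fixed-function pipeline or the vertex shader computes it. */
void useFragmentDepth(void)
{
    glFogi(GL_FOG_COORDINATE_SOURCE, GL_FRAGMENT_DEPTH);
}
```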

Here is an example vertex shader for use when FOG_COORDINATE_SOURCE is set to FOG_COORDINATE:

void main()
{
    gl_Position = ftransform();
    gl_FogFragCoord = gl_FogCoord; // pass through application specified fog coordinate
}

This is what your vertex shader would look like if FOG_COORDINATE_SOURCE is set to FRAGMENT_DEPTH:

void main()
{
    gl_Position = ftransform();
    vec4 eyeCoordPos = gl_ModelViewMatrix * gl_Vertex;
    gl_FogFragCoord = abs(eyeCoordPos.z/eyeCoordPos.w);
}

And this is what my vertex shader looks like when FOG_COORDINATE_SOURCE is set to FRAGMENT_DEPTH.

Thank you for this explanation.

Originally posted by jra101:
This is a bug with the calculation of gl_FogFragCoord that will be fixed in a future driver release.
Ok, thanks for clarifying this. I’ll remove my unnecessary vertex shader next chance I get.
bonehead, thanks for resurrecting this thread so we actually got an answer.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.