Distance-based fog

Hey :slight_smile:
I am kinda new to GLSL and I am looking for ideas for a post-processing shader that adds distance-based fog to the scene, like in this image:
(I am not allowed to add URLs :(, search for " Atmospheric effects in games Igor Jerkovic")
This is a space scene with some spaceships and asteroids. As you can see, the ships and asteroids in the background are tinted with the fog color based on their distance to the camera, which makes it easier to judge the scale of the objects. I guess I can work with the depth buffer to achieve this effect, but my first experiments failed. My next try was a light-scattering shader. It works fine, but it doesn't add fog to my (space) scene:

	// radial blur toward the light source ("god rays" / light scattering)
	vec2 deltaTextCoord = vec2(uv - starpos.xy);
	vec2 textCoo = uv.xy;
	deltaTextCoord *= (1.0 / 100.0) * density;   // step size toward the star
	float illuminationDecay = 1.0;

	for (int i = 0; i < 400; i++) {
		textCoo -= deltaTextCoord;               // march toward the star
		vec3 samp = texture2D(texture, textCoo).xyz;
		samp *= illuminationDecay * weight;
		EndColor += samp;
		illuminationDecay *= decay;              // fade samples further along the ray
	}

But Igor Jerkovic says that this effect is actually light scattering. The page shows some images of the effect I need. Do you guys have any ideas how it could work?

https://www.khronos.org/opengl/wiki/Sampler_(GLSL)#Shadow_samplers

You have to use shadow sampler types if you want to use your depth(-stencil) texture with depth comparison enabled.

For example, to make the fog / nebula depth-dependent, render a fullscreen quad and blend:
→ the destination texel (your scene color) multiplied by (1 − depth value), with
→ a constant color (the fog color) multiplied by the depth value.
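That blend can be written as a single fullscreen-pass fragment shader. A minimal sketch, assuming the depth buffer has been rendered to a texture; the names (uSceneTex, uDepthTex, uFogColor, vUv) are placeholders, not from the original post:

```glsl
uniform sampler2D uSceneTex;  // rendered scene color
uniform sampler2D uDepthTex;  // depth buffer rendered to a texture
uniform vec3 uFogColor;       // constant fog color
varying vec2 vUv;

void main() {
    float depth = texture2D(uDepthTex, vUv).r;  // window depth in [0, 1]
    vec3 scene  = texture2D(uSceneTex, vUv).rgb;
    // scene * (1 - depth) + fogColor * depth, i.e. a plain mix()
    gl_FragColor = vec4(mix(scene, uFogColor, depth), 1.0);
}
```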

Thanks for the answer :slight_smile:

Sadly my engine doesn't have a working depth-buffer texture. That's why I said my experiments with it failed. When I render the depth buffer to a texture, I get something, but not the real depth buffer: only the closest (really close) objects are visible in this texture.

https://www.khronos.org/opengl/wiki/Vertex_Post-Processing#Viewport_transform
https://www.opengl.org/sdk/docs/man/html/glDepthRange.xhtml

You have to consider that the depth range in normalized device coordinates is [−1, +1]; that means if you render something at the far end of your camera's view frustum, it will end up with depth = +1, not your zFar value.
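Note also that the stored depth value is non-linear in eye-space distance with a perspective projection: most of the [0, 1] range is used up very close to the near plane, which is why everything beyond the nearest objects looks uniformly "far". For distance-based fog you usually want to linearize it first. A sketch, assuming a standard perspective projection; uNear/uFar are hypothetical uniforms holding your camera's clip plane distances:

```glsl
uniform float uNear;  // camera near-plane distance
uniform float uFar;   // camera far-plane distance

// Convert a window-space depth value in [0, 1] back to eye-space distance.
float linearizeDepth(float depth) {
    float ndcZ = depth * 2.0 - 1.0;  // back to NDC z in [-1, 1]
    return (2.0 * uNear * uFar) / (uFar + uNear - ndcZ * (uFar - uNear));
}
```

Dividing the result by uFar gives a value in roughly [0, 1] that is suitable as a fog blend factor.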

What's wrong with your depth texture?

It looks like only the closest objects get rendered into the depth texture. In my case that's only the cockpit of my spaceship, while the asteroids and all other objects don't appear in it. Only when my camera gets really, really close to them do they appear.

I tried your first post:

vec3 addfog(sampler2D occlusiontex, sampler2D depthtex, vec3 cfogcolor) {
		vec3 depthColor = texture2D(depthtex, vUv.xy).xyz;
		vec3 sceneColor = texture2D(occlusiontex, vUv.xy).xyz;
		vec3 fogColor  = sceneColor * (1.0 - depthColor);
		vec3 fogColor2 = depthColor * cfogcolor;

		// note: a mix factor of 1.0 always returns fogColor2
		return mix(fogColor, fogColor2, 1.0);
}

It colors the whole scene with "cfogcolor", since the depth texture doesn't work correctly (and a mix factor of 1.0 just returns the second argument anyway).
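For reference, once the depth texture works, the function only needs a single mix() with the depth value as the blend factor. A sketch of a corrected version under that assumption (same names as above, depth linearization omitted):

```glsl
vec3 addfog(sampler2D occlusiontex, sampler2D depthtex, vec3 cfogcolor) {
    float depth     = texture2D(depthtex, vUv.xy).r;  // one channel is enough
    vec3 sceneColor = texture2D(occlusiontex, vUv.xy).rgb;
    // blend toward the fog color as depth approaches the far plane
    return mix(sceneColor, cfogcolor, depth);
}
```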

After clipping and division by w, depth coordinates range from −1 to 1, corresponding to the near and far clipping planes. glDepthRange specifies a linear mapping of the normalized depth coordinates in this range to window depth coordinates. Regardless of the actual depth buffer implementation, window coordinate depth values are treated as though they range from 0 through 1 (like color components). Thus, the values accepted by glDepthRange are both clamped to this range before they are accepted.

The setting of (0,1) maps the near plane to 0 and the far plane to 1. With this mapping, the depth buffer range is fully utilized.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.