fog with blended polygons

Since there's a lot of smart bastards here, I'll pick your brains once again.
(I'm not talking about standard fog, but about a GLSL program)
(this only concerns objects in the fog region, i.e. far away)

Ideally this would be perfect:
A/ somehow incorporate the fog into the final equation, e.g.

vec3 final = mix( final_color, fog_color, fog_amount );
but this falls down when blending is enabled, e.g. ONE ONE, because the fog will get 'added' into the framebuffer, making it brighter (unless it's black).
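To spell the problem out (assuming the usual mix-style fog): with glBlendFunc(GL_ONE, GL_ONE) the hardware computes

dst' = dst + src
src  = final_color * (1.0 - fog_amount) + fog_color * fog_amount

so the fog_color * fog_amount term gets added on top of whatever is already in the framebuffer, which can only brighten it.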

B/ This is how I'm doing it at present:

* draw objects normally, lit etc. (in the correct order)
* afterwards, for each object, draw a fog pass over it
This works to a degree. The problem is that with depth writes off, a fogged polygon further away will overwrite a blended polygon closer to the camera.

The only solution is to draw the polygons one at a time: first with the normal material, then with the fog material, then move on to the next closest polygon and repeat.
This will work, but the downside is the thousands of state changes.
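For reference, the fog pass in B can be as small as the fragment shader below (a minimal sketch; the names and the exponential fog curve are just placeholders, and it assumes the object is re-drawn with depth test GL_LEQUAL, depth writes off, and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)):

// fog-only pass, layered over the normally-lit result already in the framebuffer
uniform vec3  fog_color;
uniform float fog_density;
varying float eye_dist;   // eye-to-vertex distance, passed down from the vertex shader

void main()
{
    // any fog curve will do; exponential is used here as an example
    float fog_amount = 1.0 - exp(-fog_density * eye_dist);
    gl_FragColor = vec4(fog_color, fog_amount);
}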

any other solutions?
zed

Maybe I'm missing something, but why don't you just pass a "blackifier" for the fog as a shader variable if you are in an additive blend?

I don't understand. You mean if the polygon uses ONE ONE, then use a fog color of black?
How's that going to help? Surely it's just going to turn the polygon black against, say, a yellow background, i.e. it won't help the polygon blend into the background at all. Hmmm, I must have misunderstood.

It seems I’m missing something too.
Why are you separating into two passes? Why couldn't you render objects textured and shaded and fogged?

And as a second suggestion, the correct fog equation would rather be Pf = FogMul * Pi + FogAdd. FogMul and FogAdd are obviously taking depth into account.

For very nice looking fog (although quite a bit expensive at the vertex program level), you should read : http://www.imm.dtu.dk/pubdb/views/edoc_download.php/2554/pdf/imm2554.pdf

I implemented it; you can have a look on my FTP at this IP: 82.235.65.100, using "developer" as login (the password is empty). There are a bunch of cool screenshots. It handles sky color too, and the computation is based on 3 parameters, which are roughly fog density, pollution, and sun position.
You can get very different renderings depending on those parameters (heavy pollution with low density gives some weird but funny renderings).

SeskaPeel.

And as a second suggestion, the correct fog equation would rather be Pf = FogMul * Pi + FogAdd. FogMul and FogAdd are obviously taking depth into account.
is this correct?
Pf = final framebuffer color
FogMul = vec3(fogcolor * (1-fogamount))
Pi = vec3(color of the polygon from light texture etc)
FogAdd = vec3(fogcolor * fogamount)

BTW, I can't get onto 82.235.65.100 yet, it seems busy or something; I'll try again in a few hours.

Render things normally (fogless) first.
Then do a fog pass where you turn off blending altogether. Keep depth test as GL_LEQUAL.

Would this be enough?

For ONE ONE blending, just render opaque stuff with fog, then render transparent stuff with fog off.

is this correct?
Pf = final framebuffer color
FogMul = vec3(fogcolor * (1-fogamount))
Pi = vec3(color of the polygon from light texture etc)
FogAdd = vec3(fogcolor * fogamount)

No. You're right for Pi and Pf, but FogMul is the extinction term and is supposed to be within [0, 1]. FogAdd is the inscattering term and should also be within [0, 1], although both could go above 1, but without tone mapping you'll get ugly clamped white.
Very basically, extinction is all the sun rays that are deviated away from your eye, and inscattering is the exact opposite, the sun rays that are deviated towards your eye. These rays are not a plain vector + color, because the behaviour differs with wavelength; the thesis approximates it with red, green, and blue wavelengths.
These terms are computed using Rayleigh and Mie phase functions. Just read the thesis, it’s really easy and gives excellent results.
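If it helps, a very rough single-colour approximation of that equation in a shader could look like this (constant-density fog, no per-wavelength Rayleigh/Mie terms, and all names are made up):

uniform vec3  fog_color;     // stand-in for the in-scattered light colour
uniform float fog_density;
varying float eye_dist;      // eye-to-fragment distance

vec3 apply_fog(vec3 surface_color)
{
    float extinction   = exp(-fog_density * eye_dist);    // FogMul, within [0, 1]
    vec3  inscattering = fog_color * (1.0 - extinction);  // FogAdd
    return surface_color * extinction + inscattering;     // Pf = FogMul * Pi + FogAdd
}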

You'll judge the results for yourself once you've downloaded the screenshots from my FTP (a lot of people managed to do it, so you should be able to as well).

Working in both aeronautic simulation and video games has nice side effects for a state-of-the-art bibliography :slight_smile:

SeskaPeel.

Thanks for clearing that up, SeskaPeel. My concern though is with the blended polygons (ONE ONE etc.); I can't see it working with them, and none of the screenshots have blended polygons in them (still can't see your website to check that out).

For ONE ONE blending, just render opaque stuff with fog, then render transparent stuff with fog off
V-Man, I've lost a few IQ points since going teetotal, but how's that going to merge the blended polygons in with the fog?

Originally posted by zed:
For ONE ONE blending, just render opaque stuff with fog, then render transparent stuff with fog off
V-Man, I've lost a few IQ points since going teetotal, but how's that going to merge the blended polygons in with the fog?

Let's assume you render a room with fogging on.
Then you render a particle with blending (ONE, ONE).
It seems obvious to me that there is no need to turn on fog for the transparent poly.

I'm not able to log onto Seska's FTP. My FTP program wants me to enter a password, and leaving it blank doesn't work. When I write "nothing", it still doesn't work.

It seems obvious to me that there is no need to turn on fog for the transparent poly
In my case the fog is used to hide things in the distance so they don't suddenly pop into existence, thus turning off the fog is not an option.

Anyway, I've been doing some more thinking…

Related to fog is lighting (as in, it requires another pass).
I've come to the conclusion that
transparent polygons are impossible to do correctly in real time with today's hardware.

Look at the following screenshots (shadows turned off) and notice how the lighting is wrong
(since the laser light pass comes after the sunlight pass).

Concerning a reasonably complicated lighting method (shadows etc.):
FACTS –
because of the blending (depth writes off), you must draw each polygon in order from furthest to closest.

The ways to get it working correctly are:

* for each polygon
- loop through the lights, binding shadow maps / changing variables, and then draw the polygon
- repeat for the next light
- finally draw a fog pass (if applicable)

Of course this method will kill the framerate because of all the state changes per polygon.

* for each polygon
- draw all the lighting in one pass (the shader loops through the applicable lights)
- finally draw a fog pass (if applicable)

Unfortunately this will only work for simplistic lighting methods; a rough sketch of the one-pass approach is below.
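Something along these lines (a sketch only, with plain point lights, no shadow maps, and made-up names):

#define MAX_LIGHTS 4

uniform int   num_lights;
uniform vec3  light_pos[MAX_LIGHTS];
uniform vec3  light_color[MAX_LIGHTS];

varying vec3  world_pos;
varying vec3  world_normal;

void main()
{
    vec3 n   = normalize(world_normal);
    vec3 lit = vec3(0.0);

    // the shader loops through the applicable lights in a single pass
    for (int i = 0; i < MAX_LIGHTS; ++i)
    {
        if (i < num_lights)
        {
            vec3 l = normalize(light_pos[i] - world_pos);
            lit += light_color[i] * max(dot(n, l), 0.0);
        }
    }

    // the fog pass (if applicable) is still drawn separately afterwards
    gl_FragColor = vec4(lit, 1.0);
}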

Solutions? None, I believe, with today's hardware; we just have to minimize the visual error.

http://uk.geocities.com/sloppyturds/nelson/2005_05_29A.jpg
http://uk.geocities.com/sloppyturds/nelson/2005_05_29B.jpg

I dealt with this not too long ago. I had a shader that solves for fog purely in the fragment shader, based on the fragment's distance from the camera and other factors irrelevant to this discussion.

In any case, I needed to render a landscape with fog, then add transparent clouds to the landscape. Pretty much your problem, wouldn't you say?

So what I basically ended up doing for the blended fragments is to compute the colour of the fragment taking the fog colour into account, and then increase the transparency of the fragment depending on how far away it is. Effectively, if the blended geometry is totally invisible due to fog, it gets an alpha value of 0 (or whatever your function gives), which makes the fragment disappear.

This of course assumes that you've already rendered the background behind the blended geometry.
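Roughly, the blended fragment ends up as something like this (a sketch only; the texture, the names and the fog curve are placeholders):

uniform sampler2D cloud_tex;
uniform vec3      fog_color;
uniform float     fog_density;
varying float     eye_dist;

void main()
{
    vec4  cloud      = texture2D(cloud_tex, gl_TexCoord[0].st);
    float visibility = exp(-fog_density * eye_dist);   // 1 = clear, 0 = fully fogged

    // the colour drifts towards the fog colour, and the alpha drops with distance,
    // so a fully fogged fragment simply disappears instead of brightening the framebuffer
    gl_FragColor = vec4(mix(fog_color, cloud.rgb, visibility),
                        cloud.a * visibility);
}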

For what it's worth, it looks good enough for me. If you like, I can rummage up a screenshot.

As my FTP server seems too complicated, I uploaded the pictures to HTTP –> http://dev.succubus.fr

SeskaPeel.