Problem building a heat effect using a particle engine (with screenshots)

I’m developing an F-35B flight simulator. One of the aircraft’s features is vertical takeoff. I’d like to show the heat effect coming out of the lift fan, and so far I have a working particle engine. What I would like to do is texture the particles to reproduce the heat effect, i.e. they must be transparent and perturbed (applying a distortion that simulates hot steam). So I rendered the whole screen to a texture, but now that I have the texture I have no idea how to provide the correct texture coordinates to each particle, which has to show a portion of the screen-rendered texture.
Any ideas?
At the following page you can see a screenshot showing the current dynamics of the particle engine.
screenshot

It’s simple, really. Draw the texture to the framebuffer, then on each particle generate texture coordinates equal to its location on the screen (remapped to texture space). The final part is that in the middle of each particle you can add a vertex or vertices with a perturbed texture coordinate; the amount of perturbation will show as distortion on the screen. It is important for the outside of the particles to match the background, otherwise you’ll see an edge discontinuity that looks a bit like a solid refraction effect, almost like glass. In addition, overlapping particles can show up as a discontinuity edge that looks like a refraction effect, so if you can’t avoid this it may be better to draw a billboarded thrust-vector tube with many vertices and perturbation applied based on the thrust simulation, though I appreciate that the dynamics are potentially different from a particle system.
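A minimal sketch of that screen-space remapping for one particle vertex (assumptions: legacy OpenGL with GLU available, the whole screen already copied to a texture, and px/py/pz as illustrative names for the particle vertex position):

GLdouble model[16], proj[16];
GLint view[4];
GLdouble winX, winY, winZ;

glGetDoublev(GL_MODELVIEW_MATRIX, model);
glGetDoublev(GL_PROJECTION_MATRIX, proj);
glGetIntegerv(GL_VIEWPORT, view);

/* Project the vertex to window coordinates. */
gluProject(px, py, pz, model, proj, view, &winX, &winY, &winZ);

/* Window coordinates have their origin at the lower left, matching
   OpenGL texture space, so dividing by the viewport size gives [0,1]. */
glTexCoord2f((GLfloat)(winX / view[2]), (GLfloat)(winY / view[3]));
glVertex3d(px, py, pz);

For the distortion, offset the center vertex’s s/t by a small perturbation before the glTexCoord2f call while leaving the outer vertices exact, so the particle edges stay continuous with the background.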

Stepping back a bit and reconsidering your approach, you could imagine just a basic render to the normal framebuffer, followed by a copy of a small screen region bounding the thrust particles to a texture, ready for particle texturing.
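A sketch of that copy (assumptions: screenTex is an already-allocated RGB texture at least regionW x regionH texels, and the region variables bound the exhaust in window coordinates):

glBindTexture(GL_TEXTURE_2D, screenTex);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0,
                    0, 0,               /* destination offset inside the texture */
                    regionX, regionY,   /* lower-left corner of the screen region */
                    regionW, regionH);  /* region size in pixels */

glCopyTexSubImage2D reads straight from the framebuffer, so no glReadPixels round trip is needed.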

Your particles should be z-tested, and they should be reasonably well behaved in the scene.
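A common recipe for well-behaved blended particles is to depth-test them against the scene but disable depth writes, so they don’t punch holes in each other:

glEnable(GL_DEPTH_TEST);  /* test against scene depth */
glDepthMask(GL_FALSE);    /* don't write depth while blending particles */
/* ... draw the particles ... */
glDepthMask(GL_TRUE);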

Thank you for your hints, Dorbie. About the “on each particle you can generate texture coordinates equal to their location on the screen (remapped to texture space)” part: can you give me some more details? I can’t figure out how to generate texture coordinates through OpenGL.

To do it really nicely, the best thing would probably be to use a fragment program with Perlin noise to simulate the refraction.

If I were to do that, instead of using little circles, I think I’d use one big polygon to draw the hot area, then use that pass as input to a fragment program. This might be overkill for your use case, but I think it would produce the best effect.

Eye-space texgen with a projection on the texture matrix and the usual shift for normalization can do this, or a vertex program can grab the numbers directly; moving the center vertex in eye z and x/y could help perturb it despite the projection. It’s up to you. Remember that you need to work in your perturbation, though, and vertex programs can give you more control there.
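A sketch of the texgen route (assumptions: fixed-function pipeline; sceneProjection is the projection matrix used when the screen texture was captured; the modelview matrix is identity when the eye planes are specified, since OpenGL transforms eye planes by the inverse modelview at that point):

static const GLfloat sPlane[4] = { 1.0f, 0.0f, 0.0f, 0.0f };
static const GLfloat tPlane[4] = { 0.0f, 1.0f, 0.0f, 0.0f };
static const GLfloat rPlane[4] = { 0.0f, 0.0f, 1.0f, 0.0f };
static const GLfloat qPlane[4] = { 0.0f, 0.0f, 0.0f, 1.0f };

/* Eye-linear texgen passes eye-space coordinates through as s,t,r,q. */
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGenfv(GL_S, GL_EYE_PLANE, sPlane);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGenfv(GL_T, GL_EYE_PLANE, tPlane);
glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGenfv(GL_R, GL_EYE_PLANE, rPlane);
glTexGeni(GL_Q, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
glTexGenfv(GL_Q, GL_EYE_PLANE, qPlane);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
glEnable(GL_TEXTURE_GEN_R);
glEnable(GL_TEXTURE_GEN_Q);

/* Texture matrix: projection, then the usual 0.5 scale/bias so the
   projected [-1,1] range lands in [0,1] texture space after the q divide. */
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glTranslatef(0.5f, 0.5f, 0.5f);
glScalef(0.5f, 0.5f, 0.5f);
glMultMatrixf(sceneProjection);
glMatrixMode(GL_MODELVIEW);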

A bounding hull for the hot steam, textured with stretched noise, could be used as the perturbation map. The noise is stretched vertically for this VTOL case and is translated downwards over time. Now you only need a good bounding-hull approximation depending on the height of the aircraft above the ground, and it should look great.

Sorry, though this doesn’t exactly answer your question. But do post screenshots of your results :smiley: .

Thank you for your suggestions, guys! Anyway, the problem I can’t figure out is the following: I have this quad which represents the volume of hot steam, and it is billboarded, so it always faces the camera. The problem is that I need to translate the quad along the z and y axes to position it exactly under the lift fan. If I do that, the “portion” of the screen texture is not the same anymore, because of the translation. It’s probably better if I post some code and a couple of screenshots to explain, because what I can’t figure out is how to calculate the texture coordinates in 3D space.

Screenshot explanations would be good.
I think the easiest way to do things, though, wouldn’t involve texture-mapping the scene onto the polygon that you draw under the airplane, but rather rendering a pass that is basically black except under the airplane, and using that as a displacement map (with additional noise from a fragment program) to displace the normal rendering.
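A sketch of producing that mask pass (names like DrawExhaustCone and maskTex are illustrative; the cone stands in for whatever geometry bounds the hot air):

/* Render the mask: black everywhere, white where the exhaust is. */
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glDisable(GL_TEXTURE_2D);
glDisable(GL_LIGHTING);
glColor3f(1.0f, 1.0f, 1.0f);
DrawExhaustCone();  /* hypothetical: geometry under the lift fan */

/* Copy the result to a texture for the displacement pass. */
glBindTexture(GL_TEXTURE_2D, maskTex);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, maskW, maskH);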

Yes Endash, I will post some screenshots soon. The technique you suggest is very interesting; are there any tutorials about it?

Hi, I made some progress following endash’s suggestions. I first rendered a pass to a texture which is all white except for a black cone (which is where the hot air is). After that I render the whole scene, and at the end I draw two quads the size of the whole screen: the first contains the black-and-white texture, the second contains the whole-screen texture (saved previously). Then I blend the two quads so that only the screen texture matching the “black cone” is drawn. But, if you look at the screenshot, I seem to have a blending problem: the cone correctly shows the “portion” of texture I want, but the rest of the scene is extremely bright. I suppose my blend function has incorrect parameters.
At the following link you will find some screenshots and the code. Thanks!

Originally posted by penetrator:
…the cone correctly shows the “portion” of texture I want, but the rest of the scene is extremely bright. I suppose my blend function has incorrect parameters.

You are currently simply adding the colors of your two textures together. What you really want to do is modulate (multiply) them.
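For reference, the two blend setups in question (standard OpenGL blending, with the second quad drawn over the first):

glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);         /* additive: result = src + dst */
glBlendFunc(GL_DST_COLOR, GL_ZERO);  /* modulate: result = src * dst */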

But I am multiplying them, with glBlendFunc(GL_DST_COLOR, GL_ZERO);
Or not? :frowning:

You should render everything black except the area of the exhaust under the plane. Then write a fragment program that takes this black/white texture and the result of Draw_Everything() as a second texture, and displaces the texture coordinates at every pixel with some noise where the sampled color of the b/w texture is 1.0. That’s how I would interpret endash’s post.
Disadvantage: you need the ARB_fragment_program extension.
Advantage: it looks really nice :slight_smile:

I’m confused: the fragment program will be used to displace texture coordinates, and that’s fine, but that’s the next step I have to take. At this moment I think there is a blending problem: I’m just trying to use one b&w texture as the alpha channel of the following texture.
Or do you mean that the fragment program can do that as well?
Sorry for bothering you.

For the fragment program you need as input the black/white texture, which acts as a mask for the area where the refraction should take place, and the texture of your rendered scene with the airplane, terrain, etc. In the fragment program you do something like this (see the sketch below):

  • sample the b/w texture (R0) with the original texCoords
  • calculate the displaced texture coordinates as original texCoords + (R0 * offset)
  • sample the scene texture with the displaced texture coordinates and output that as the result.

This way you could also fade the effect out at the borders if you use not only black and white in the b/w texture but also some grey values (so that R0 is not just 1.0 or 0.0 but anything in between).
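A minimal ARBfp1.0 sketch of those three steps (assumptions: the mask is bound to texture unit 0, the scene texture to unit 1, and program.local[0] holds the offset; in practice the offset would come from a noise texture or a time-varying parameter rather than a constant):

!!ARBfp1.0
TEMP mask, displaced, sceneColor;
TEX mask, fragment.texcoord[0], texture[0], 2D;               # step 1: sample the b/w mask
MAD displaced, mask, program.local[0], fragment.texcoord[0];  # step 2: texCoords + (mask * offset)
TEX sceneColor, displaced, texture[1], 2D;                    # step 3: sample the scene
MOV result.color, sceneColor;
END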

OK, I will search for some tutorials about fragment programs. Thanks!

I looked all over the web for fragment programs. I ran into Cg: OK, I know I’m a newbie, but… what a mess. I find the documentation extremely complicated (and there are about a dozen PDF files to “help”, each around 300 pages), not to mention the examples… at cgshaders.org I tried almost every shader, and they are all missing files or resources. Has anyone here successfully written a simple fragment program? I really don’t know where to start. It seems to me that 3D programming is getting way too complicated…

Try this link.

It’s about GLSL, and it has samples and everything.

I don’t recommend using Cg if you are new to fragment/vertex programs. It’s much easier to simply use the ARB_fragment_program extension, and it saves a lot of extra work; you can add support for Cg later on. Use this link and read the spec for the extension. It’s not that much different from ARB_vertex_program.

I bet this topic now “deserves” the beginners forum :stuck_out_tongue:
I found a very simple ARB vertex & pixel shader which renders a quad. After studying the ARB_fragment_program paper I started experimenting with both the fp and the vp, trying several commands. So far I think I’m getting the swing of it. However, I couldn’t find in the ARB paper how to manage more textures. Do I have to write one fp for every texture, or can I input more textures into one fp?
The code of the pixel shader is the following; can you teach me how, for example, I would modulate two textures to blend them?

!!ARBfp1.0

# Fragment inputs

ATTRIB inTexCoord = fragment.texcoord; # First set of texture coordinates
ATTRIB inColor = fragment.color.primary; # Diffuse interpolated color

# Fragment outputs

OUTPUT outColor = result.color;

TEMP texelColor;
TXP texelColor, inTexCoord, texture, 2D;

MUL outColor, texelColor, inColor; # Modulate texel color with vertex color
#ADD outColor, texelColor, inColor; # Add texel color to vertex color

END
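In ARB_fragment_program, one program can sample several texture image units; each TEX/TXP instruction names its own unit. A minimal sketch of modulating two textures (assumptions: the second texture is bound to unit 1 and both units receive texture coordinates, e.g. via glMultiTexCoord2fARB):

!!ARBfp1.0
TEMP texel0, texel1;
TXP texel0, fragment.texcoord[0], texture[0], 2D;
TXP texel1, fragment.texcoord[1], texture[1], 2D;
MUL result.color, texel0, texel1;  # modulate the two texels
END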