PDA

View Full Version : Problem building a heat effect using a particle engine (with screenshots)



Alessandro_dup1
04-14-2004, 02:06 PM
I'm developing an F-35B flight simulator. One of the aircraft's features is vertical takeoff. I'd like to show the heat effect coming out of the lift fan, and so far I have a working particle engine. What I would like to do is texture the particles to reproduce the heat effect, i.e. they must be transparent and perturbed (applying a distortion that simulates hot steam). So I rendered the whole screen to a texture, but now that I have the texture I have no idea how to provide the correct texture coordinates to each particle, which has to show a portion of the screen-rendered texture.
Any idea ?
At the following page you can see a screenshot showing the current dynamics of the particle engine.
screenshot (http://www.web-discovery.net/temp/problem_explanation.jpg)

dorbie
04-14-2004, 03:34 PM
It's simple, really. You draw the texture to the framebuffer, then for each particle you generate texture coordinates equal to its location on the screen (remapped to texture space). The final part is that in the middle of each particle you add a vertex (or vertices) with a perturbed texture coordinate; the amount of perturbation shows up as distortion on the screen. It is important for the outside of the particles to match the background, otherwise you'll see an edge discontinuity that will look a bit like a solid refraction effect, almost like glass. In addition, overlapping particles can show up as a discontinuity edge that looks like a refraction effect, so if you can't avoid this it may be better to draw a billboarded thrust-vector tube with many vertices and perturbation applied based on the thrust simulation, though I appreciate that the dynamics are potentially different from a particle system.

Stepping back a bit and reconsidering your approach, you could imagine just a basic render to the normal framebuffer and a copy of a small screen region that bounds the thrust particles to texture, ready for particle texturing.

Your particles should be z-tested, and they should be reasonably well behaved in the scene.
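
What dorbie describes can be sketched outside of GL: project the particle's eye-space position the way the camera would, then shift-and-scale the resulting [-1, 1] coordinates into [0, 1] texture space. A rough, language-neutral Python sketch; the symmetric frustum and its parameters are made up for illustration, not taken from anyone's project:

```python
# Sketch of "texture coordinates equal to screen location, remapped to
# texture space". The perspective projection here is a hypothetical
# symmetric frustum; substitute your own projection parameters.

def eye_to_texcoord(eye_pos, fov_scale=1.0, aspect=4.0 / 3.0):
    """Map an eye-space point (x, y, z with z < 0 in front of the
    camera) to screen-texture coordinates in [0, 1]."""
    x, y, z = eye_pos
    # Perspective divide: normalized device coords land in [-1, 1].
    ndc_x = (fov_scale / aspect) * x / -z
    ndc_y = fov_scale * y / -z
    # Remap [-1, 1] -> [0, 1]: the usual 0.5 * x + 0.5 shift and scale.
    return (0.5 * ndc_x + 0.5, 0.5 * ndc_y + 0.5)

# A point straight ahead of the camera maps to the texture center.
print(eye_to_texcoord((0.0, 0.0, -5.0)))  # -> (0.5, 0.5)
```

Perturbing the center vertex then just means nudging the (s, t) this returns, while the particle's border vertices keep the unperturbed values so they match the background.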

Alessandro_dup1
04-15-2004, 10:07 AM
Thank you for your hints, Dorbie. About the "then on each particle you can generate texture coordinates equal to their location on the screen (remapped to texture space)" part you suggest, can you give me some more details? I can't figure out how to generate texture coordinates with OpenGL.

endash
04-15-2004, 03:31 PM
To do it really nicely, the best thing would probably be to use a fragment program and Perlin noise to simulate the refraction.

If I were to do that, instead of using little circles, I think I'd use one big polygon to draw the hot area, then use that pass as input to a fragment program. This might be overkill for your purposes, but I think it would produce the best effect.

dorbie
04-15-2004, 04:45 PM
Eye-space texgen with a projection on the texture matrix and the usual shift for normalization can do this, or a vertex program can grab the numbers directly; moving the center vertex in eye z, x and y can help perturb it despite the projection. It's up to you. Remember that you still need to work in your perturbation, though, and vertex programs can give you more control there.

krychek
04-16-2004, 01:57 AM
A bounding hull for the hot steam, textured with stretched noise, could be used as the perturbation map. The noise is stretched vertically for this VTOL case and translated downwards over time. Then you only need a good bounding-hull approximation depending on the height of the aircraft above the ground, and it should look great.

Sorry, though, this doesn't exactly answer your question. But do post screenshots of your results :D .
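
As a rough illustration of the stretched, scrolling noise idea (the tiny 4x4 "noise" table and all constants below are made-up placeholders; real code would use something like Perlin noise):

```python
# Sample a noise tile with the vertical axis stretched (low frequency
# vertically, so the pattern forms streaks) and translated over time,
# so the perturbation streams along the thrust direction.

NOISE = [
    [0.1, 0.7, 0.3, 0.9],
    [0.5, 0.2, 0.8, 0.4],
    [0.9, 0.6, 0.1, 0.7],
    [0.3, 0.8, 0.5, 0.2],
]
SIZE = 4

def perturbation(u, v, time, stretch=0.25, speed=1.0):
    """Look up the noise tile with v stretched and scrolled by `time`,
    wrapping at the tile borders."""
    x = int(u * SIZE) % SIZE
    y = int((v * stretch + time * speed) * SIZE) % SIZE
    return NOISE[y][x]

print(perturbation(0.0, 0.0, 0.0))   # -> 0.1
print(perturbation(0.0, 0.0, 0.25))  # -> 0.5 (pattern has scrolled)
```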

Alessandro_dup1
04-16-2004, 03:37 AM
Thank you for your suggestions, guys! Anyway, the problem I can't figure out is the following: I have this quad which represents the volume of hot steam, and it is billboarded, so it always faces the camera. The problem is that I need to translate the quad along the z and y axes to position it exactly under the lift fan. If I do that, the "portion" of the screen texture is not the same anymore, because of the translation. It's probably better if I post some code and a couple of screenshots to explain it better, because what I can't figure out is how to calculate the texture coordinates in 3D space.

endash
04-16-2004, 10:19 AM
Screenshot explanations would be good.
I think the easiest way to do things, though, wouldn't involve texture-mapping the scene onto the polygon that you draw under the airplane, but rather rendering a pass that is basically black except under the airplane, and using that as a displacement map (with additional noise from a fragment program) to displace the normal rendering.

Alessandro_dup1
04-16-2004, 12:27 PM
Yes Endash, I will post some screenshots soon. The technique you suggest is very interesting; are there any tutorials about it?

Alessandro_dup1
04-17-2004, 03:52 AM
Hi, I made some progress following endash's suggestions. I first rendered a pass to a texture which is all white except for a black cone (which is where the hot air is). After that I render the whole scene, and at the end I draw two quads (the size of the whole screen): the first contains the black & white texture, the second contains the whole-screen texture (previously saved). Then I blend the two quads so that only the screen texture matching the "black cone" is drawn. But, if you look at the screenshot, I seem to have a blending problem: the cone correctly shows the "portion" of texture I want, but the rest of the scene is extremely bright. I suppose my blendfunc has incorrect parameters.
At the following link (http://www.web-discovery.net/temp/help1.htm)
you will find some screenshots and the code. Thanks

Jens Scheddin
04-17-2004, 06:24 AM
Originally posted by penetrator:
Hi, i made some progress following endash suggestions. [...] I suppose my blendfunc has incorrect parameters.

You are currently simply adding the colors of your two textures together. What you really want is to modulate (multiply) them.

Alessandro_dup1
04-17-2004, 06:39 AM
But I am multiplying them, with glBlendFunc(GL_DST_COLOR, GL_ZERO);
Or am I not? :(

Jens Scheddin
04-17-2004, 07:50 AM
You should render everything black _except_ the area of the exhaust under the plane. Then write a fragment program that takes this black/white texture and the result of Draw_Everything() as a second texture, and displaces the texture coordinates at every pixel with some noise where the sampled color of the b/w texture is 1.0. That's how I would interpret endash's post.
Disadvantage: you need the fragment_program_ARB extension.
Advantage: it looks really nice :)

Alessandro_dup1
04-17-2004, 08:00 AM
I'm confused: the fragment program will be used to displace texture coordinates, and that's OK, but that is the next step I have to take. At this moment I think there is a blending problem. I'm just trying to use one b&w texture as the alpha channel of the subsequent texture.
Or do you mean that the fragment program can do that as well?
Sorry for bothering.

Jens Scheddin
04-17-2004, 08:35 AM
For the fragment program you need as input the black/white texture, which will work as a mask for the area where the refraction should take place, and the texture of your rendered scene with the airplane, terrain, etc. In the fragment program you do something like:
- sample the b/w texture (R0) with the original texCoords
- calculate the displaced texture coordinates as original texCoords + (R0 * offset)
- sample the scene texture with the displaced texture coordinates and output that as the result.

This way you could also fade the effect out at the borders if you use not only black and white in the b/w texture but also some grey values (so that R0 will not just be 1.0 or 0.0 but anything in between).
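
The three steps above can be simulated on the CPU to check the logic. The following Python sketch stands in for the fragment program; the tiny array shapes and the single-axis offset are illustrative assumptions, not anyone's actual data:

```python
# CPU simulation of the mask-driven displaced lookup: `mask` is the
# rendered black/white cone texture, `scene` the grabbed framebuffer.
# Where the mask is 1.0 the scene lookup is fully displaced; where it
# is 0.0 the scene passes through unchanged; greys fade the effect.

def displaced_lookup(scene, mask, x, y, offset):
    """Return scene[y][x] displaced along +x by mask strength * offset
    texels, clamped at the texture edge."""
    w = len(scene[0])
    shift = int(round(mask[y][x] * offset))
    sx = min(w - 1, max(0, x + shift))  # clamp-to-edge sampling
    return scene[y][sx]

scene = [[0, 1, 2, 3]]
mask = [[0.0, 1.0, 0.5, 0.0]]
# Outside the cone (mask 0.0) the pixel is untouched:
print(displaced_lookup(scene, mask, 0, 0, 2))  # -> 0
# Inside the cone (mask 1.0) the lookup is shifted by 2 texels:
print(displaced_lookup(scene, mask, 1, 0, 2))  # -> 3
```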

Alessandro_dup1
04-17-2004, 08:41 AM
OK, I will search for some tutorials about fragment programs. Thanks!

Alessandro_dup1
04-17-2004, 03:13 PM
I looked all over the web for some fragment programs. I ran into Cg: OK, I know I'm a newbie, but... what a mess. I find the documentation extremely complicated (and there are about a dozen PDF files to "help", each one about 300 pages), not to mention the examples... at cgshaders.org I tried almost every shader and they are all missing files or resources. Has anyone here successfully written a simple fragment program? I really don't know where to start. It seems to me that the road to 3D programming is getting far too complicated...

jonasmr
04-18-2004, 03:55 AM
Try this (http://www.clockworkcoders.com/oglsl/index.html) link.

It's about GLSL, and it has samples and everything.

Jens Scheddin
04-18-2004, 05:51 AM
I don't recommend using Cg if you are new to fragment/vertex programs. It's much easier to simply use the fragment_program_ARB extension, and it saves a lot of extra work. You can put in support for Cg later on. Use this link (http://oss.sgi.com/projects/ogl-sample/registry/) and read the spec for the extension. It's not that much different from vertex_program_ARB.

Alessandro_dup1
04-18-2004, 08:56 AM
I bet now this topic "deserves" the beginners forum :p
I found a very simple ARB vertex & pixel shader which renders a quad. After studying the ARB_fragment_program paper I started experimenting on both the fp and the vp, trying several commands. So far I think I'm getting the hang of it. However, I couldn't find in the ARB paper how to manage more textures. Do I have to write one fp for every texture, or can I feed more textures into one fp?
The code of the pixel shader is the following; can you show me how I would, for example, modulate two textures to blend them?

!!ARBfp1.0

# Fragment inputs
ATTRIB inTexCoord = fragment.texcoord; # First set of texture coordinates
ATTRIB inColor = fragment.color.primary; # Diffuse interpolated color

# Fragment outputs
OUTPUT outColor = result.color;

TEMP texelColor;
TXP texelColor, inTexCoord, texture, 2D;

MUL outColor, texelColor, inColor; # Modulate texel color with vertex color
#ADD outColor, texelColor, inColor; # Add texel color to vertex color

END

MickeyMouse
04-18-2004, 09:46 AM
The ARB spec on fragment programs is quite a long one, so you probably haven't read it thoroughly enough, but it's all in there.

The ARB specs are complete descriptions of the ARB extensions, but IMHO they could have been written in some nicer form (not just plain TXT) and grouped, so they would be an easier resource for new OpenGL programmers to learn from.

For someone who previously stuck with, say, Direct3D and its nice help files, it's rather annoying to have to read and search through such long texts.
Just my opinion :)

Jens Scheddin
04-18-2004, 01:40 PM
Originally posted by penetrator:
I found a very simple ARB Vertex & Pixel Shader, which render a quad. [...] can you teach me how would i, for example, modulate two textures to blend them ?

You can access almost every part of the current OpenGL state, such as textures, texture coordinates, lights, materials, ...
"ATTRIB inTexCoord = fragment.texcoord[5];" will give you the texture coordinates of the 6th texture unit. IIRC, you don't even have to enable the texture units to use them; simply bind a texture and that's it. Oh, and the number of texture units accessible from a fragment program may be greater than the number of texture units on the fixed-function pipe, so you need to query it separately.

The following code will sample from two textures, modulate them together and output the resulting color:

TEMP tex0;
TEMP tex1;
TEX tex0, fragment.texcoord[0], texture[0], 2D;
TEX tex1, fragment.texcoord[0], texture[1], 2D;
MUL result.color, tex0, tex1;

MickeyMouse:
In my opinion the DirectX help looks much nicer but has much less in-depth content, so it's much harder to understand how to use some of the advanced functionality. But I agree that the specs could be written in some kind of HTML or so, to make them easier to navigate and read (especially the ultra-long vertex_program_ARB one).

Alessandro_dup1
04-18-2004, 02:13 PM
I have just loaded two textures (g_textureID and g_textureID2) in the initgl() function.
And this is the render() function:

glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
glEnable( GL_VERTEX_PROGRAM_ARB );
glBindProgramARB( GL_VERTEX_PROGRAM_ARB, g_vertexProgramID );
glEnable( GL_FRAGMENT_PROGRAM_ARB );
glBindProgramARB( GL_FRAGMENT_PROGRAM_ARB, g_pixelProgramID );
glBindTexture(GL_TEXTURE_2D, g_textureID);
glutSolidTeapot(.5);
glDisable( GL_FRAGMENT_PROGRAM_ARB );
glDisable( GL_VERTEX_PROGRAM_ARB );
SwapBuffers( g_hDC );

How do I tell the fp that texture[1] is g_textureID2?
I searched the whole ARB paper for how to load and use textures, but didn't find anything.
Thanks for helping

Lars
04-18-2004, 02:41 PM
Hi
Just one bit on the off-topic theme of not-so-well-formed extension specs: I transformed all the extensions into a compiled HTML file with indexing and some structure. You can find it at http://www.larswolter.de/OpenGL-Extensions.chm

And ontopic:
You use the glActiveTextureARB(...) function before binding; it takes the parameter GL_TEXTURE1_ARB to tell GL that you want to bind something to stage 1. Don't forget to switch back to stage 0 by supplying GL_TEXTURE0_ARB when you want to do something on stage 0.

greetings
Lars

Alessandro_dup1
04-19-2004, 07:47 AM
So do I have to use ARB_multitexture?

Lars
04-19-2004, 11:30 AM
I prefer using the extension (especially under Windows), but if you have an OpenGL 1.3 compliant implementation it is a core function.

The use of glActiveTexture with fragment programs is described in the fragment program extension, under "Additions to Chapter 2". You can also read there about glClientActiveTexture(), which you may also need in order to set different texture coordinates.

Lars

Alessandro_dup1
04-19-2004, 01:23 PM
Lars, I downloaded your .chm file and I want to thank you; finally the ARB papers are readable and indexed.
Regarding my questions about fp's: since I get an "undeclared identifier" error compiling some OpenGL 1.3 calls, do I need to download other libraries and include files besides GLUT 3.7.6?

Alessandro_dup1
04-21-2004, 04:38 AM
Finally I found out how to make it work, thanks to everybody. If you take a look at this page (http://www.web-discovery.net/temp/help2.htm), you will see that I'm rendering a simple quad with a texture which is the result of two "modulated" textures (through a fragment program). Any idea how to get rid of the black area, i.e. making it transparent against the background? I tried several blendfunc combinations and it didn't work.

This is the render() function:

void render( void )
{
glClearColor(1,1,1,1); /* set the clear color before clearing, or it takes effect a frame late */
glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
setShaderConstants();
glMatrixMode( GL_MODELVIEW );
glEnable( GL_VERTEX_PROGRAM_ARB );
glBindProgramARB( GL_VERTEX_PROGRAM_ARB, g_vertexProgramID );
glEnable( GL_FRAGMENT_PROGRAM_ARB );
glBindProgramARB( GL_FRAGMENT_PROGRAM_ARB, g_pixelProgramID );
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glActiveTextureARB( GL_TEXTURE0_ARB );
glBindTexture( GL_TEXTURE_2D, g_textureID_1 );
glActiveTextureARB( GL_TEXTURE1_ARB );
glBindTexture( GL_TEXTURE_2D, g_textureID_0 );
glLoadIdentity();
glTranslatef( 0.0f, 0.0f, -3.8f );
glRotatef( -g_fSpinY, 1.0f, 0.0f, 0.0f );
glRotatef( -g_fSpinX, 0.0f, 1.0f, 0.0f );
glColor4f(0,1,0,1);
glInterleavedArrays( GL_T2F_C3F_V3F, 0, g_quadVertices );
glDrawArrays( GL_QUADS, 0, 4 );
glDisable( GL_FRAGMENT_PROGRAM_ARB );
glDisable( GL_VERTEX_PROGRAM_ARB );
SwapBuffers( g_hDC );
}

Alessandro_dup1
04-21-2004, 04:44 AM
If you "feel" like it, at this link (http://www.web-discovery.net/temp/simple_shader.zip) you can download the complete VC++ project (it's about 270 KB). It's a modified version of a Kevin Harris tutorial which I found very useful.

Alessandro_dup1
04-21-2004, 02:54 PM
OK, I think I fixed it; it works the way I wanted, but I'm not sure this is the "common" way to do it. I modified the fragment program as follows:

!!ARBfp1.0

ATTRIB inTexCoord = fragment.texcoord[0]; # First set of texture coordinates
ATTRIB inColor = fragment.color.primary; # Diffuse interpolated color
OUTPUT outColor = result.color;
TEMP tex0;
TEMP tex1;
TEX tex0, fragment.texcoord[0], texture[1], 2D;
SUB tex0.x, tex0.x, 0.01;
TEX tex1, fragment.texcoord[0], texture[0], 2D;
MUL outColor, tex0, tex1;
KIL tex0.x;
END

I've read the ARB_fragment_program paper; it specifies the syntax of all the commands you can use, but is there anything more detailed, i.e. some examples of how to displace or warp a texture?
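
For what it's worth, a displaced lookup along the lines discussed in this thread might look roughly like this in ARB_fragment_program assembly. This is an untested sketch: the offset constant and the texture-unit assignment are assumptions, not code from anyone's project:

```
!!ARBfp1.0
# Hypothetical sketch: displace the scene lookup by the mask value.
# texture[0] = grabbed scene, texture[1] = black/white cone mask.
PARAM offset = { 0.02, 0.02, 0.0, 0.0 };   # displacement strength
TEMP mask, coord, scene;
TEX mask, fragment.texcoord[0], texture[1], 2D;
# Displaced coordinates: original + mask * offset
MAD coord, mask, offset, fragment.texcoord[0];
TEX scene, coord, texture[0], 2D;
MOV result.color, scene;
END
```

Replacing the flat `offset` constant with a scrolling noise texture lookup gives the animated shimmer.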

endash
04-21-2004, 04:41 PM
Can you post a result screenshot? Also, what's loaded in texture 0 and texture 1?

Alessandro_dup1
04-22-2004, 01:09 PM
At this (http://www.web-discovery.net/temp/help3.htm) page you can see the two textures and the result, which is exactly what I wanted. Below is the fragment code; can you help me "translate" it into Cg?


!!ARBfp1.0
ATTRIB inTexCoord = fragment.texcoord[0];
ATTRIB inColor = fragment.color.primary;
OUTPUT outColor = result.color;
TEMP tex0;
TEMP tex1;
TEX tex0, fragment.texcoord[0], texture[1], 2D;
SUB tex0.x, tex0.x, 0.01;
TEX tex1, fragment.texcoord[0], texture[0], 2D;
MUL outColor, tex0, tex1;
KIL tex0.x;
END

EvilE
04-22-2004, 03:02 PM
With regards to doing a refracting shimmer coming from a fan or exhaust: I did a similar thing a while ago, but using DirectX, mind you (www.pulsefire.com/ernests). There's a DivX video showing a heat shimmer caused by a fire. Basically, what I did was render a sprite with a scrolling ripple texture into a full-screen texture mask; then, using this mask and a texture of the actual scene, I perturbed the scene with a pixel shader that samples from another part of the scene based on the amount of distortion in the mask. I believe this is how the Half-Life 2 effects are done (the big walker thing distorts the screen before it fires its weapon); someone also told me Metroid Prime on the GameCube does it as well. It's a really slick effect, and all you have to do is render distorted objects into one buffer to affect the whole scene.

knackered
04-23-2004, 04:04 PM
EvilE, what you describe is a common technique; perturbation can be done at the pixel or vertex level... I don't think a heat-haze effect would benefit too much from being done at the pixel level, especially since it involves a dependent texture read, which requires a bit of grunt from the graphics card (and is impossible on anything below a GeForce 3). There's really no need to get pixel shaders involved in this one.
Stick with a reasonably tessellated mesh and do it per vertex, pretty much as dorbie suggests.

EvilE
04-23-2004, 05:00 PM
Well, using the pixel shader is faster than using a mesh, and provides more accurate perturbation. We also used the distortion effect on raindrops rolling down the screen. It's also easier to draw distortion into a separate buffer than to alter the UVs of a mesh.

knackered
04-24-2004, 01:55 PM
Only faster if you're T&L bound, but I always seem to be fill-rate bound :)

Alessandro_dup1
05-03-2004, 12:59 PM
Thanks to all your suggestions and hints, I finally coded the first version of this so-called "heat effect". On my web page (http://www.web-discovery.net) you can see two screenshots reproducing the scene with and without the heat blur. The effect is achieved in the following steps:
1) grabbing the screen to a texture
2) rendering the heat cone volume to a black & white texture
3) using a pixel shader which multiplies and lerps the above textures with a Perlin-noise-generated texture (used to perturb them).
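
Step 3 boils down to a per-pixel lerp, which can be sketched independently of the shader. The values and function names here are illustrative; `displaced` stands for the scene sample after the noise-driven perturbation:

```python
# lerp(a, b, t) = a + t * (b - a), the same operation as ARBfp's LRP.
def lerp(a, b, t):
    return a + t * (b - a)

def heat_pixel(scene, displaced, mask):
    """Blend the undistorted and displaced scene samples by the heat
    cone mask: mask 0 -> untouched scene, mask 1 -> fully displaced."""
    return lerp(scene, displaced, mask)

print(heat_pixel(0.75, 0.25, 0.0))  # -> 0.75 (outside the cone)
print(heat_pixel(0.75, 0.25, 1.0))  # -> 0.25 (inside the cone)
```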

The animated effect looks quite good (even if screenshots don't tell much): one more thing to do, when I have the time, is to add a small particle engine and render it into the b&w heat cone texture to make it more "unstable" and fuzzy.

Thanks again to everybody for helping !

Jens Scheddin
05-04-2004, 02:10 AM
Nice :)
Keep it goin'

EvilE
05-04-2004, 03:59 AM
I'm sure it looks a lot better in real time; screenshots don't do the effect justice. By the way, I used a sprite with a rolling noise texture for my heat effect, and I faded it out (alpha blending) with distance. In your situation a particle system would be best, or simply a series of smaller sprites coming out of the exhaust.

Alessandro_dup1
05-04-2004, 01:04 PM
Hi EvilE, did you use a pixel shader to achieve your heat effect ?

EvilE
05-04-2004, 02:22 PM
Yes, I draw to a distortion buffer and then use a pixel shader to offset the pixels of the scene by the distortion buffer.

dorbie
05-04-2004, 02:33 PM
Seems like overkill at first, but it's a good way to avoid discontinuous refractive edges on overlapping distortion-particle geometry.

endash
05-04-2004, 05:23 PM
For reference, there is a short section in GPU Gems on this sort of effect: section 6.5.2, "Heat Shimmer". They basically did what you did: they used a particle system to make the displacement texture, then used that result to distort the original.

An ATI screensaver uses a similar effect.

One problem I see with this is that the displacement ought to be proportional to the distance from the hot air to the distant object. Multiplying the displacement vector by the difference between the displacement pass's depth buffer and the scene's depth buffer could be a first approximation of this.
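
That first approximation could be sketched like this (hypothetical names and ranges; the depths are assumed to be linear eye-space values, and `max_gap` is an arbitrary tuning constant):

```python
# Scale the displacement vector by how far behind the hot air the
# displaced object sits, using the two depth values.

def scaled_displacement(disp, haze_depth, scene_depth, max_gap=10.0):
    """Shrink `disp` (dx, dy) toward zero when the scene pixel is just
    behind the haze; full strength at `max_gap` or beyond."""
    gap = max(0.0, scene_depth - haze_depth)
    scale = min(1.0, gap / max_gap)
    dx, dy = disp
    return (dx * scale, dy * scale)

# An object right behind the haze barely shimmers...
print(scaled_displacement((4.0, 2.0), 5.0, 5.0))   # -> (0.0, 0.0)
# ...while a distant mountain gets the full displacement.
print(scaled_displacement((4.0, 2.0), 5.0, 40.0))  # -> (4.0, 2.0)
```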