
View Full Version : Light plumes



Plato
09-04-2002, 04:28 PM
Are there any demos out there that demonstrate light plumes (EDIT: actually, light blooms) using shaders?

[This message has been edited by Plato (edited 09-08-2002).]

rIO
09-05-2002, 01:36 AM
What do you mean by "light plumes"? Lens flares?

Plato
09-05-2002, 03:32 PM
Not a lens flare; more like a corona, except it can be applied to surfaces as well as points. For example, a light plume would be the glow effect around a light, regardless of the light's shape.

The only way to simulate light plumes before was to render a flare bitmap at your light source. The downside is that this only really works for point-light sources. Now with pixel shaders, I'm quite certain you can have arbitrary glowing shapes: all you'd have to do is convolve your lit surface (which could be a light, or a surface reflecting light) with some sort of point-spread function (like a Gaussian, or even just a simple round pillbox).
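That convolution idea can be sketched on the CPU in plain Python (a hypothetical grayscale illustration, not shader code; the 0.8 threshold and the 3x3 kernel are my own choices, not anything from the thread): keep only the overbright part of each pixel, convolve it with a small point-spread kernel, and add the glow back onto the image.

```python
# CPU-side sketch of "convolve the bright parts with a point-spread function".
# Grayscale image as a list of rows of floats in [0, 1+].

def bright_pass(image, threshold=0.8):
    """Keep only intensity above the threshold; everything else goes to 0."""
    return [[max(p - threshold, 0.0) for p in row] for row in image]

def convolve3x3(image, kernel):
    """Convolve a grayscale image with a 3x3 kernel (zero outside the border)."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx] * kernel[dy + 1][dx + 1]
            out[y][x] = acc
    return out

# A rough Gaussian-like 3x3 point-spread kernel (weights sum to 1).
KERNEL = [[1/16, 2/16, 1/16],
          [2/16, 4/16, 2/16],
          [1/16, 2/16, 1/16]]

def add_bloom(image):
    """Spread the overbright part of the image and add it back, clamped to 1."""
    glow = convolve3x3(bright_pass(image), KERNEL)
    return [[min(p + g, 1.0) for p, g in zip(row, grow)]
            for row, grow in zip(image, glow)]
```

A single fully-bright pixel then bleeds a little light onto its neighbours, which is exactly the "glow around a bright spot" look, independent of the light's shape.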

The Halo 2 demo seems to have light plumes in it, as does the Radeon 9700 demo that I saw in the PR video. I was wondering if there are any other demos out there.

SirKnight
09-05-2002, 04:37 PM
Halo 2 and that Radeon 9700 demo are using HDR (High Dynamic Range). A quote from the HDR presentation by NVIDIA:

"Enables realistic optical effects glows around bright light sources, more accurate motion blurs"

I advise you to check that presentation out. HDR sure did make the lights in Halo 2 look totally awesome. :D

-SirKnight

Nutty
09-05-2002, 11:43 PM
Is this what you're referring to?
http://opengl.nutty.org/effects/index.html

2nd image. Click for the demo with source. Doesn't use any shaders, though.

It looks a lot better in motion.

Nutty

pocketmoon
09-06-2002, 12:10 AM
Ah! Light BLOOMS.

That's the term the Bungie guys use when talking about the effect in the Halo 2 movie.

Plato
09-06-2002, 10:01 AM
Wow! Thanks, Nutty! That's awesome :)

I'll have a look as soon as I get home :)

Nutty
09-06-2002, 01:06 PM
Yeah, I saw that discussion from Chris at Bungie on the GD Algorithms list. He says they're using the destination alpha buffer to store the overbright light value, which is then convolved and applied to the scene, similar to the demo I did.

I didn't use a destination alpha buffer, though. I assume using destination alpha limits your light blooms to a single pre-defined color?

Nutty

vincoof
09-06-2002, 01:19 PM
How do you determine which areas are "overbright"?
Do you compare against pixels that are equal to (1,1,1)? Or do you render an additional image at half intensity, and then look for the areas brighter than (0.5, 0.5, 0.5) in it?

SirKnight
09-06-2002, 01:21 PM
Hmm, that's a pretty interesting technique Halo 2 is using there. It may not be HDR exactly (of course it couldn't be, since the Xbox doesn't have an NV30 or R300), but it sure looks very similar to it. It looks damn good.

-SirKnight

Nutty
09-06-2002, 01:28 PM
Basically, in the register combiners, when the final light value is calculated, it's normally clamped to the range 0 to 1. So subtract 1: this gives you the fragment values that are over 1, which can then be stored. If the value was less than 1.0 (not overbright), the subtraction results in 0, and the fragment doesn't contribute to the light bloom.
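The arithmetic of that trick is just a subtract-and-clamp. A tiny Python illustration of the per-fragment math (a CPU-side sketch that mimics what the combiners compute, not combiner code; on the GPU this happens per color channel):

```python
def overbright(light_value):
    """Portion of a light value above 1.0; 0.0 if the fragment isn't overbright.

    Mimics the combiner trick: compute (value - 1) and clamp the result
    at zero, so only the part of the value over 1.0 survives to be stored
    for the bloom pass.
    """
    return max(light_value - 1.0, 0.0)
```

So a fragment lit to 1.4 stores roughly 0.4 of bloom contribution, while anything at or below 1.0 stores nothing at all.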

Thanks must go to Davepermen for describing this to me.

Nutty

JasonM [ATI]
09-07-2002, 07:27 PM
You can read about some of the implementation details of the blooms in our High Dynamic Range demo in my notes (http://www.ati.com/developer/SIGGRAPH02/ATIHardwareShading_2002_Chapter3-1.pdf) and slides (http://www.ati.com/developer/SIGGRAPH02/ShadingCourse_Mitchell.pdf) from the SIGGRAPH 2002 Hardware Shading Course.

-JasonM @ ATI

Pentagram
09-08-2002, 11:23 AM
I also tried to add some sort of fake dynamic range thing to Tenebrae, since that Halo 2 video really looked too cool.
Here are some shots: http://users.pandora.be/hollemeersch/blackrose/tenebrae/beta/quake75.jpg http://users.pandora.be/hollemeersch/blackrose/tenebrae/beta/quake79.jpg
(The links may sometimes not work, since a lot of people are on my poor site.)
This runs on generic hardware, so no special stuff is needed, but it is slow.
It works as follows:
1) Render the scene in a small viewport (64x64).
2) Read back the pixels (glReadPixels).
3) Multiply all pixels with at least one component greater than a certain threshold by some value (1.2 currently); make the other pixels black.
4) Blur the buffer (the Gamasutra fast-blurring stuff).
5) Make a texture from it and render a screen-sized quad.
It costs a lot of framerate but looks very cool. ;) The main cost is still rendering the small viewport, but speeding it up doesn't seem easy, since I already use all the general combiners and destination alpha for doing the bumpmapping.
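Steps 3 and 4 above can be sketched in Python (a hypothetical illustration: the 1.2 boost comes from the post, but the 0.75 threshold is my own guess, since the post only says "a certain threshold"; the blur here is a simple 1-D box blur standing in for the Gamasutra routine):

```python
# Sketch of steps 3 and 4: threshold/boost the read-back pixels, then blur.
THRESHOLD = 0.75  # assumed value; the post only mentions "a certain threshold"
BOOST = 1.2       # multiplier mentioned in the post

def threshold_pixels(pixels):
    """Step 3: boost pixels with any component over the threshold; black out the rest."""
    out = []
    for (r, g, b) in pixels:
        if max(r, g, b) > THRESHOLD:
            out.append((r * BOOST, g * BOOST, b * BOOST))
        else:
            out.append((0.0, 0.0, 0.0))
    return out

def box_blur_row(row):
    """Step 4 (1-D stand-in for the fast blur): average each pixel with its neighbours."""
    n = len(row)
    blurred = []
    for i in range(n):
        window = row[max(0, i - 1):min(n, i + 2)]
        blurred.append(sum(window) / len(window))
    return blurred
```

Everything below the threshold goes black, so after the blur only the bright areas contribute any glow to the screen-sized quad.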

Charles

[This message has been edited by Pentagram (edited 09-08-2002).]

Nutty
09-08-2002, 11:29 AM
Very similar to my method, except I do my blurring on the GPU, with lots of render-to-texture copies at successively smaller texture sizes.
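The "successively smaller textures" trick amounts to repeated 2x2 averaging, with the hardware's bilinear filtering doing the averaging for free on each copy. A rough CPU-side Python sketch of the idea (hypothetical, grayscale, even power-of-two sizes assumed):

```python
def downsample(image):
    """Halve a grayscale image by averaging each 2x2 block (dimensions must be even)."""
    h, w = len(image), len(image[0])
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4.0
             for x in range(w // 2)]
            for y in range(h // 2)]

def blur_pyramid(image, levels=3):
    """Repeatedly downsample; each level spreads bright pixels over a wider area."""
    out = image
    for _ in range(levels):
        if len(out) < 2 or len(out[0]) < 2:
            break
        out = downsample(out)
    return out
```

Each halving widens the effective blur radius when the small texture is stretched back over the screen, which is why a short chain of copies gives a big, cheap glow.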

Try that new NV extension which allows you to do asynchronous pixel reads. It may help you speed it up. Dunno if it's out yet, or if it's only for the NV30.

Nutty

davepermen
09-08-2002, 11:54 AM
Hm, now Quake 1 looks better than Doom 3... scary, no? :D Next will be the soft shadows implemented by the other dude, and, oh well, we could sell Quake 1 again, hehe :D

Yeah, try to do the blurring and everything in hardware; it doesn't need any non-standard GL features, only glCopyTexSubImage2D.

Can't wait for an R300 Tenebrae, hehe :D

Jambolo
09-08-2002, 10:31 PM
NVIDIA's Cg SDK has a shader demo (with source) that does what you're looking for.

Plato
09-09-2002, 09:06 AM
These effects are awesome, and so cheap to do, it seems. Nutty, your demo runs very quickly on my GeForce 1 DDR.

The only downside of the mipmap-blurring method is that at certain mipmap levels the blur is blocky. I turn the mipmap level down in the demo and it's extremely convincing.

BTW, are you able to make "glowing" objects this way as well? I.e., have overbright values for a 3D object, so that you could render, say, a bright light tube with a light bloom/HDR effect?

Nutty
09-09-2002, 10:56 AM
Yep, you could easily do that. You could have a special render shader that automatically guarantees some overbrightness when rendering. Ideal for rendering lights and such; then they'll always contribute to the glow.

The best bit is that even in views through a mirror, etc., they'll still glow, without any clever determination. It'll just happen!

I'm thinking of doing another one, but based on the Halo 2 approach, i.e. using destination alpha. You don't need to render the scene twice with that method, either. I'll also try to get the filtering a bit better, to remove the blocky edges.

Nutty

Plato
09-09-2002, 11:20 AM
I should get a card with pixel shaders... I could do so much more. Still, I'm amazed that a lot of these effects work on my GeForce 1 DDR. Even Tenebrae runs better than I expected: I don't usually fall below 15 fps at 640x480x32 with all detail on, and am usually in the 20s.

Now if only there were a vertex program to smooth out those animation keyframes, Quake 1 could be modernized ;) .
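That keyframe smoothing is just per-vertex linear interpolation between two animation frames. A hypothetical Python sketch of what such a vertex program would compute (function names and the frame/vertex representation are mine, for illustration):

```python
def lerp_vertex(v0, v1, t):
    """Blend one vertex position between keyframes: t=0 gives v0, t=1 gives v1."""
    return tuple(a + (b - a) * t for a, b in zip(v0, v1))

def lerp_frame(frame0, frame1, t):
    """Interpolate every vertex of a model between two animation keyframes."""
    return [lerp_vertex(v0, v1, t) for v0, v1 in zip(frame0, frame1)]
```

On the GPU, you'd feed both keyframes' positions as vertex attributes and pass t as a constant, so the blend costs almost nothing per frame.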

Is anyone here running an R300? If so, what's the driver quality like for coding the more advanced features of OpenGL? Would I be better off waiting for an NV30?

Nutty
09-09-2002, 11:28 AM
Driver quality should be pretty good. As for features, I think they're waiting for DX9 to be released before they expose the functionality of the card through GL extensions, ARB_fragment_program, etc.

Personally, I'm waiting for the NV30, as I like the fact that NV_fragment_program and NV_vertex_program2 are very similar to NV_vertex_program, which I'm very familiar with. So it should be a breeze to code for.

Nutty

NitroGL
09-09-2002, 11:29 AM
Originally posted by Plato:
Is anyone here running an R300? If so, what's the driver quality like for coding the more advanced features of OpenGL? Would I be better off waiting for an NV30?

I have one.

For advanced OpenGL stuff on it, you'd have to wait until the next driver release, since the current public drivers don't have support for ATI_fragment_program (I've been told that version 02.3 of the drivers will have it).

Other than that, it's just a few small bugs here and there (so far, the only major one is that ARB_depth_texture doesn't work right).

Plato
09-09-2002, 01:15 PM
Hmmm... as much as I like competition for NVIDIA, and would like to support a Canadian company (I'm Canadian, not Greek :P), I'll hold off until the NV30 comes out, or the ATI drivers are fixed up a bit. I've always hated running into a problem, first assuming it's my fault, then investigating the problem and finding out it's someone else's bug; too much wasted time :) .