View Full Version : Depth buffer writes?

01-29-2003, 04:49 AM
I have an idea for a soft corona effect. From an article I read somewhere, I figured out you can simply increase or decrease the effect by how much of the light source is visible. So what I wanted to do was this: render the scene, then render my light meshes (lamps, whatever) and test how much of each one passed the depth buffer test, which tells me how much of the corona to show. Make sense?

So my question is, is there any way to do this short of rendering my scene, saving the depth buffer in an array, rendering my lights, saving the depth buffer in another array, then testing EVERY depth value against one another? There has to be a faster way, especially since I would have to do this for EVERY light, and that would get very, very slow very, very fast.

Any suggestions?

01-29-2003, 04:54 AM
NV_OCCLUSION_QUERY does exactly what you want. But obviously this will only work on NVIDIA cards. I really hope ATI comes up with a similar extension.

01-29-2003, 04:55 AM
Here's one: Read http://oss.sgi.com/projects/ogl-sample/registry/NV/occlusion_query.txt
(edit: Ha, perfect double post. :) )

[This message has been edited by Relic (edited 01-29-2003).]
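For reference, the flow of the extension looks roughly like this. This is a minimal sketch: the stubbed entry points stand in for the real driver functions (which you would normally fetch with wglGetProcAddress/glXGetProcAddress), and the `corona_strength` helper is illustrative, not part of the extension.

```c
#include <assert.h>

typedef unsigned int GLuint;

/* GL_PIXEL_COUNT_NV token value, from the extension spec. */
#define GL_PIXEL_COUNT_NV 0x8866

/* Hypothetical stubs standing in for the real driver entry points;
   here they just pretend 150 light-mesh fragments passed the depth test. */
static void glBeginOcclusionQueryNV(GLuint id) { (void)id; }
static void glEndOcclusionQueryNV(void) { }
static void glGetOcclusionQueryuivNV(GLuint id, GLuint pname, GLuint *out)
{
    (void)id; (void)pname;
    *out = 150;
}

/* Render the light mesh inside a query, then scale the corona by the
   fraction of its pixels that survived the depth test. */
static float corona_strength(GLuint query, GLuint total_pixels)
{
    GLuint visible = 0;

    glBeginOcclusionQueryNV(query);
    /* ... draw the light mesh here (depth test on, colour writes off) ... */
    glEndOcclusionQueryNV();

    glGetOcclusionQueryuivNV(query, GL_PIXEL_COUNT_NV, &visible);
    if (total_pixels == 0)
        return 0.0f;
    if (visible > total_pixels)
        visible = total_pixels;
    return (float)visible / (float)total_pixels;
}
```

The nice part is the count comes back from the card itself, so there is no readback of the depth buffer at all.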

01-29-2003, 05:01 AM
Yeah, I was hoping to do it without extensions. I am attempting to make my engine as OPEN as possible (but still have some nice stuff). I am currently developing this for platforms such as laptops, which seem to have a LOT of CPU power and a LOT of RAM, but very weak video cards. So I am trying to throw as much as possible at the video card, but only what it can handle.

I am aiming for the multitexture extension, compiled vertex arrays, etc.; but any of the GeForce-specific extensions won't work. Any ideas beyond the extension route?

01-29-2003, 05:10 AM
You could use glReadPixels. Maybe test five pixels of the light (one near each corner, one in the centre) and see what colour they are. You could test more pixels for more accuracy.
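To make the corner/centre idea concrete, here's a sketch of how you might pick those five sample points from the light's projected centre and on-screen radius. The function name and the clamping policy are my own assumptions, not from the thread.

```c
/* Pick up to five sample points (centre plus four corners of the light's
   on-screen bounding square); these are the pixels you would hand to
   glReadPixels one at a time.  Points that fall off-screen are skipped. */
static int corona_samples(int cx, int cy, int radius,
                          int vpw, int vph, int out[5][2])
{
    static const int off[5][2] = { {0,0}, {-1,-1}, {1,-1}, {-1,1}, {1,1} };
    int n = 0, i;

    for (i = 0; i < 5; i++) {
        int x = cx + off[i][0] * radius;
        int y = cy + off[i][1] * radius;
        if (x >= 0 && x < vpw && y >= 0 && y < vph) {
            out[n][0] = x;
            out[n][1] = y;
            n++;
        }
    }
    return n;  /* how many of the five points are actually usable */
}
```

A partially off-screen light simply contributes fewer samples, which you can fold into the visibility fraction.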

01-29-2003, 06:22 AM
Originally posted by Adrian:
NV_OCCLUSION_QUERY does exactly what you want. But obviously this will only work on NVIDIA cards. I really hope ATI comes up with a similar extension.

The Radeon 9500/9700 supports that extension...

01-29-2003, 06:29 AM
Ah yes so they do, excellent.

01-29-2003, 06:32 AM
The 8500 supports it too.

01-29-2003, 07:20 AM
OK, the glReadPixels idea is good, BUT I don't know where the light will be on the screen at any given time, so how would I know which pixels to read?

01-29-2003, 07:41 AM
You could calculate the light's screen coordinates with something like this:

double LightM[16], LightP[16], winx, winy, winz;
int viewport[4];

glGetDoublev(GL_MODELVIEW_MATRIX, LightM);
glGetDoublev(GL_PROJECTION_MATRIX, LightP);
glGetIntegerv(GL_VIEWPORT, viewport);
gluProject(LightX, LightY, LightZ, LightM, LightP, viewport, &winx, &winy, &winz);
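If you'd rather not depend on GLU, the same transform can be done by hand. Below is a minimal stand-in for gluProject (my own sketch), assuming the column-major matrix layout glGetDoublev returns and the default depth range:

```c
#include <assert.h>
#include <math.h>

/* object -> eye -> clip -> NDC -> window, the same pipeline gluProject
   uses.  Matrices are column-major doubles, as returned by glGetDoublev.
   Returns 0 when the point's clip-space w is zero. */
static int project_point(double ox, double oy, double oz,
                         const double model[16], const double proj[16],
                         const int viewport[4],
                         double *winx, double *winy, double *winz)
{
    double in[4] = { ox, oy, oz, 1.0 }, eye[4], clip[4];
    int i;

    for (i = 0; i < 4; i++)  /* eye = modelview * object */
        eye[i] = model[0*4+i]*in[0] + model[1*4+i]*in[1]
               + model[2*4+i]*in[2] + model[3*4+i]*in[3];
    for (i = 0; i < 4; i++)  /* clip = projection * eye */
        clip[i] = proj[0*4+i]*eye[0] + proj[1*4+i]*eye[1]
                + proj[2*4+i]*eye[2] + proj[3*4+i]*eye[3];

    if (clip[3] == 0.0)
        return 0;
    clip[0] /= clip[3];  /* perspective divide to NDC */
    clip[1] /= clip[3];
    clip[2] /= clip[3];

    *winx = viewport[0] + viewport[2] * (clip[0] + 1.0) * 0.5;
    *winy = viewport[1] + viewport[3] * (clip[1] + 1.0) * 0.5;
    *winz = (clip[2] + 1.0) * 0.5;  /* assumes default glDepthRange(0, 1) */
    return 1;
}
```

The resulting winx/winy are the centre pixel you'd feed into the readback test.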

01-29-2003, 08:55 AM
The occlusion query works, or you could use raycasting intersections. I implemented this in a demo a while back. Remember, you don't need the full complexity of the geometry if you're ray casting, and if you're rendering the light and reading back, you need only read back a tiny portion of the framebuffer.