Displacement Mapping/Coloring

Hey All,

A few quick questions if anyone can answer them.  My main goal is to create an arbitrarily shaped surface and color it.  The color of its pixels is given by its displacement and is very dynamic.  The shape of the surface is also very dynamic.
What I am doing right now is a very brute-force tactic of going through each bit of surface data and drawing the surface triangle by triangle, specifying the color and a normal at each vertex.  All in all it looks great, but it is running a bit on the slow side, as drawing thousands of triangles (or at least going through the nested for loops) seems to be hogging resources.
I came across the idea of displacement mapping and figured that maybe specifying a texture would be easier/faster, since the hardware might be doing a lot of the computation.  Are there any tutorials on how to displacement map?  That way I could draw a simple plane (2 triangles) and use a texture instead of drawing thousands of triangles.  Would that even be feasible?  Since I have to color the surface anyway and the color is very dynamic, I may have to go through the nested for loops for the color anyway!  I could make a texture for the colors too, but that would require a few loops to get the color data, and then how do I make sure the color ends up on the surface?  The surface itself would be a texture, so textures on a texture??!?!?  Any ideas?  I would love to have this run at a reasonable speed without downsampling.  Thanks!
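(For what it's worth, the "color from displacement" step can be isolated into one small function, so the same code can later feed either per-vertex colors or a color texture. A minimal sketch; the function name and the blue-to-red ramp are made up, not anything from OpenGL:)

```c
#include <assert.h>

/* Hypothetical ramp: map a height in [h_min, h_max] to a blue->red
 * gradient.  Out-of-range heights are clamped to the ends of the ramp. */
static void height_to_rgb(float h, float h_min, float h_max,
                          float *r, float *g, float *b)
{
    float t = (h - h_min) / (h_max - h_min);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    *r = t;          /* red grows with height  */
    *g = 0.0f;
    *b = 1.0f - t;   /* blue fades with height */
}
```

(The same function can fill a color texture instead of glColor3f calls; only the destination changes.)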

-JonW

P.S. If there are good downsampling techniques or good estimators, I would like to see them anyway.

Displacement mapping isn’t available in OpenGL.

If you have a flat surface which should be turned into a terrain-type mesh with highly dynamic positions and colors, you can specify a completely static mesh and have the heights and colors generated on the fly by a vertex program.

http://www.tooltech-software.com
I don’t know how it’s done, but it’s possible for sure…

I meant there’s no functionality in OpenGL where you put in a quad and a texture and get geometry with per-texel displacement.
Of course there are plenty of software algorithms to calculate displacement mapping and visualize it with OpenGL.

But ToolTech’s solution isn’t software, it’s programmable hardware (vertex programs).

Ok, splitting some hairs here.
You’re talking about image-based rendering? Looks similar to http://www.cgshaders.org/shaders/show.php?id=44
You render a cube and calculate the color of the cube’s faces via some projection into textures containing images taken from six views around the object.
It’s nothing I would describe as “available in OpenGL”; it needs a serious amount of software.
The same would be true for any method which calculates a new mesh and renders it with full hardware support.
As long as there’s no glTexImage2D(GL_DISPLACEMENT_MAP…) or somesuch, it’s not built-in functionality.

I thought I remembered seeing a demo somewhere, where you could do Displacement-map type stuff for ATI cards using their N-Patch extension. GL_ATI_pn_triangles looks like it’s the one, maybe. Granted, it’s not as simple as glTexImage2D(GL_DISPLACEMENT_MAP…), but I think it makes it easier than doing all the calculations yourself.

I don’t remember where exactly I saw the demo, but a google search should turn something up. I’ve also never used this extension, so I really don’t know much about it other than that I saw it used in this demo.

Originally posted by Relic:
Ok, splitting some hairs here.
You’re talking about the image based rendering? Looks similar to http://www.cgshaders.org/shaders/show.php?id=44

Yes it’s quite similar, but that demo does it per pixel. ToolTech does it per vertex. Per pixel is the way to go in the future though…


It’s nothing I would describe as “available in OpenGL”; it needs a serious amount of software.
As long as there’s no glTexImage2D(GL_DISPLACEMENT_MAP…) or somesuch, it’s not built-in functionality.

Yeah. Sure. Agreed. Just wanted to say it’s possible in hardware (I mean on gfx card).
The best solution in most of these cases is to use display lists and modify them by vertex programs IMO.

[This message has been edited by dbugger (edited 08-14-2003).]

ATI_pn_triangles only smooths triangles by subdividing them based on the normals given at the vertices.
There’s no way to get dents into a single triangle with that.

[This message has been edited by Relic (edited 08-14-2003).]

Ahh, ok. I must be remembering something wrong then.

A quick google search turned up this: http://www-li5.ti.uni-mannheim.de/~hesser/Voxelgraphik/SS2003/Shaders.pdf

And this: http://mirror.ati.com/developer/gdc/GDC2003-DisplacementMapping.pdf

The second one has a reference to N-Patches, but I guess it wasn’t exactly the ATI N-Patch extension, so maybe that’s where I saw it.

[This message has been edited by Deiussum (edited 08-14-2003).]

I wonder a little why the program is so slow… drawing thousands of triangles each frame is not a problem on modern hardware; it should still run very fast. Maybe there is something in your implementation that slows it down? What about backface culling, and culling in general? If it looks great, I would try to keep it that way and simply improve the speed. Maybe you can put all your data in a VBO, then “only” change the height and color data each frame and draw the buffer; that should at least be faster than immediate-mode rendering calls.
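(The per-frame work the VBO idea needs is just repacking the dynamic data into one interleaved array and uploading it in a single call. A sketch of the CPU side; the vertex layout and function name are assumptions, and the actual GL call is indicated in a comment:)

```c
#include <assert.h>

/* Interleave per-vertex height and color into one array so a single
 * glBufferSubData() call can update the whole (hypothetical) VBO.
 * Layout per vertex: x, y(height), z, r, g, b  ->  6 floats. */
static void pack_grid(const float *height, const float *color3,
                      int w, int h, float *out)
{
    for (int j = 0; j < h; ++j)
        for (int i = 0; i < w; ++i) {
            int v = j * w + i;
            float *p = out + 6 * v;
            p[0] = (float)i;         /* x: static grid position */
            p[1] = height[v];        /* y: dynamic height       */
            p[2] = (float)j;         /* z: static grid position */
            p[3] = color3[3*v + 0];
            p[4] = color3[3*v + 1];
            p[5] = color3[3*v + 2];
        }
    /* then, with a bound buffer:
     * glBufferSubData(GL_ARRAY_BUFFER, 0,
     *                 6 * w * h * sizeof(float), out); */
}
```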

Jan