texture mapping

Hi! I’m trying to implement texture mapping with OpenGL, but I want to use a matrix of binary data (that represents a volume) instead of an image. What can I do to pass the data?

You can use a 3D texture to store such a volume.
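A rough sketch of how the upload could look, assuming the volume is a flat array of unsigned bytes (the function name, data layout and dimensions are just placeholders, and on some platforms you need an extension loader such as GLEW to reach glTexImage3D):

```c
#include <GL/gl.h>

/* Hypothetical sketch: upload a width*height*depth array of bytes
   as a single-channel 3D texture. */
GLuint uploadVolume(const GLubyte *volumeData, int width, int height, int depth)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);

    /* linear filtering and edge clamping are typical choices for volumes */
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);

    /* one channel per voxel; GL_LUMINANCE8 keeps it simple on older GL */
    glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE8,
                 width, height, depth, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, volumeData);
    return tex;
}
```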

However, rendering volumetric data sets (like in medical applications) is quite a complex topic, so don’t expect that getting the data onto the hardware is the only step you need to take.

That said, it depends heavily on what exactly you want to do. If you need to know more, tell us what you are trying to achieve.

Jan.

What I need is to simulate a radiograph… I have a ray source (with an intensity value), a volume (represented by a matrix of values) and a film. I’ve read that it’s possible to simulate this kind of thing using textures as arrays… so I need to read the volume data from a matrix, create a texture from those values, compute the radiograph and “print” the resulting texture on a simple quad… can anyone help?

So, this is something like a medical visualization app?

I have absolutely no experience with this, so others can help you more. But you should google for “volume visualization” or “volume rendering”. AFAIK, what many people do is use a fragment shader and sample along the view ray at several positions in their 3D texture. All samples are then combined using some formula (depending on the kind of data, etc.).
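For a radiograph the combining formula is usually a Beer–Lambert style attenuation: you sum the samples along the ray (a line integral of the attenuation values) and exponentiate. A very rough sketch of such a fragment shader, written as a GLSL string the way you would embed it in C (the uniform names, how the ray entry point and direction reach the shader, and the parallel-ray assumption are all mine, not from your setup):

```c
/* Hypothetical sketch: fragment shader source that marches through the
   3D texture and accumulates attenuation along the ray. */
static const char *fragSrc =
    "uniform sampler3D volume;                                       \n"
    "uniform vec3  rayDir;      /* direction towards the film      */\n"
    "uniform float stepSize;    /* distance between samples        */\n"
    "uniform int   numSteps;                                         \n"
    "varying vec3  entryPoint;  /* where the ray enters the volume */\n"
    "void main()                                                     \n"
    "{                                                               \n"
    "    float sum = 0.0;                                            \n"
    "    vec3 pos = entryPoint;                                      \n"
    "    for (int i = 0; i < numSteps; ++i) {                        \n"
    "        sum += texture3D(volume, pos).r * stepSize;             \n"
    "        pos += rayDir * stepSize;                               \n"
    "    }                                                           \n"
    "    /* Beer-Lambert: I = I0 * exp(-integral of mu along ray) */ \n"
    "    float intensity = exp(-sum);                                \n"
    "    gl_FragColor = vec4(vec3(intensity), 1.0);                  \n"
    "}                                                               \n";
```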

So basically, yes, you only render a quad. But writing the shader that actually textures your quad is the difficult part.
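The quad itself is the easy part. In old-style fixed-function code it could look something like the following (the coordinates assume identity modelview/projection matrices and a quad that covers the viewport, and the texture coordinates just parameterize your film plane):

```c
/* Hypothetical sketch: draw the "film" quad that the shader above shades. */
void drawFilmQuad(void)
{
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();
}
```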

Jan.

Yes, it’s a medical application… I found something in the tutorials here: http://www.gpgpu.org/developer/index.shtml#conference-tutorial . Maybe you can help me modify the code…