again on displacement texmaps and lookup cubemaps

hello,
I am studying the problem of displacement texture mapping a bit. I am looking at the frustum “DisplacementTexture” demo, and I am having a hard time understanding how the lookup cubemap actually works.
I understand that it is usually useful to have precomputed normals baked into a cube map, just to do normalization on the fly…
If you have a minute, I’ll show you my questions.

Now, the displacement texture is somehow created as a 3D texture. First question: a 3D texture is just a texture with more than one layer, isn’t it? For example, if I had a piece of wood and cut it, I should see the wood color even in the cross-section, right?

More questions here: now the lookup texture is created like this:

// map cube face i, texel (x, y) to a direction vector
vec3 DisplacementTexture::getCubeVector(int i, int size, int x, int y) {
    // QUESTION 1: NOW, HERE IN THE CUBE MAP, AM I WORKING IN
    // WORLD OR TEXTURE SPACE COORDS?
    // QUESTION 2: WHY DO I ADD 0.5 TO THE X AND Y COORDS?
    float s = ((float)x + 0.5) / (float)size * 2.0 - 1.0;
    float t = ((float)y + 0.5) / (float)size * 2.0 - 1.0;
    vec3 v;
    switch(i) {
        // QUESTION 3: THIS IS JUST A USUAL NORMALIZATION CUBE MAP, ISN'T IT?
        case 0: v = vec3( 1.0,  -t,  -s); break;
        case 1: v = vec3(-1.0,  -t,   s); break;
        case 2: v = vec3(   s, 1.0,   t); break;
        case 3: v = vec3(   s,-1.0,  -t); break;
        case 4: v = vec3(   s,  -t, 1.0); break;
        case 5: v = vec3(  -s,  -t,-1.0); break;
    }
    v.normalize();
    return v;
}

// MAIN CREATION LOOP
for(int i = 0; i < 6; i++) {
    unsigned char *d = data;
    for(int y = 0; y < size; y++) {
        for(int x = 0; x < size; x++) {
            // QUESTION 4: HERE, AM I IN WORLD OR TEXTURE COORDS?
            vec3 dir = getCubeVector(i, size, x, y);
            // QUESTION 5: WHY DOES VEC HOLD JUST THE X AND Y COORDS? I AM LOOKING INTO A
            // CUBE MAP, SO SHOULDN'T I HAVE DIFFERENT CASES -> CONSTANT Z = 0???
            vec3 vec(dir.x, dir.y, 0);
            vec.normalize();
            // QUESTION 6: DOES THIS VEC3 (0,1,0) POINT TO THE "UP" DIRECTION OF OGL, OR IS IT
            // JUST THE STANDARD NORMAL OF THE SURFACE??? IN THAT CASE ISN'T Z THE VALUE I NEED
            // (X=S, Y=T, Z=NORMAL)? WHAT DOES H REPRESENT?
            float h = acos(vec3(0,1,0) * vec) / (2.0 * PI);
            if(vec.x < 0) h = 1.0 - h;
            // WHAT DOES V REPRESENT?
            float v = acos(dir * vec) / PI;

            // I GUESS 255 IS NEEDED JUST TO PACK THE VALUES CORRECTLY INTO A TEXTURE OBJECT...
            *d++ = (unsigned char)((h / (float)vertical + (int)(v * vertical) * 1.0 / (float)vertical) * 255.0);
            *d++ = (unsigned char)((h / (float)vertical + ((int)(v * vertical) + 1) * 1.0 / (float)vertical) * 255.0);
            *d++ = (unsigned char)((v * (float)vertical - (int)(v * vertical)) * 255.0);
        }
    }
    glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, GL_RGB, size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
}

that’s more or less everything; sorry if my questions seem too lame, but I am really a bit stuck on this thing…

thanks for the help
the gunslinger

Interesting.
1.) Whatever you do with it. It’s a generic vector space, right-handed, I would say.
2.) To calculate the center of the texel. This is especially important at the cubemap edges! Using the texel corner (x,y) would be wrong, because OpenGL texture lookup is done at the texel center.
3.) Yes, it looks like one.
4.) Generic. You pass texel coordinates and get back a vector from the origin to texel (x, y) on the inside of cubemap side i.
5.+6.) dir.x and dir.y specify the plane onto which the view vector is projected (z = 0) in order to calculate the view vector’s theta (0 to 2*pi) in spherical coordinates. acos() only returns values for 0 to pi, so you need to take care of angles bigger than pi with the (x < 0) case, which mirrors the result.
6.) Optimization: vec3(0,1,0) * vec == vec.y
But that looks strange. If x is the tangent and y the binormal, then z is the normal. For the spherical phi you would only need an acos().
6a.) v represents the spherical angle theta of a hemispherical view direction.
7.) I don’t know what vertical is. Some resolution step inside the “4D” texture, which is used to encode a function of (phi, theta, u, v) per pixel.

Cubemap, now that’s an idea, I need to go back to my drawing board now…

[This message has been edited by Relic (edited 02-20-2004).]

hi relic,
it is interesting…
actually, the quality of displaced and horizon-shaded bump textures is amazing.
Anyway… I don’t want to come across as a lamer… all this stuff isn’t part of my work, I am simply studying it (and getting stuck).
If you’d like to see it running, download the sources from http://frustum.org
in the 3d section (DisplacementTexture demo). There are a couple of really nice pieces of work there.
Thanks for the hints, I’ll be working on them ASAP. Cya
gunslinger

Actually the cubemap is not used for normalization but to get a dependent texture lookup coordinate for the 3D displacement texture.
E.g. you have a view direction in model space, do a basis transformation to texture space, normalize the view direction, do a lookup in the cubemap, find the texture coordinates inside the displacement map, fetch that, do some calculations for the final 2D texture lookup and lighting, “simple” as that.
Read the link to the SIGGRAPH work of Lifeng Wang and Xi Wang I posted in the great “Better bumpmapping” thread.

The code snippet you posted generates a lookup table used for horizon mapping. This table takes the form of a cube map, which maps a 3D vector to its corresponding spherical coordinates.

The getCubeVector() function takes as input the side of the cube map you’re working on (one of six), the resolution of the cube map side, and two pixel coordinates x and y within the selected side of the cube map. It returns a 3D vector that you can use as a texture coordinate in order to access the given pixel.

It’s used in the cube map creation loop to calculate the spherical coordinates corresponding to each pixel in each of the six cube map sides. The spherical coordinates are called v and h in the code, with h being the “compass heading” of the vector in the “ground plane”, and v (stands for “vertical”?) being the elevation of the vector relative to the horizon (0 degrees means it’s on the horizon, 90 degrees means it’s directly overhead).

Now for your specific questions:

  1. See above. You’re not actually in the cube map; you’re calculating the texture coordinates that point to a particular texel.

  2. What Relic said. By adding half a texel, you create a texture coordinate that points to the center of the texel rather than to its corner.

  3. No, it’s nothing of the kind, because you’re not actually constructing a cube map in that part of the code.

  4. Here, you take the current cube map side (i), resolution (size) and pixel coordinates (x,y) and calculate the corresponding texture coordinate (dir).

  5+6. The Z component is discarded as part of the spherical coordinate calculations. The dir vector is projected onto the ground plane and normalized, so you now have two vectors, vec and dir, that point in the same direction on the “compass”, but one of them (vec) lies in the horizontal plane.
Then, it seems that * is overloaded to mean a dot product. Hence, “vec3(0,1,0) * vec” calculates the cosine of the angle between vec and (0, 1, 0). Apparently Z points up instead of Y in this code, so (0, 1, 0) is a vector that points “north”.
To complete the “compass angle” calculation, the arc cosine of the dot product is taken to convert the value into an angle. At this point, h represents the compass heading of the dir vector.
Next, a similar operation is used to calculate the elevation angle v: “dir * vec” calculates the cosine of the angle between the original vector and the projected one, which corresponds to the cosine of the elevation angle. Same story: arc cosine is used to convert it to an angle.

The values are divided by 2*PI and PI respectively to map them into a [0,1] range, then expanded into [0,255] range to store them into the cube map.

Questions?

– Tom

<shameless_plug>Oh, did I mention I have much better horizon mapping code on my site, www.delphi3d.net?</shameless_plug>

– Tom

thanks to you too, Tom,
I had already stopped by your page a couple of times, but even though I like your screenshots, I don’t know Delphi, so I didn’t really look at your programs :expressionless:

My fault… I know