parallax mapping



caffeine_rv
05-11-2007, 07:58 AM
I'm having trouble applying a parallax mapping shader to non-square polygons. :confused:

For example, when I map the whole texture (a square area) onto non-square geometry (a trapezoid, say), it looks distorted.

Any thoughts on this, guys? Is it possible to apply this technique to meshes made of assorted quads, a sphere for example?

knackered
05-11-2007, 10:11 AM
you understand why it looks distorted, surely?

caffeine_rv
05-11-2007, 10:34 AM
It looks distorted when the texture region has a different shape than the geometry. All the tutorials and docs I've read use squares (GL_QUADS) and map the whole texture onto them.

What should I do when the shapes don't match? In my case it's a square in texture space and a trapezoid in world space.

I suspect the root of the problem is that the "offset" vector differs across the primitive. In the square case it is constant:

P = -E.xy * scale / E.z; (GLSL)

Any thoughts?

I don't need a general solution, just the trapezoid case...
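
For reference, this is the kind of lookup I mean; just a minimal single-step sketch with illustrative names (NMAP, scale, E are placeholders, not my actual shader):

uniform sampler2D NMAP;     // normal map with height stored in alpha
uniform float scale;        // parallax scale

varying vec3 E;             // tangent-space eye vector from the vertex shader

void main(void)
{
    // offset direction in texture space; this assumes the mapping from
    // tangent space to texture space is uniform, which only holds when a
    // square texture region sits on matching geometry
    vec2 P = -E.xy * scale / E.z;

    float h = texture2D(NMAP, gl_TexCoord[0].st).a;
    vec2 C = gl_TexCoord[0].st + P * h;

    gl_FragColor = vec4(texture2D(NMAP, C).rgb, 1.0);
}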

k_szczech
05-11-2007, 11:23 AM
Simply use a trapezoidal portion of the texture when applying it to trapezoidal geometry.
Otherwise the tangent and binormal vectors may vary across the polygon in a way you won't be able to control easily. It can even be platform dependent, since different hardware may split a quad into a pair of triangles in different ways.

Well, you actually answered yourself:

It looks distorted when the texture region has a different shape than the geometry.
At least knackered had some laugh again... ;)

caffeine_rv
05-11-2007, 11:38 AM
Simply use a trapezoidal portion of the texture when applying it to trapezoidal geometry.
That's the wrong way.

The trick is to map a square onto a trapezoid. I'm applying parallax to a partial-sphere mesh which consists of a set of trapezoids. The whole image is rectangular; I cut out square subimages and pass them as textures to these trapezoidal parts.

Imagine what that would turn into with trapezoidal texture parts %)

caffeine_rv
05-11-2007, 11:42 AM
Otherwise the tangent and binormal vectors may vary across the polygon in a way you won't be able to control easily
Why do you think so? Actually, I compute the TBN matrix in the vertex shader (it's easy for a sphere; I only need the normal, which is already specified with the geometry). Then everything is interpolated and passed to the fragment shader...

dorbie
05-11-2007, 06:21 PM
Subdivide, or use the homogeneous 4th texture coordinate q to apply the "trapezoidal" projection.

Jackis
05-12-2007, 05:11 AM
You cannot simply apply a square texture to a trapezoid without some perspective tricks, just as dorbie said.

Also, I don't know how parallax would deal with projective texturing.

Here is a link on the fourth texture coordinate:
http://www.r3.nu/~cass/qcoord/
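
Roughly, the idea (just a sketch, not tied to that page's exact code): the application supplies homogeneous texture coordinates per vertex, e.g. glMultiTexCoord4f(GL_TEXTURE0, s*q, t*q, 0.0f, q), with q at each vertex proportional to the length of the parallel trapezoid edge it lies on. A pass-through vertex shader is then enough; the division by q happens at lookup time, either with texture2DProj or manually in the fragment shader:

void main(void)
{
    // gl_MultiTexCoord0 already holds (s*q, t*q, 0, q); pass it through untouched
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = ftransform();
}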

caffeine_rv
05-12-2007, 05:36 AM
Thanks for the link, Jackis. Maybe this is it...

However, I thought glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST) does the same thing, doesn't it? I tried it, but with no luck...

Bob
05-12-2007, 07:05 AM
The perspective correction hint has nothing to do with it. The way the texture coordinates are interpolated over the quad is wrong to begin with. If the texture coordinates aren't what you want them to be in the first place, you can't just hint for them to magically become correct. You need perspective-correct texture coordinates in the first place to get perspective-correct interpolation.

Jackis
05-12-2007, 07:24 AM
No, perspective correction is always enabled on modern chipsets; this hint actually does nothing. And the main thing is that you can't simply apply a rectangular texture region to a trapezoid.

EDIT: Bob was the first one

k_szczech
05-12-2007, 09:12 AM
Otherwise the tangent and binormal vectors may vary across the polygon in a way you won't be able to control easily
Why do you think so?
That's why:
http://ks-media.ehost.pl/opengl_org/trapezoidtangent.png
Each quad is rendered as two triangles, and that makes the tangent space change abruptly in the middle of the GL_QUAD if the texture region's shape differs from the polygon's.
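
For reference, a per-face tangent and binormal are derived from the position and texcoord deltas roughly like this (a GLSL-style helper shown only for notation; not code from this thread). Since both depend on the texcoord deltas, the two triangles of the quad end up with different frames as soon as the texture region and the polygon have different shapes:

void faceTB(vec3 p0, vec3 p1, vec3 p2,
            vec2 uv0, vec2 uv1, vec2 uv2,
            out vec3 T, out vec3 B)
{
    vec3 e1 = p1 - p0;                  // position deltas
    vec3 e2 = p2 - p0;
    vec2 d1 = uv1 - uv0;                // texcoord deltas
    vec2 d2 = uv2 - uv0;

    float r = 1.0 / (d1.x * d2.y - d2.x * d1.y);
    T = normalize((e1 * d2.y - e2 * d1.y) * r);   // direction of increasing s
    B = normalize((e2 * d1.x - e1 * d2.x) * r);   // direction of increasing t
}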

caffeine_rv
05-14-2007, 09:40 AM
OK, I implemented 'q' coordinate usage, but only for the 's' coordinate, to prevent shrinking in the vertical direction. It looks much better, but not perfect. See this screenshots gallery (http://picasaweb.google.co.uk/sima.rv/Screenshots). Notice the hills bending in the second image.

I also stopped using smoothed normals.
What am I doing wrong? k_szczech, could you please help with proper TBN handling in this case? Is there any applicable solution?

Jackis
05-14-2007, 12:27 PM
One little question...
You said you did the perspective division only for the 's' component, so, as I understand, you had to do it manually, leaving 't' as is, right?
I ask because the OpenGL pipeline normally divides all three 'str' components by 'q' when a projective texture lookup is made.

About your pictures: it seems that somewhere in your formulas you forgot that you are working in a (partly) projected space. I mean, you apply the texture shift before/after the projective lookup is done, and you don't account for the different norms of the texture-coordinate derivatives.
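
To illustrate what that division means (a sketch; 'tex' is a placeholder sampler name): when gl_TexCoord[0] holds (s*q, t*q, 0, q), these two lookups are equivalent, because texture2DProj divides s, t and r by q before sampling.

uniform sampler2D tex;

void main(void)
{
    vec4 byProj   = texture2DProj(tex, gl_TexCoord[0]);
    vec4 byDivide = texture2D(tex, gl_TexCoord[0].st / gl_TexCoord[0].q);

    gl_FragColor = byProj;   // byDivide gives the same colour
}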

k_szczech
05-14-2007, 01:25 PM
Your problem doesn't seem to be related to the situation I showed in the picture.
It looks more like a bug in your code: some value is not divided by another, or something like that.
It looks as if, when performing the displacement, your shader travels faster in the t coordinate than in the s coordinate.

By the way: for testing purposes you should probably use a texture with a regular shape; it's easier to find small non-linear distortions that way.
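
One quick way to see it (a debug sketch with illustrative names; adapt them to whatever your shader actually uses): skip the lighting and output the first parallax step as a colour. If red (the s step) and green (the t step) stretch at different rates across the patch, your step vector is being warped inconsistently with your coordinates.

uniform sampler2D heightMap;   // height stored in alpha

varying vec2 P;                // texture-space parallax direction
varying float q;               // whatever per-vertex factor you apply to s

void main(void)
{
    vec2 tc = gl_TexCoord[0].st;
    tc.s /= q;                             // however you currently undo the s scaling

    float h = texture2D(heightMap, tc).a;
    vec2 offset = P * h;                   // the first parallax step

    // red = |s step|, green = |t step|, scaled up to be visible
    gl_FragColor = vec4(abs(offset) * 8.0, 0.0, 1.0);
}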

caffeine_rv
05-15-2007, 01:17 AM
you had to do it manually, leaving 't' as is, right?
Yep, I specify only the tweaked 's' and 'q' coordinates in glMultiTexCoord4f(...). But in the fragment shader I divide only 's': gl_TexCoord[0].s /= gl_TexCoord[0].q. As I said, I chose this approach to avoid the vertical shrinking shown in the link from your earlier post.

use a texture with a regular shape
I tried that, right at the beginning of my investigation: the full texture on square geometry. It works well.

Besides, I can see some distortions even if I map a square texture onto a rectangle (non-square geometry).

I suppose this can be a result of:

travels faster in the t coordinate than in the s coordinate

PS: I tried out different divisions of the offset vector in the fragment shader. This is a really annoying thing... :(

caffeine_rv
05-15-2007, 03:00 AM
rectangle & square (http://picasaweb.google.com/sima.rv/Screenshots2/)

The rectangle is 2:1 (x:y).

k_szczech
05-17-2007, 05:14 AM
Pay close attention to how you calculate the vector used for travelling across texture space.
Look at which values you use for s, t and q (depth).
Perhaps s is taken from a normalized vector and t from an unnormalized one?

Simply compare the code path for s and t with the code path for q.

caffeine_rv
05-17-2007, 08:22 AM
I found a way to resolve the rectangle case: I scale the P vector in that direction.

But the real problem is how to scale it in the case of a trapezoid. Even the step is not constant during the search. (This causes the non-linear bending of the hills in my screenshots; in the rectangle case it is linear.)

All these non-linear mappings make me sick :eek: . I have no idea how to apply the 'q' coordinate to the P vector. Guys, it's urgent. Any links would be useful (I haven't found any on this topic yet...).

I need your help...


Simply compare the code path for s and t with the code path for q.
Could you please give an example of this? In my case (the geometry is a sphere) I can compute the q coordinate in the vertex shader:

q = sqrt(1.0 - gl_Vertex.y * gl_Vertex.y);
gl_TexCoord[0].s *= q;
In the fragment shader:

gl_TexCoord[0].s /= q;
This way I got rid of specifying q in the program at all... It really works.

caffeine_rv
05-18-2007, 04:55 AM
Here (http://picasaweb.google.com/sima.rv/Differences) you can see pairs of "should be" and "currently is" pictures. Notice that at the first search level (height is 1.0) they are similar, but at the next levels there are obvious differences (non-linear!).

"Should be" examples I rendered using non-square texture region (affine texture mapping).

I've found Paul S. Heckbert's "Fundamentals of Texture Mapping and Image Warping"; it has some useful info on bilinear / affine / projective mappings. However, I can't figure out how to apply the non-linear changes to P. Any new ideas?
It's definitely _not_ a "normalize" issue...
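
If it helps, the general 2-D projective (perspective) mapping discussed there has roughly the homography form below (LaTeX notation); the affine mapping is the special case g = h = 0, which is why affine interpolation alone can't reproduce a square-to-trapezoid warp:

u = \frac{a x + b y + c}{g x + h y + 1}, \qquad v = \frac{d x + e y + f}{g x + h y + 1}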

caffeine_rv
05-24-2007, 10:00 AM
Here is some code from my test application.

Fragment shader:

uniform sampler2D NMAP;

varying vec4 L;
varying vec4 H;
varying vec2 P;
varying float q;

const int ns = 40;

void main(void)
{
    // undo the per-vertex "s *= q" (see the vertex shader below)
    gl_TexCoord[0].s /= q;

    float dB = 1.0 / float(ns);           // height step per search iteration

    vec2 dC = P * dB;                     // texture-space step per iteration
    vec2 C = gl_TexCoord[0].st + dC;

    float cH = texture2D(NMAP, C.st).a;   // current sampled height
    float pH = 0.0;                       // previous sampled height
    float B = 1.0 - dB;                   // current search height

    // linear search: step until the ray drops below the heightfield
    while (cH < B)
    {
        pH = cH;
        B -= dB;
        C += dC;
        cH = texture2D(NMAP, C.st).a;
    }

    // interpolate between the last two samples to refine the hit point
    float t = (B + dB - pH) / (cH - pH + dB);

    C += P * (dB * t - dB);

    vec4 N = texture2D(NMAP, C.st);

    // unpack and normalize the tangent-space normal
    vec3 Nt = normalize(N.xyz * 2.0 - 1.0);

    float diff = max(dot(Nt, normalize(L.xyz)), 0.0);
    float spec = max(dot(Nt, normalize(H.xyz)), 0.0);

    spec = pow(spec, 512.0);

    gl_FragColor = gl_FrontMaterial.ambient + (diff *
        gl_FrontLightProduct[0].diffuse + spec *
        gl_FrontLightProduct[0].specular);
    gl_FragColor.a = smoothstep(0.0, 1.0, N.a);
}

Vertex shader:

const vec4 EYE = vec4(0.0, 0.0, 0.0, 1.0);
const vec4 SRC = vec4(0.0, 0.0, 0.0, 1.0);

varying vec4 L;
varying vec4 H;
varying vec2 P;
varying float q;

uniform float s;

void main(void)
{
    // tangent frame built from the sphere normal
    vec3 N = normalize(gl_Normal);
    vec3 T = normalize(vec3(N.z, 0.0, -N.x));
    vec3 B = cross(N, T);

    // inverse TBN (object space to tangent space); with GLSL's column-major
    // constructor this puts T, B, N into the rows of the matrix
    mat4 iTBN = mat4(
        T.x, B.x, N.x, 0.0,
        T.y, B.y, N.y, 0.0,
        T.z, B.z, N.z, 0.0,
        0.0, 0.0, 0.0, 1.0
    );

    // eye and light vectors, taken into tangent space
    vec4 E = gl_ModelViewMatrixInverse * EYE - gl_Vertex;
    E = iTBN * E;

    L = gl_ModelViewMatrixInverse * SRC - gl_Vertex;
    L = iTBN * L;

    H = E + L;                            // half vector

    P = -E.xy * s / E.z;                  // parallax offset direction

    // q = radius of the latitude circle on the unit sphere
    q = sqrt(1.0 - gl_Vertex.y * gl_Vertex.y);

    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_TexCoord[0].s *= q;
}

Rendering routine:

glBegin(GL_QUADS);
for (int i = 0; i < T; ++i)
    for (int j = 0; j < P; ++j)
    {
        // flat normal: average of the quad's four corners (on the unit
        // sphere the positions double as normals)
        float3 N = 0.25f * (
            vertices[(i + 0) * (P + 1) + j + 0] +
            vertices[(i + 0) * (P + 1) + j + 1] +
            vertices[(i + 1) * (P + 1) + j + 1] +
            vertices[(i + 1) * (P + 1) + j + 0]);

        glNormal3fv(N);

        glTexCoord2f(0.0f, 0.0f);
        glVertex3fv(vertices[(i + 0) * (P + 1) + j + 0]);
        glTexCoord2f(1.0f, 0.0f);
        glVertex3fv(vertices[(i + 0) * (P + 1) + j + 1]);
        glTexCoord2f(1.0f, 1.0f);
        glVertex3fv(vertices[(i + 1) * (P + 1) + j + 1]);
        glTexCoord2f(0.0f, 1.0f);
        glVertex3fv(vertices[(i + 1) * (P + 1) + j + 0]);
    }
glEnd();

Here vertices is an array of points on the unit sphere (so the normal vectors are in fact the same values), and float3 N = 0.25f * ... is the flat-normal calculation for the quad.

Jackis
05-24-2007, 10:32 AM
caffeine_rv

As I see it, the problem now is that you calculate your texture coordinates with some perspective hacks, BUT you don't apply those hacks to the other interpolators (L, H, P; especially P, because, as I understand it, P is the texture-space parallax direction, and it has to follow the same rules as the texture coordinates). So it is clear why the picture is wrong.

Could you please post a simple test program, so we can play a little with your pixel shaders and see what we can fix? It's too tedious to write a test program around your shaders ourselves.
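
For P specifically, one way to read this (just a minimal single-step sketch against your variable names; the full linear search would use the rescaled direction everywhere it currently uses P): whatever scaling you apply to the s coordinate also has to be applied to the s component of the search direction, otherwise the step is measured in a different space than the coordinate it is added to.

uniform sampler2D NMAP;

varying vec2 P;        // tangent-space parallax direction from the vertex shader
varying float q;       // the same interpolated q

void main(void)
{
    vec2 tc = gl_TexCoord[0].st;
    tc.s /= q;                 // undo the vertex-shader "s *= q", as you already do

    vec2 Pw = P;
    Pw.s /= q;                 // rescale the search direction the same way

    float h = texture2D(NMAP, tc).a;
    vec2 C = tc + Pw * h;      // single step for brevity

    gl_FragColor = vec4(texture2D(NMAP, C).rgb, 1.0);
}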