how would you bumpmap a sphere?

i’m trying to generate tangent space basis vectors like this (i hope you get the idea)

S vector: i subdivide my sphere into parallels; seen from above you get many circles, and for each vertex my s vector is tangent to these circles, pointing counter-clockwise

T vector: i subdivide my sphere into meridians, my t vector is tangent to HALF meridians, going from south pole to north pole

SxT vector: i simply use vertex normal

i’m aware of the fact that this could cause some singularities at the poles but that’s not the problem i’m discussing in this topic

unfortunately this method seems to generate incorrect results with directional lights (under some directions) and in particular with point lights
i mean, imagine i’m using a bubble-like normal map: some bubbles reflect light as if they were convex, some as if they were concave

working with cylinders or toruses causes no similar problem

i also tried to get my sxt vector by crossing s and t, and even tried to get the t vec by crossing the vertex normal and the s vec, but no luck

every vector i’m working with is normalized so that shouldn’t be the problem

i’ve also looked into the nvidia sdk and i can see that they generate basis vectors with the algorithm kilgard described in his bumpskin paper, but i don’t really get how the algorithm works (it also contains 12 variables for 9 equations?!?) and i guess there’s a much simpler way to generate basis vecs for a sphere

any hint on how to map it correctly would be highly appreciated

thanks

[This message has been edited by tellaman (edited 02-04-2003).]

Well, you’re labelling s & t differently from what I’d expect, but it depends on how you’ve mapped the texture. The handedness of your coordinate frame may also be an issue.

I would have assumed that S maps radially and T vertically, i.e. S covers 360 degrees and T 180.

The frame is normally made of your Tangent, Binormal and Normal. These are the ‘correct’ names for the vectors you are trying to generate. The Normal you know; the Tangent is the derivative of S and the Binormal the derivative of T.

So you have the sphere coordinates (around the origin I assume); normalize them to give you the Normal. Your Tangent vector is typically the derivative of the S texture coordinate, and most 3-component bump maps are built with this assumption: RGB -> TBN (tangent, binormal, normal). Hence the light blue appearance of most DOT3 bump maps: .5 .5 1 -> 0, 0, 1 after the scale and bias, which is the flat surface basis that then gets jittered for all DOT3 maps.

Anyway, I digress. What you need to do is take the derivative of S for your Tangent vector. This is easy for a sphere: it runs at a tangent to the rings of latitude, or ‘parallels’ as some call them. So you can compute it by taking the vertices before and after the vertex in question on the ring and using the vector between them, normalized.

On your Sphere the Binormal is then just the cross product of the Normal and Tangent vector. B=NxT

This TBN coordinate frame should work for a sphere textured as I assumed at the outset and a normal map in the form that everyone uses today.

thanks a lot for your reply
i’m gonna look into it again though i’m sure i’m doing everything as you described

see you in a while


first of all thanks a lot for your reply :>

i forgot to mention i had even tried the derivative approach, but failed all the same since it gives (almost) the same result as the technique i described in my post

and it’s basically the same thing as mapping the vectors the way you said

anyway i’ll get a little more in depth:

sphere equation:

alpha running from 0 to PI
theta running from 0 to 2*PI

ven[a][b][0]=sin(alpha)*cos(theta); /* x */
ven[a][b][1]=cos(alpha);            /* y (alpha measured from the north pole) */
ven[a][b][2]=sin(alpha)*sin(theta); /* z */

this works for both vertex and normal (unit-radius sphere)

tangent vector:

vs[a][b][0]=cos(theta+pi/2); /* = -sin(theta) */
vs[a][b][1]=0;
vs[a][b][2]=sin(theta+pi/2); /* =  cos(theta) */

which gives the same result as your description

then for the binormal, vt = ven x vs:
vt[a][b][0]=ven[a][b][1]*vs[a][b][2]-vs[a][b][1]*ven[a][b][2];
vt[a][b][1]=ven[a][b][2]*vs[a][b][0]-vs[a][b][2]*ven[a][b][0];
vt[a][b][2]=ven[a][b][0]*vs[a][b][1]-vs[a][b][0]*ven[a][b][1];

my texture is wrapped linearly:

set the texture coordinate at the vertex described by (a,b) to {cl(a),cl(b)}, where cl() maps an integer index linearly into [0,1] and returns a float

but this definitely doesn’t work
i mean, imagine a situation in which i’ve got a single directional light (hence no attenuation) at, say, (0,2,0) and pointing to (0,0,0):

this always gives a .5 value along parallels (a dot product of 0, scaled and biased into [0,1]) and values between 0 and .5 along meridians: http://www.lilien.it/temp/mewire.jpg
and http://www.lilien.it/temp/mesolid.jpg
(you can even pick the color along parallels and meridians in photoshop and see that it behaves as i said)
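in other words, the per-vertex color i’m looking at is just the light direction dotted with each basis vector, scaled and biased into [0,1] — standalone sketch, helper names are mine:

```c
#include <assert.h>
#include <math.h>

typedef struct { float r, g, b; } rgb;

static float dot3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* encode the (normalized) light direction in the per-vertex s/t/n frame
   as an RGB color, the usual DOT3 scale-and-bias */
static rgb basis_color(const float L[3], const float s[3],
                       const float t[3], const float n[3])
{
    rgb c;
    c.r = 0.5f + 0.5f*dot3(L, s); /* s has y=0, so L=(0,1,0) gives .5 everywhere */
    c.g = 0.5f + 0.5f*dot3(L, t);
    c.b = 0.5f + 0.5f*dot3(L, n);
    return c;
}
```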

but this doesn’t work once you dot it with a normalmap
so i had a look at the nvsdk and saw that in exactly the same situation their colormap looks like this:
http://www.lilien.it/temp/nvwire.jpg
and http://www.lilien.it/temp/nvsolid.jpg

i’ve used the same photoshop trick, just to find out that the basis vectors they are generating are somewhat confusing, to say the least

where am i going wrong?
and has anybody ever tried to bump a sphere?

thanks again

anyone?
is this too hard to do?
please i need some help :>