normal rescaling

about GL_RESCALE_NORMAL… is it part of the OpenGL 1.2 spec?

Dolo//\ightY

just found this in one of Mark Kilgard’s articles:

“OpenGL 1.2 adds a new glEnable mode called GL_RESCALE_NORMAL”

Thanks, mike. Another thing: in my application I render planets from unit spheres. As you know, for a unit sphere centered at the origin, each vertex and its normal are the same vector.
So I uniform-scale to the size I need, and obviously OpenGL scales the normals too.
So I enable GL_NORMALIZE, and what do I find?
Everything renders fine.
And performance?
Performance is the same! Better, actually: with normalization enabled the application is faster, by about 1 fps!
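
For reference, the setup boils down to this sketch (drawUnitSphere() just stands in for my sphere code, and radius is whatever the planet needs):

GLfloat radius = 2.0f;             // hypothetical planet radius
glEnable(GL_NORMALIZE);            // GL renormalizes the scaled normals
glPushMatrix();
glScalef(radius, radius, radius);  // uniform scale: the normals get scaled too
drawUnitSphere();                  // unit sphere, so normal == vertex position
glPopMatrix();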

The card is a GeForce.
A TNT shows the same behaviour.
The sphere has 950 vertices and 1600 faces, more or less.
I use a vertex list to send the data.

Now I’ll test on a G200, and soon I’ll try a bigger model and straight OpenGL, to see if I’m near the point where the cost of sending the data and the cost of normalization become comparable.
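
In case anyone wants to repeat the comparison, I’ll time it roughly like this (a sketch using GLUT’s millisecond timer; drawPlanet() is a placeholder for my render call):

int i, start, ms;
start = glutGet(GLUT_ELAPSED_TIME);     // needs glut.h
for (i = 0; i < 1000; i++) {
    drawPlanet();                       // with GL_NORMALIZE on; repeat with it off
}
glFinish();                             // wait until the GL has really done the work
ms = glutGet(GLUT_ELAPSED_TIME) - start;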

Any experience with this?

tnx

Dolo//\ightY

I’m using triangles to create some funky-looking spheres in my current project, and it’s pretty much the same as the example code out of the Red Book - the chapter where a sphere is created by subdivision - and yes, I had to scale them down.

I just put it in a display list:

// innermost spikeball
glNewList(dList+2, GL_COMPILE);
glEnable(GL_NORMALIZE);         // renormalize the scaled normals
glPushMatrix();                 // keep the scale from leaking out of the list
glScalef(0.25f, 0.25f, 0.25f);
drawSpikeBall();
glPopMatrix();
glDisable(GL_NORMALIZE);
glEndList();
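
Since my scale is uniform, I might try the GL_RESCALE_NORMAL mode mentioned above when I do the rework - as far as I can tell it just divides by the single scale factor instead of doing a full renormalize, so it should be cheaper (assuming the headers and drivers expose 1.2):

glEnable(GL_RESCALE_NORMAL);    // OpenGL 1.2; correct only for uniform scales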

If you want to see what it looks like, I released it as a screensaver a few months ago: http://www.3dfiles.com/screensavers/borealis2.shtml

I’m starting on a rework because I’ve gotten emails from GeForce owners about the problems from our previous discussion.