Texture coordinate generation

A philosophical question and a problem.
Question first…
Which is better from a performance and flexibility perspective: using the texture coordinate generation techniques provided in OpenGL, or assigning texture coordinates manually? I've struggled to understand the literature on this one and can't quite grasp it. I also can't find a decent code example anywhere.

Which leads me to my second question:
What's the best method for applying a single texture to a mesh? I'm thinking of generating the texture coordinates by hand, but I'm wondering if there's a trick or something I'm missing.
How do the Unreal/Quake/Half-Life boyz do it? Is there a human outline or a region in the rectangular texture that they use, generating texture coordinates only inside that region?

Where can I go for help? This board is my last hope. Thanks, my friends.

Francis

Francis,

Texture coordinate generation is a simple way of saying that the texture coordinates can be expressed as a function of other vertex properties. For example, if you have an object whose vertex x and y coordinates range from 0 to 1, then you may want to map the s and t texture coordinates exactly to those spatial coordinates. The fact that you can pass in a plane for each texture coordinate component lets you encode any 4x4 matrix for the conversion from [x y z w] -> [s t r q]. One nice feature is that you can use either the eye-space or object-space coordinates of the object. Another is that texture coordinates may be a function of any vertex parameters - for example, sphere mapping and reflection mapping require both the position and the normal to calculate the texture coordinates.
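Since Francis mentioned not being able to find a code example, here is a minimal sketch of the "map s and t to x and y" case above, using object-linear texgen in legacy fixed-function OpenGL (the function name is just for illustration):

#include <GL/gl.h>

/* Drive s and t straight from the object-space x and y of each vertex:
   s = 1*x + 0*y + 0*z + 0*w,  t = 0*x + 1*y + 0*z + 0*w */
void setupObjectLinearTexGen(void)
{
    GLfloat sPlane[4] = { 1.0f, 0.0f, 0.0f, 0.0f };
    GLfloat tPlane[4] = { 0.0f, 1.0f, 0.0f, 0.0f };

    glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
    glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
    glTexGenfv(GL_S, GL_OBJECT_PLANE, sPlane);
    glTexGenfv(GL_T, GL_OBJECT_PLANE, tPlane);

    glEnable(GL_TEXTURE_GEN_S);
    glEnable(GL_TEXTURE_GEN_T);
}

Swap GL_OBJECT_LINEAR/GL_OBJECT_PLANE for GL_EYE_LINEAR/GL_EYE_PLANE if you want the eye-space behaviour mentioned above; sphere mapping uses GL_SPHERE_MAP and needs no planes at all.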

As for which is better to use, it depends. If you are using an accelerator that has hardware T&L, then texgen is virtually free. If not, then it probably has to be calculated (by the driver) on the CPU. If you specify the texture coordinates manually, or if the driver computes them on the CPU, then the coordinates must be sent over the bus, which costs bandwidth. If you have hardware T&L, using texgen costs practically nothing and saves bandwidth.
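For comparison, the manual path just means your code supplies the coordinates itself. A quick immediate-mode sketch (again, purely illustrative): for a unit quad in the x/y plane this gives the same result as the texgen setup above, but every glTexCoord2f is data that has to travel over the bus.

#include <GL/gl.h>

/* Manually specified texture coordinates, one glTexCoord2f per vertex. */
void drawTexturedQuad(void)
{
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(0.0f, 0.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f(1.0f, 0.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex3f(1.0f, 1.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f(0.0f, 1.0f, 0.0f);
    glEnd();
}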

Anyway, hope that helps!

Cass

A good way to texture a mesh is to use 3D Studio MAX, which has powerful and simple tools for applying textures (i.e., assigning texture coordinates).

AFAIK many game developers use MAX for this:
I'm sure about id Software (the Quakes) and Valve (Half-Life).

Dolo//\ightY

I'm trying to use 3DS Max to get textures mapped correctly. I'm fine with Max, but when I move my object to OpenGL, the texture coordinates are all messed up.

I'm using 3D Exploration to convert from a .3ds file to a C++ display list, but it doesn't seem to be messing with my data…

Any idea what my problem might be? And is there an easier way to use MAX files in OpenGL other than exporting to .3ds and then converting to C++?

What do s, t, r, and q mean?
I'd like to make an environment mapping effect with OpenGL. Well, I can do it =) but the problem is that it's really fake env mapping and I don't know how to make it real… like using the normals and such, or even so that when the object passes in front of a plane we can see it correctly env-mapped in that plane… can anyone clear this up for me, please?
thanks