
local dereferencing for vertex arrays



dmy
02-27-2000, 02:36 AM
i don't think an extension like this exists yet, though it would be quite important for implementing models easily.

this could be applied to the existing vertex array scheme, and would lead to some new specialized VA formats.

local vertex dereferencing.

suppose we have to render a model with lighting and a texture applied.
in this case we could do these operations:
-feed opengl the array of vertices to be T&L'ed
-feed the array of texture coordinates
-feed the array of triangle indices (3 ints per triangle, say)
-tell opengl to render the triangles.

the system would first do T&L on the vertices, then transform the texture coordinates.
finally it would start drawing triangles by taking the indices and dereferencing them (on the card).

this way, no triangle fanning or stripping would be necessary; only the required information would be sent and processed.

the hardware must, of course, be able to hop around in memory, but it can be specialized to do only this sort of work.

i think it could save much time, by avoiding the costly repeated T&L of shared vertices, and save transfer bandwidth as well.

this could also be a way to overcome the inability of vertex arrays to specify different texture coordinates per shared vertex.

what do you think?

Dolo/\/\ightY