View Full Version : Indices for indexed primitives with VAR



TheMyriad
01-18-2003, 08:46 PM
I was playing around with NVIDIA's VAR extension (NV_vertex_array_range) on a GF3 Ti200 and I noticed this:
If I store my indices (I'm using indexed triangles) as well as my vertex data in AGP memory, I end up with a lower framerate than if I'd used system memory for everything.
If I store the indices in system memory and the vertex data in AGP memory, then I get the expected increase in framerate.
Is that the usual behavior for VAR?
If so, is there any better way to store the indices in AGP/video memory for better performance?
Thanks again everyone!

Jan
01-19-2003, 12:05 AM
Read NV's VAR spec. It says:
DON'T STORE THE INDICES IN AGP OR VIDEO MEMORY!

And if you think about it, it's perfectly logical: your program isn't supposed to use the card's memory for anything other than textures and vertex data. You only tell the card which vertices to use. If those vertices are on the card, rendering is faster. But glDrawElements is still processed on the CPU, that is, outside the graphics card. So if it has to work with indices that live on the card, it has to read those values back, and CPU reads from AGP or video memory are slow.
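[Editor's note: a minimal sketch of the layout Jan describes, not code from the thread. It assumes a valid GL context, that the NV_vertex_array_range entry points (wglAllocateMemoryNV, wglFreeMemoryNV, glVertexArrayRangeNV) have already been fetched via wglGetProcAddress, and omits all error checking.]

```c
#include <string.h>
#include <GL/gl.h>

#define AGP_BYTES (1024 * 1024)

void draw_with_var(const GLfloat *vertices, size_t vertexBytes,
                   const GLushort *sysIndices, GLsizei indexCount)
{
    /* Low read frequency and a priority below 1.0 ask the driver for
     * AGP rather than video memory (see the NV_vertex_array_range spec). */
    void *agp = wglAllocateMemoryNV(AGP_BYTES, 0.0f, 0.25f, 0.5f);

    glVertexArrayRangeNV(AGP_BYTES, agp);
    glEnableClientState(GL_VERTEX_ARRAY_RANGE_NV);

    /* The vertex data goes into the AGP block... */
    memcpy(agp, vertices, vertexBytes);
    glVertexPointer(3, GL_FLOAT, 0, agp);
    glEnableClientState(GL_VERTEX_ARRAY);

    /* ...but the indices stay in plain system memory, because the
     * driver walks them with the CPU before submitting vertices. */
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, sysIndices);

    wglFreeMemoryNV(agp);
}
```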

Jan.

knackered
01-19-2003, 05:20 AM
I believe there will be an extension to VAR for index arrays soon.

ScottManDeath
01-19-2003, 06:39 AM
Originally posted by knackered:
I believe there will be an extension to VAR for index arrays soon.


Hi

It's called NV_element_array and will be supported in hardware on the GeForce FX. It's currently available in the NV30 emulation drivers. The preliminary spec is on http://www.nvidia.com.
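[Editor's note: a hedged sketch of the preliminary NV_element_array interface mentioned above; the function names come from the preliminary spec and may change in final drivers. With this extension the GPU dereferences the index array itself, so the indices may live inside the VAR allocation.]

```c
#include <string.h>
#include <GL/gl.h>

void draw_with_element_array(GLushort *agpIndices, /* region inside the VAR block */
                             const GLushort *indices, GLsizei indexCount)
{
    /* Copy the indices into VAR (AGP/video) memory... */
    memcpy(agpIndices, indices, indexCount * sizeof(GLushort));

    /* ...and let the hardware fetch them directly. */
    glElementPointerNV(GL_UNSIGNED_SHORT, agpIndices);
    glDrawElementArrayNV(GL_TRIANGLES, 0, indexCount);
}
```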

Bye
ScottManDeath