Hi all
I’ve got a GeForce2 GTS and the new Detonator 12.60 Windows 2000 drivers. I wrote a vertex program (in OpenGL) that interpolates between two sets of vertices according to a parameter t (used for keyframe-based avatar animation)… and it ran DEAD SLOW! I then cut out as much code as possible to see where the slowdown was, and this is as far as I got.
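For context, the morph itself is just a per-vertex linear interpolation; a minimal CPU-side sketch (function and parameter names are hypothetical, not from my actual code) of what the vertex program is meant to replace:

```c
#include <stddef.h>

/* Blend two keyframes of vertex data: out = (1 - t) * key0 + t * key1.
   key0, key1 and out are arrays of 3 * numVerts floats (x, y, z per vertex). */
static void morph_keyframes(const float *key0, const float *key1,
                            float *out, size_t numVerts, float t)
{
    for (size_t i = 0; i < 3 * numVerts; ++i)
        out[i] = (1.0f - t) * key0[i] + t * key1[i];
}
```

The vertex program version should do the same blend per vertex on the card instead, which is why I expected it to be faster, not slower.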
My vertex program is very simple …
char VPmesh_deformation [] =
"!!VP1.0 # This is the vertex program.\n"
// transform vertices to homogeneous clip space
"DP4 o[HPOS].x, c[0], v[OPOS];\n"
"DP4 o[HPOS].y, c[1], v[OPOS];\n"
"DP4 o[HPOS].z, c[2], v[OPOS];\n"
"DP4 o[HPOS].w, c[3], v[OPOS];\n"
// move texture pts
"MOV o[TEX0], v[TEX0];\n"
// set color to (1.0, 1.0, 1.0, 1.0)
"MOV o[COL0], c[5];\n"
"END";
…and it’s called every frame by:
glBindProgramNV(GL_VERTEX_PROGRAM_NV, vpid);
glEnable(GL_VERTEX_PROGRAM_NV);
glVertexAttribPointerNV( 0, 3, GL_FLOAT, 0, vertices );
glEnableClientState( GL_VERTEX_ATTRIB_ARRAY0_NV );
glDrawElements( GL_TRIANGLES, numElements, GL_UNSIGNED_INT, indexes);
glDisableClientState( GL_VERTEX_ATTRIB_ARRAY0_NV );
glDisable(GL_VERTEX_PROGRAM_NV);
Merely enabling the vertex program cuts the fps in half!
E.g. rendering normally using element arrays I get 100fps, but invoking the vertex program (even though it does hardly any work) drops it to 50fps.
I can only think it’s something to do with how the vertices are sent (e.g. glVertexAttribPointerNV( 0, 3, GL_FLOAT, 0, vertices )), or maybe the vertex program extension isn’t optimised for my graphics card?
Anyone else doing anything similar, or anyone got an idea of why it’s so damn slow?
Thnx
Lloyd
Charismatic Project, UEA Norwich