problem with glDrawArrays

When I specify a value over 64k for the first element (the second parameter), I get unexpected results. I suspect the driver is treating that int as a short, even though GLint is defined as a 32-bit int in my OpenGL headers. Is this due to nVIDIA's OpenGL driver? Other than splitting my vertex buffer in two, is there another way to fix this issue?
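
For reference, the calls look roughly like this (the vertex data, primitive type, and numbers are made up for illustration); everything is fine until first goes past ~65535:

/* illustrative only: made-up data and counts */
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertexData);
glDrawArrays(GL_TRIANGLES, 70000, 3000);   /* first = 70000, i.e. > 64k */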

Well, I'm handling this roughly the following way and it works fine for me (pseudo code):
if (start + count > hardwarefeatures.highestvertexindex)
{
    // re-specify the vertex pointer so it begins at 'start'...
    setvertexpointer((char*)data + start * vertexsize);
    // ...and then draw from index 0 of that rebased pointer
    start = 0;
}
DrawArrays(start, count);
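
For concreteness, here is a minimal sketch of that trick using the standard vertex-array calls; the function name and parameters are mine, and it assumes 3 GL_FLOAT positions per vertex, so adapt it to your own vertex format:

#include <GL/gl.h>
#include <stddef.h>

/* Minimal sketch of the rebasing trick with the classic vertex-array API. */
/* The function name and parameters are assumptions; the array is taken to */
/* hold 3 GL_FLOAT positions per vertex, spaced 'vertexSize' bytes apart.  */
void drawArraysRebased(const GLbyte *vertices, GLsizei vertexSize,
                       GLint first, GLsizei count, GLint maxIndex)
{
    if (first + count > maxIndex)
    {
        /* Re-point the vertex array at the sub-range we want to draw... */
        glVertexPointer(3, GL_FLOAT, vertexSize,
                        vertices + (size_t)first * (size_t)vertexSize);
        /* ...so the draw call can start at index 0 of the rebased pointer. */
        first = 0;
    }
    glDrawArrays(GL_TRIANGLES, first, count);
}

Only the pointer moves; the vertex data itself stays where it is, so you don't have to split the buffer.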

But in general you should try to avoid this "trick" where you can, as re-specifying the pointer costs a bit of performance… not that much, but still, try to stay within the 65k in any case.

Driver: No, it's not because of the driver; it's because even the GeForce 3 cannot use vertex indices higher than 65k, and that limit affects start and count as well… I don't know whether the GF4 already supports indices > 65k, but since the GF4 is little more than a slightly pushed GF3, I don't think so. The Radeon 8500 does, as far as I know.
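
If you'd rather not hard-code 65k, one option (my suggestion, not something in the card docs) is to query the advisory limits GL 1.2 exposes for glDrawRangeElements; they give a rough idea of the vertex range the hardware handles natively. Note that the stock Windows gl.h is only GL 1.1, so you may need glext.h for these enums:

#include <GL/gl.h>
#include <stdio.h>

/* Query the driver's advisory limits (GL 1.2+). They are hints for    */
/* glDrawRangeElements, but give a rough idea of the vertex range the  */
/* hardware deals with natively.                                       */
void printVertexLimits(void)
{
    GLint maxVertices = 0, maxIndices = 0;
    glGetIntegerv(GL_MAX_ELEMENTS_VERTICES, &maxVertices);
    glGetIntegerv(GL_MAX_ELEMENTS_INDICES,  &maxIndices);
    printf("GL_MAX_ELEMENTS_VERTICES: %d\n", (int)maxVertices);
    printf("GL_MAX_ELEMENTS_INDICES:  %d\n", (int)maxIndices);
}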
Welcome back to the old 16-bit times, lol.

BlackJack
