vertex array is not rendering - why?

I have a problem: my vertex array does not render, even though I enabled glEnableClientState(GL_VERTEX_ARRAY) and the arrays themselves seem to be OK. Not even a simple glDrawElements(GL_POINTS, …) works.
No crash, no errors, nothing at all; I just can't see anything. The display lists I'm using render fine…

Is there anything I'm missing here?

And why does NVOGLNT.DLL always throw an exception when I use GL_UNSIGNED_INT instead of GL_UNSIGNED_BYTE in glDrawElements?

Any help is appreciated!


You don't give much information about your problem, other than that nothing gets rendered. Please give more details, some code for example.

And if you define an index array like this:

char indexArray[] = {1, 2, 3, 4, 5, …};

Then it will work when using GL_UNSIGNED_BYTE, as you say: vertex 1, followed by vertex 2, 3, and so on, will be drawn. But when you tell OpenGL to use GL_UNSIGNED_INT, it will interpret the array as an array of 32-bit integers instead. Now the first index is not 1 but something like 1*2^24 + 2*2^16 + 3*2^8 + 4 (the exact value depends on your machine's byte order), which points to roughly the 16.9 millionth vertex in your array, and that is most certainly outside your array.
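In other words, the type argument you pass to glDrawElements has to match the element type your index array was declared with. A minimal sketch of both correct combinations (the index values here are just examples):

/* Indices stored as bytes: tell OpenGL they are bytes. */
GLubyte byteIndices[] = {0, 1, 2, 3, 4, 5};
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, byteIndices);

/* Indices stored as 32-bit unsigned ints: say so. */
GLuint intIndices[] = {0, 1, 2, 3, 4, 5};
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, intIndices);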

OK, here's the more detailed information:
I'm using a heightfield algorithm that generates pairs of vertices, so my vertex array contains data like this (I zero the heights (y) here):
0,0,0, 1,0,0, 0,0,1, 1,0,1, 0,0,2, 1,0,2 and so on.
The index array contains data like:
0,1,2, 1,2,3, 2,3,4, 3,4,5, 4,5,6 and so on.

(I've taken that routine from a heightfield demo that uses triangle strips in a display list; that worked fine but used up lots of memory.) Now, even if I use GL_POINTS as the rendering mode, in my understanding there should at least be SOMETHING visible, shouldn't there?
Bear with me, I'm not very experienced with OpenGL at all; this is an exercise project I'm doing.
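In case it helps, here is a sketch of how arrays with exactly that layout could be built. The grid is assumed to be 2 vertices wide and some number of rows deep; the function and variable names are made up for the example:

#include <GL/gl.h>
#include <stdlib.h>

static float *vertex_array;
static GLuint *index_array;
static int numTris;

static void buildHeightfield(int rows)
{
    int numVerts = rows * 2;            /* two vertices per row         */
    numTris = numVerts - 2;             /* strip-style triangle count   */
    vertex_array = malloc(numVerts * 3 * sizeof(float));
    index_array = malloc(numTris * 3 * sizeof(GLuint));

    for (int z = 0; z < rows; z++) {
        for (int x = 0; x < 2; x++) {   /* x alternates 0, 1            */
            int v = (z * 2 + x) * 3;
            vertex_array[v + 0] = (float)x;
            vertex_array[v + 1] = 0.0f;     /* heights zeroed           */
            vertex_array[v + 2] = (float)z; /* row number               */
        }
    }
    for (int t = 0; t < numTris; t++) { /* 0,1,2  1,2,3  2,3,4 ...      */
        index_array[t * 3 + 0] = t;
        index_array[t * 3 + 1] = t + 1;
        index_array[t * 3 + 2] = t + 2;
    }
}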

I think Bob was getting at your data types. Which data type do you use to store the index array (char/unsigned char? int/unsigned int?)? You need to make sure the data type you declare the index array with to OpenGL matches the actual size of the indices in memory.

My data types are:

float *vertex_array;
int *index_array;
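
(Note: with an int index array like that, the type argument in the draw call has to be GL_UNSIGNED_INT, and since indices should never be negative, unsigned int/GLuint is the safer declaration. A hypothetical call, with numIndices standing in for your real index count:

glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, index_array);
)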

Doesn't anyone know what the problem is?
Or is there some detailed description of the “workflow” of using vertex arrays anywhere?

There can be several reasons why nothing shows up. For example, do you set up your view frustum properly? Does a simple cube placed at {0,0,0} show up (tip: use glutWireCube to make sure it actually draws a proper cube)? Is it a black-color-on-a-black-background effect?

You can try reading the chapter about vertex arrays in the Red Book. You can find it here: http://ask.ii.uib.no/ebt-bin/nph-dweb/dynaweb/SGI_Developer/OpenGL_PG/

Chapter 2 by the way…
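
The basic workflow boils down to something like this (a minimal sketch using the array names from your posts; numIndices stands in for your real index count):

/* 1. Enable exactly the arrays you will supply pointers for. */
glEnableClientState(GL_VERTEX_ARRAY);

/* 2. Describe the data: 3 floats per vertex, tightly packed (stride 0). */
glVertexPointer(3, GL_FLOAT, 0, vertex_array);

/* 3. Draw. The type must match the index array's element type:
      GL_UNSIGNED_INT for an unsigned int/GLuint array. */
glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, index_array);

/* 4. Disable again so later draws don't read stale pointers. */
glDisableClientState(GL_VERTEX_ARRAY);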


I get exceptions when I do this:

glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);

glVertexPointer(…);
glTexCoordPointer(…);
glDrawElements(…);

Bang! An exception in NVOGLNT.dll, because the color array was enabled but never had a valid pointer set.
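
So the rule is: only leave client states enabled that actually have a valid pointer bound before the draw call. A sketch of the safe version (texcoord_array and numIndices are placeholders for your own variables):

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertex_array);
glTexCoordPointer(2, GL_FLOAT, 0, texcoord_array);

/* No glColorPointer was ever set, so the color array
   must stay disabled. */
glDisableClientState(GL_COLOR_ARRAY);

glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, index_array);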