View Full Version : glFinish and animation

04-05-2004, 07:43 PM
A simple question... let's say I've got a sequence of events like the following:

int main(void)
{
    while (read_data_from_file())
        draw_data();
}

Is there any danger of the data being overwritten by the read function on the client side while the draw function is still executing on the server side, if the draw does not call glFinish() when it is done?

I'm concerned about drawing calls that take pointers, like vertex pointers, glVertex3fv(), or glDrawPixels(), where the data behind those same pointers is re-read from file.


04-05-2004, 09:42 PM
Hi !

I am not sure what you are trying to do here. You cannot read pointers from a file; a pointer only exists in RAM. You can read and write what the pointer points to, but you cannot save the pointer itself (at least not for anything useful).


04-05-2004, 10:58 PM
I assume you are trying to capture the output of the drawing you have done before you close the window, is that correct? In that case, you should call glFinish() prior to closing the window.

04-05-2004, 11:05 PM
I'm not sure what you're asking about, but if you supply a pointer to OpenGL when rendering, such as with glVertexPointer and friends, you're free to alter the data as soon as your rendering call (e.g. glDrawElements) has returned. In general, OpenGL behaves synchronously with respect to client memory (with some exceptions for certain extensions).

04-06-2004, 03:33 AM
sorry folks, I wanted to keep it simple, but unfortunately this means you need to read between the lines :)

What I meant was that, in an animation loop, the contents pointed to by the same pointer are re-loaded from file.

The reason I ask about this is that the man pages for glFinish(), glFlush(), and glXWaitGL() suggest that OpenGL commands are buffered, and that it is possible for your render routines to return before the actual scene has finished rendering. Unfortunately the glFlush() docs use ambiguous terminology like "finite" time, which is a bit confusing.

It's not a question of capturing the image to file... I'm pretty sure that glFlush() would be sufficient before a glReadPixels(). Or, question #2: is glFlush() actually required at all before reading pixels?

Back to the original question. I don't want to make any assumptions about the hardware this code is running on. Client and server could be running on different machines, and the scene could be slow to execute (e.g. millions of polygons).

It comes down to this: If I call glDrawPixels() or anything using a pointer, is it safe to change the contents pointed to by the passed pointer "immediately" after the function returns?

I'm guessing the answer is yes, and that anything buffered is actual pixel fragments or somesuch, rather than the data passed to the GL functions.


04-06-2004, 06:31 AM
Unless something in the spec explicitly says "don't modify this buffer", you can assume that OpenGL has copied it. Just be sure in which call the buffers are copied :)

04-06-2004, 01:17 PM
Thanks harsman and hh10k.

Just to make sure... the following functions would never stomp on data that is currently being accessed by buffered GL calls?

void blit_test(void)
{
    GLubyte *pix;
    GLsizei w, h, len;

    pix = read_image("img.bmp", &w, &h);
    len = w*h*3*sizeof(GLubyte);
    glDrawPixels(w, h, GL_RGB, GL_UNSIGNED_BYTE, pix);
    memset(pix, 0, len); /* ? */
}

void vert_test(void)
{
    GLfloat *verts;
    GLsizei n, len, i;

    verts = read_verts("loc.xyz", &n);
    len = n*3*sizeof(GLfloat);
    glBegin(GL_POINTS);
    for (i = 0; i < n; i++)
        glVertex3fv(&verts[i*3]);
    glEnd();
    memset(verts, 0, len); /* ? */
}

04-07-2004, 08:48 AM
dzim77: That should be just fine.