Graphics card crashes when using GLUT

Hi,

I have a strange phenomenon: I wrote a very simple OpenGL program using GLUT.
It compiles, and everything is fine.
Then I start it, and all is still fine.
But after I start the program a second time, the display freezes and the only key that still works is the reset button…

I hope you have some suggestions…

greetings
Peter

Could you give us some code to analyse?

Have you installed Mesa yourself, or was it part of your distribution (if you have XFree 4.X, you have GLUT, I think)?

OK, here is the code:
As you can see, it could hardly be any simpler…
I don't use Mesa, so I think it is the other way around.

#include <GL/glut.h>

#include <stdio.h>
#include <iostream>

using namespace std;

float angle = 0.f;

void initGL()
{
    glShadeModel(GL_SMOOTH);
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);

    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
}

void keyboard(unsigned char key, int x, int y)
{
    switch (key) {
    case 27: // ESC: clear everything and quit
        glClear(GL_ACCUM_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT | GL_COLOR_BUFFER_BIT);
        exit(0);
        break;
    default:
        break;
    }
}

void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

    glTranslatef(0.f, 0.f, -10.f);
    glRotatef(angle, 0.f, 0.f, 1.0f);
    angle = 0.2f;

    glBegin(GL_QUADS);
    // front face
    glVertex3f(0.f, 0.f, 0.f);
    glVertex3f(1.f, 0.f, 0.f);
    glVertex3f(1.f, 1.f, 0.f);
    glVertex3f(0.f, 1.f, 0.f);
    // side face
    glVertex3f(0.f, 0.f, 0.f);
    glVertex3f(0.f, 1.f, 0.f);
    glVertex3f(0.f, 1.f, 1.f);
    glVertex3f(0.f, 0.f, 1.f);
    glEnd();

    glFlush(); // glutSwapBuffers();
}

void reshape(int w, int h)
{
    glViewport(0, 0, (GLsizei) w, (GLsizei) h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    if (h <= 0)
        h = 1;

    gluPerspective(45.0, (GLfloat) w / (GLfloat) h, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(800, 800);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_DEPTH | GLUT_RGB);
    glutCreateWindow("Double Buffer test");

    initGL();

    glutIdleFunc(display);
    glutReshapeFunc(reshape);
    glutKeyboardFunc(keyboard);

    glutDisplayFunc(display);
    glutMainLoop();

    return 0;
}

greetings
Peter

OK, let's try it this way. First, replace glFlush() with glutSwapBuffers(), or call glFlush() first and then glutSwapBuffers().

Also include gl.h and glu.h in your code.
You don't need the long call "glClear(GL_ACCUM_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT | GL_COLOR_BUFFER_BIT)", because you are not using the accumulation and stencil buffers.

For now you won't need the "glutIdleFunc(display);" call either, so you can remove it freely.

PS:
If it still doesn't work, try linking against the Mesa libs when you compile.
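
Here is a minimal sketch of those three changes, assuming the rest of the program stays exactly as Peter posted it (note that glutSwapBuffers() only does something useful with a double-buffered window; the next reply picks this up):

#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>
#include <cstdlib>

void display()
{
    // color and depth are the only buffers this program draws into
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // ... draw the two quads as before ...

    glFlush();
    glutSwapBuffers(); // instead of glFlush() alone
}

void keyboard(unsigned char key, int x, int y)
{
    if (key == 27)
        exit(0); // no need to clear accumulation/stencil buffers on exit
}

// and in main(): drop the glutIdleFunc(display); line for now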

Your display mode may be wrong here:

glutInitDisplayMode(GLUT_SINGLE | GLUT_DEPTH | GLUT_RGB );

Don't try to use glutSwapBuffers() if you only have one buffer.

Try this:

glutInitDisplayMode(GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB );

If it doesn't work… well, try Mesa, but I don't think it will resolve anything.
If it still doesn't work, well… try SDL; it works a lot better and gives you many more possibilities.
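
Putting the two replies together, a complete minimal sketch could look like this (just a single static quad, which should be enough to reproduce the freeze or confirm that it is gone):

#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>

void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glTranslatef(0.f, 0.f, -10.f);

    glBegin(GL_QUADS);
    glVertex3f(0.f, 0.f, 0.f);
    glVertex3f(1.f, 0.f, 0.f);
    glVertex3f(1.f, 1.f, 0.f);
    glVertex3f(0.f, 1.f, 0.f);
    glEnd();

    glutSwapBuffers(); // matches GLUT_DOUBLE below
}

void reshape(int w, int h)
{
    if (h <= 0)
        h = 1;
    glViewport(0, 0, (GLsizei) w, (GLsizei) h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, (GLfloat) w / (GLfloat) h, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(800, 800);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB); // double buffered
    glutCreateWindow("Double Buffer test");
    glEnable(GL_DEPTH_TEST);
    glutDisplayFunc(display);
    glutReshapeFunc(reshape);
    glutMainLoop();
    return 0;
}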

Well, without the glutIdleFunc it runs perfectly, but why? Can I only see stills with my OpenGL? :-)
I know about the SINGLE/DOUBLE flag in the init display mode; I just tested with both variants.

greetings
Peter

Hi,

“I know about the SINGLE/DOUBLE flag in the init display mode; I just tested with both variants.”

Well, these are two different states: with GLUT_DOUBLE you initialise the depth buffer, and with GLUT_SINGLE you don't. I think that says a lot.

In your code, you displayed the picture from memory twice: once in glutIdleFunc() and a second time in glutDisplayFunc(); I think that is what crashed the window.

glutIdleFunc() checks whether there is any signal from the keyboard, the mouse, or even from moving the window (about that last one I'm not sure); I think you should look in the GLUT documentation.

Anyway, without glutIdleFunc(), the GLUT window works fine.
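
As an aside on the "can I only see stills?" question: the usual GLUT animation idiom is not to register display() itself as the idle callback, but a small idle function that advances the scene state and then asks GLUT for a redraw. A sketch, reusing the angle variable from Peter's program:

void idle()
{
    angle += 0.2f;       // advance the rotation a little each frame
    glutPostRedisplay(); // schedule exactly one call to display()
}

// in main(): glutIdleFunc(idle); instead of glutIdleFunc(display);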

I know of no other differences between GLUT_SINGLE and GLUT_DOUBLE than the difference between single and double buffer mode. The reason the program does not respond very well in single buffer mode is that many frames are queued up waiting to be drawn. If you replace glFlush with glFinish, it should respond much more quickly. NVidia's driver stores no more than one frame in double buffer mode. See this thread: http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/004816.html
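
In other words, for the single-buffered variant the end of display() would become something like this:

void display()
{
    // ... draw as before ...

    glFinish(); // blocks until the frame has actually been drawn,
                // so the driver cannot queue up many frames ahead
}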

If you do not use Mesa, I would guess that you have an NVidia card. The most likely explanation is that the AGP support is not working. Read the documentation in /usr/share/doc/NVIDIA_GLX-1.0/ that came with the driver. The easy way out is just to disable AGP, but that will cost some performance.
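
If I remember the driver README correctly, disabling AGP is a one-line option in the Device section of XF86Config; treat this snippet as a sketch and check the README that ships with your driver version:

Section "Device"
    Identifier "NVIDIA Card"
    Driver     "nvidia"
    Option     "NvAGP" "0"    # 0 = disable AGP support entirely
EndSection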
