[Memory Leak] OpenGL + GLUT

Hello!
I’ve been trying to make a simple “snake” game, but I ran into a problem that I can’t solve.
When I run the program, it starts consuming memory and simply doesn’t stop.
After a few hours of checking the code for bugs, it occurred to me to try one of the examples from “OpenGL SuperBible” (4th edition), and the example has the same problem!

I would like to know whether it’s a problem in my code or a known issue in general.


#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <windows.h>      // Must have for Windows platform builds
#include <gl\gl.h>         // Microsoft OpenGL headers (version 1.1 by themselves)
#include <gl\glu.h>         // OpenGL Utilities
#include <gl\glut.h>

#include <math.h>


struct Position
{
    GLfloat x;
    GLfloat y;
    GLbyte wait;
};

GLbyte xMove = 0;
GLbyte yMove = 0;
GLshort wSize = 1;

struct Position Worm[20];
struct Position Food;

void CreateFood()
{
    srand ( rand() );
    bool accept = true;
    while ( accept )
    {
        bool found = false;
        Food.x = ( rand() % 10 )*5;
        Food.y = ( rand() % 10 )*5;
        for ( int i = 0; i < 20 && Worm[i].wait != -1; i++ )
        {
            if ( Worm[i].x == Food.x || Worm[i].y == Food.y )
            {
                found = true;
                break;
            }
        }
        if ( !found )
        {
            accept = false;
        }
    }

}

// Called to draw scene
void RenderScene(void)
{
   // Clear the window with current clearing color
   glClear(GL_COLOR_BUFFER_BIT);

    glColor3f(1.0f, 0.0f, 0.0f);
    glBegin(GL_QUADS);
        glVertex3f(Food.x, Food.y, 0);
        glVertex3f(Food.x, Food.y+5, 0);
        glVertex3f(Food.x+5, Food.y+5, 0);
        glVertex3f(Food.x+5, Food.y, 0);
    glEnd();

    glColor3f(1.0f, 1.0f, 0.0f);
    glBegin(GL_QUADS);
        glVertex3f(Worm[0].x, Worm[0].y, 0);
        glVertex3f(Worm[0].x, Worm[0].y+5, 0);
        glVertex3f(Worm[0].x+5, Worm[0].y+5, 0);
        glVertex3f(Worm[0].x+5, Worm[0].y, 0);
    glEnd();

    glColor3f(0.0f, 1.0f, 0.0f);
    for ( int i = 1; i < 20 && Worm[i].wait == 0; i++ )
    {
        // Draw the segment
        glBegin(GL_QUADS);
            glVertex3f(Worm[i].x, Worm[i].y, 0);
            glVertex3f(Worm[i].x, Worm[i].y+5, 0);
            glVertex3f(Worm[i].x+5, Worm[i].y+5, 0);
            glVertex3f(Worm[i].x+5, Worm[i].y, 0);
        glEnd();
    }

   // Flush drawing commands
   glutSwapBuffers();
}

// This function does any needed initialization on the rendering
// context.
void SetupRC()
{
   // Black background
   glClearColor(0.0f, 0.0f, 0.0f, 1.0f );
   glPointSize(20);
}

void SpecialKeys(int key, int x, int y)
{
    if(key == GLUT_KEY_UP)
    {
        xMove = 0;
        yMove = 5;
    }

    if(key == GLUT_KEY_DOWN)
    {
        xMove = 0;
        yMove = -5;
    }

    if(key == GLUT_KEY_LEFT)
    {
        xMove = -5;
        yMove = 0;
    }

    if(key == GLUT_KEY_RIGHT)
    {
        xMove = 5;
        yMove = 0;
    }
}

void ChangeSize(int w, int h)
{
   GLfloat nRange = 50.0f;

   // Prevent a divide by zero
   if(h == 0)
      h = 1;

   // Set Viewport to window dimensions
    glViewport(0, 0, w, h);

   // Reset projection matrix stack
   glMatrixMode(GL_PROJECTION);
   glLoadIdentity();

   // Establish clipping volume (left, right, bottom, top, near, far)
    if (w <= h)
      glOrtho (-nRange, nRange, -nRange*h/w, nRange*h/w, -nRange, nRange);
    else
      glOrtho (-nRange*w/h, nRange*w/h, -nRange, nRange, -nRange, nRange);

   // Reset Model view matrix stack
   glMatrixMode(GL_MODELVIEW);
   glLoadIdentity();
}

void Start()
{
    xMove = 0;
    yMove = 0;
    wSize = 1;

    for ( int i = 1; i < 20; i++ )
    {
        Worm[i].wait = -1;
    }

    Worm[0].wait = 0;
    Worm[0].x = 0;
    Worm[0].y = 0;

    CreateFood();
}

void MoveWorm ( int value )
{

    for ( int i = 19; i != 0; i-- )
    {
        if ( Worm[i].wait == 0 )
        {
            Worm[i].x = Worm[i-1].x;
            Worm[i].y = Worm[i-1].y;
        }
    }

    Worm[0].x += xMove;
    Worm[0].y += yMove;


    for ( int i = 0; i < 20; i++ )
    {
        if ( Worm[i].wait > 0 )
        {
            Worm[i].wait--;
        }
    }

    if ( Worm[0].x >= 50 || Worm[0].x <= -55 || Worm[0].y >= 50 || Worm[0].y <= -55 )
    {
        Start();
    }

    if ( Worm[0].x == Food.x && Worm[0].y == Food.y )
    {
        printf ( "Worm grow %d
", wSize );
        if ( wSize < 20 )
        {
            Worm[wSize].wait = wSize;
            Worm[wSize].x = Food.x;
            Worm[wSize].y = Food.y;
            wSize++;

            CreateFood();
        }
        else
        {
            Start();
        }
    }

    glutTimerFunc(500,MoveWorm, 1);

    RenderScene();
}

int main(int argc, char* argv[])
{
    Start();
   glutInit(&argc, argv);
   glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
   glutCreateWindow("Snake v0.1");
   glutReshapeFunc(ChangeSize);
   glutSpecialFunc(SpecialKeys);
   glutDisplayFunc(RenderScene);
   SetupRC();
    glutTimerFunc(500,MoveWorm, 1);
   glutMainLoop();

   return 0;
}

PS: Don’t worry about the unoptimized code (things like adding a bitmap or using pointer-based lists can wait); it’s my first attempt at making a game, and I just want it to work.

PS2: I’m using Windows Vista x64

PS3: Sorry about my bad English

Thanks!

I compiled your code exactly as above[*1] on my Linux box and it ran fine without any memory issues. Looks like a good start to a new game :) The memory did not grow with time; I watched it for about 20 minutes.

That may give a clue… I don’t know what you are seeing, but it is not a “regular” problem. Are there any other symptoms?

However, you do seem to have a logic problem somewhere in CreateFood(). After you catch the food 12 or 13 times, the worm grows to size 13, the CPU goes to 100%, and the game appears to freeze. With the debugger I can see that you get into an unwanted infinite loop, as follows:


void CreateFood()
{
...
    while ( accept )
    {
 .... infinite loop here, accept never set false
    }

}

This is not an OpenGL/GLUT bug but rather a logic problem. You may want to look more closely at how you get out of the while loop.
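In case it helps, here is a minimal sketch of the check I suspect you intended, assuming the goal is to reject a candidate food cell only when a worm segment occupies exactly that cell. Your current test uses ||, so every segment vetoes its entire row and its entire column; on a 10x10 food grid, a worm of size 13 can end up vetoing every cell, so accept is never set to false:


        // Sketch of the suspected fix: require BOTH coordinates to match,
        // so a candidate is rejected only when a segment sits on that exact cell.
        if ( Worm[i].x == Food.x && Worm[i].y == Food.y )
        {
            found = true;
            break;
        }

As an aside, srand ( rand() ) reseeds the generator on every call; the usual pattern is to seed once at the start of main(), e.g. with srand ( time(NULL) ), and never again.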

[*1] except that I commented out the only OS-specific header and replaced the backslash escape characters “\” with ANSI-standard path separators “/”, i.e.


//#include <windows.h>      // Must have for Windows platform builds
#include <gl/gl.h>         // Microsoft OpenGL headers (version 1.1 by themselves)
#include <gl/glu.h>         // OpenGL Utilities
#include <gl/glut.h>

These minor changes would definitely not cause your memory explosion, but I made them to keep the code ANSI-compatible and to remove OS dependencies that are unnecessary on Linux. You can use the “/” path separator on Windows too, which helps others, i.e. keeps the code cross-platform compatible.
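If you want a single source file that builds unchanged on both platforms, one common pattern (just a sketch, relying on the standard _WIN32 macro that Windows compilers predefine) is:


// Pull in windows.h only on Windows builds; the forward-slash
// include paths work with every compiler.
#ifdef _WIN32
#include <windows.h>
#endif
#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>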

What video card are you using?
Also, how do you know how much memory the app is consuming? Task Manager can be deceptive sometimes.

@sqrt
GeForce 9800 GTX
I’m using the Windows Task Manager… is there a better one?
About the consumption: according to Task Manager it grows by something around 4 bytes (a short?) on each glutTimerFunc call.

@marshats
thanks dude!
about that loop: I’m still trying to find out why it turns infinite (that happens even with my new code…)
and about the OS dependencies: I’m trying to make it work where I can see it first :P

Rather than Task Manager, you may want to give MS Process Explorer a try.

If you have a debugger, that may get you to the details faster. It is worth getting to know your debugging tools for just such an occasion :)

I would check using Process Explorer and look at “private bytes”; if that is increasing, then there is probably a problem.

Not that I think you can do much about it, as it seems to be coming from a driver (try updating?) or GLUT.

As marshats said, you can use your debugger to find memory leaks.

On Windows, you can typically do this with a simple call to:
_CrtSetDbgFlag ( _CRTDBG_LEAK_CHECK_DF | _CRTDBG_ALLOC_MEM_DF );

as the first line of main(). (Check the MSDN docs for more info.)
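A minimal sketch of where that call goes, assuming MSVC (the call needs <crtdbg.h>, and the flags only do anything in a Debug build):


#include <crtdbg.h>   // MSVC CRT debug heap (Debug builds only)

int main(int argc, char* argv[])
{
    // Report any CRT heap allocations still live at program exit
    // to the debugger output window
    _CrtSetDbgFlag ( _CRTDBG_LEAK_CHECK_DF | _CRTDBG_ALLOC_MEM_DF );

    // ... rest of the program ...
    return 0;
}

Note that this only reports leaks from the CRT heap in your own code; it won’t see allocations made inside the driver or GLUT.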

It also seems the Nvidia drivers themselves may be allocating memory:
http://sourceforge.net/projects/glfw/forums/forum/247562/topic/2020581
http://www.gamedev.net/community/forums/topic.asp?topic_id=465798

I’ll chime in and say that I had a similar experience.
Tweaking the SuperBible examples, memory was steady on Linux…

But each callback seems to draw a few KB on Windows…
The growth stops after a while (20–30 callbacks), though, and then memory is as steady as on Linux…

Windows (or at least the Task Manager) is behaving strangely anyway… when I minimize the window, memory usage drops (from 40 MB to 4 MB or something) and never really grows back when I use the app again…

Windows also has the Performance Monitor, which gives access to much more detailed info than Task Manager; it could be good to check with that.

@sqrt
Thanks dude!
After updating the Nvidia driver / GLUT, it started working OK.

@marshats
Thanks too!

@Topic
Bug solved, thanks!