Memory Leak - can't find it



SinisterMJ
02-16-2011, 08:11 AM
Somewhere in the following code I have a leak:

void UpdateGLBuffers()
{
if( g_GLdone ) {
return;
}
g_GLdone = true;
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, 1, g_Image[1-g_iCounter].GetCols(), g_Image[1-g_iCounter].GetRows(), 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, g_Image[1-g_iCounter].GetData());
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

int DrawGLScene(GLvoid)
{
QueryPerformanceCounter( &thisFrame );
int sleeptime = (1000 / 30) - (1000*(thisFrame.QuadPart - lastFrame.QuadPart) / tps.QuadPart); //exchange 30 with new FPS
if( sleeptime > 0)
Sleep( sleeptime );
UpdateGLBuffers();
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear The Screen And The Depth Buffer
glLoadIdentity(); // Reset The View
glTranslatef(0.0f,0.0f,-5.0f);

glBindTexture(GL_TEXTURE_2D, texture);

glColor4f(1.0, 1.0, 1.0, 1.0);

glBegin(GL_QUADS);
// Front Face
glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f, -1.0f, 0.5f);
glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f, -1.0f, 0.5f);
glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, 1.0f, 0.5f);
glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, 1.0f, 0.5f);
glEnd();

QueryPerformanceCounter( &lastFrame );
return TRUE; // Keep Going
}


and I just can't pinpoint it. Even the glTexParameteri() calls already seem to leak. Are there any known issues, or what is happening here?

Dan Bartlett
02-16-2011, 08:46 AM
What do function calls such as g_Image[1-g_iCounter].GetData() do? Do they just return a pointer to existing data, or do they do anything extra?
You could try it with a constant width/height and null data and see if there's still a problem (see the sketch below).
Another guess: the array index [1-g_iCounter] looks a bit suspicious too. Are you sure it's not meant to be [g_iCounter-1]?
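
For example, a minimal sketch of that isolation test (the 640x480 size is just a placeholder; texture and g_GLdone are taken from your snippet). If memory still grows with this, the camera data is not the cause:

void UpdateGLBuffers()
{
    if( g_GLdone ) {
        return;
    }
    g_GLdone = true;
    glBindTexture(GL_TEXTURE_2D, texture);
    // constant size, no client data
    glTexImage2D(GL_TEXTURE_2D, 0, 1, 640, 480, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}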

SinisterMJ
02-16-2011, 08:52 AM
Er, sorry, I forgot to mention: that is the camera's API, which returns a pointer to the image data in memory. I have already tried disabling the image capture so that it's always the same pointer / data in memory.
The functions are:
Image:
virtual unsigned int GetRows() const;
virtual unsigned int GetCols() const;
virtual unsigned char* GetData();

which so far has proven not to leak in command-line applications.

Edit: the memory even leaks if I comment out the glTexImage2D call, but if I comment out DrawGLScene there is no leak. Image acquisition itself doesn't seem to be causing this.
Edit2: the 1-g_iCounter is okay; there are two images, and this is some sort of image-buffer hack. I've tested with constant rows/cols and a NULL pointer, and it's still leaking.

SinisterMJ
02-16-2011, 09:04 AM
Okay, found the error.

The memory usage rises constantly and gains about 1 MB of RAM within 5 minutes, at which point it just stops growing and stays constant.

I have absolutely no idea where this is coming from, but apparently it is within the nVidia driver stack or their OpenGL implementation.

Sorry for the confusion here; I had never let it run for 5 minutes, since it was just continuously acquiring more RAM :(

mobeen
02-16-2011, 09:39 AM
1) Could you try commenting out the UpdateGLBuffers call and see if you still get this? You are creating a texture every frame; you could try to use an FBO instead (see the sketch after the code below for one alternative).
2) The third parameter is the internal format. Shouldn't it be GL_INTENSITY?


glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY, g_Image[1-g_iCounter].GetCols(), g_Image[1-g_iCounter].GetRows(), 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, g_Image[1-g_iCounter].GetData());
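
For point 1, a minimal sketch of one alternative (using glTexSubImage2D rather than an FBO, and assuming the image size and pixel format stay constant between frames; cols and rows stand for the image dimensions): allocate the texture once at startup, then only re-upload the pixels each frame instead of re-creating the texture.

// one-time setup: allocate the storage once
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY, cols, rows, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// per frame: update the existing storage
glBindTexture(GL_TEXTURE_2D, texture);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, cols, rows, GL_LUMINANCE, GL_UNSIGNED_BYTE, g_Image[1-g_iCounter].GetData());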

Dan Bartlett
02-16-2011, 10:47 AM
"1" is an allowed value for internal format in the compatibility profile http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml , but has been removed from core


E.2.2 Removed Features
...
Legacy OpenGL 1.0 pixel formats - the values 1, 2, 3, and 4 are no longer
accepted as internal formats by TexImage* or any other command taking
an internal format argument.

but also:


Legacy pixel formats - all ALPHA, LUMINANCE, LUMINANCE_ALPHA, and
INTENSITY external and internal formats, including compressed, floating-point,
and integer variants; all references to luminance and intensity formats
elsewhere in the specification, including conversion to and from those formats;
and all associated state, including state describing the allocation or
format of luminance and intensity texture or framebuffer components.

So if you want to use core, you should really use GL_RED.
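
For example, a sketch for a core profile (assuming an 8-bit single-channel image; cols, rows and data are placeholders, and the swizzle needs GL 3.3 or ARB_texture_swizzle):

glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, cols, rows, 0, GL_RED, GL_UNSIGNED_BYTE, data);
// optional: replicate red into green/blue so sampling behaves like the old luminance formats
GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);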

SinisterMJ
02-17-2011, 04:25 AM
Thanks for the input. After an hour of debugging I noticed that the leaking actually stops after a few minutes, so this is something else.

Nonetheless I have one additional question:

The cameras here have 8-bit, 12-bit or 16-bit monochrome output, and I want to make the application work with all three pixel formats. I was able to get 8-bit and 12-bit running properly via
8-bit:
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY, cols, rows, 0, GL_RED, GL_UNSIGNED_BYTE, data);

12-bit:
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY12, cols, rows, 0, GL_LUMINANCE12, GL_UNSIGNED_BYTE, data);

but with 16-bit I am failing:
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY16, cols, rows, 0, GL_LUMINANCE16, GL_UNSIGNED_SHORT, data);

I get a white image, or rather no image at all. The camera works in 16-bit mode; the manufacturer's software displays an image in 16-bit mode.

Any idea what I am doing wrong? This is with Win7 x64 and a GeForce 9600 GT.

mobeen
02-17-2011, 04:36 AM
This



12-bit:
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY12, cols, rows, 0, GL_LUMINANCE12, GL_UNSIGNED_BYTE, data);

should use GL_UNSIGNED_SHORT.


12-bit:
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY12, cols, rows, 0, GL_LUMINANCE12, GL_UNSIGNED_SHORT, data);



I get a white image, or rather no image at all.
Have you called glEnable(GL_TEXTURE_2D) before you call glBindTexture and glTexImage2D? I don't see this call in the code snippet you posted earlier.
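
Something along these lines in the initialization, before the upload (a minimal sketch for the fixed-function pipeline):

glEnable(GL_TEXTURE_2D);               // required for fixed-function texturing to show up
glBindTexture(GL_TEXTURE_2D, texture); // then bind and upload as before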

SinisterMJ
02-17-2011, 05:12 AM
Okay, I replaced it with GL_UNSIGNED_SHORT and it works fine as well. I'm not sure why it should be short or byte, since it's a MONO12 image, which means 1.5 bytes per pixel (neither short nor byte).

The complete code actually looks like this:

void UpdateGLBuffers()
{
if( g_GLdone ) {
return;
}
g_GLdone = true;
glBindTexture(GL_TEXTURE_2D, texture);
if(g_Color) {
g_Image[1-g_iCounter].Convert(PIXEL_FORMAT_RGB, &g_tempImage);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, g_Image[1-g_iCounter].GetCols(), g_Image[1-g_iCounter].GetRows(), 0, GL_RGB, GL_UNSIGNED_BYTE, g_tempImage.GetData());
}
else
{
switch(g_Image[1-g_iCounter].GetPixelFormat()) {
case PIXEL_FORMAT_MONO8: glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY, g_Image[1-g_iCounter].GetCols(), g_Image[1-g_iCounter].GetRows(), 0, GL_RED, GL_UNSIGNED_BYTE, g_Image[1-g_iCounter].GetData()); break;
case PIXEL_FORMAT_MONO12: glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY12, g_Image[1-g_iCounter].GetCols(), g_Image[1-g_iCounter].GetRows(), 0, GL_LUMINANCE12, GL_UNSIGNED_SHORT, g_Image[1-g_iCounter].GetData()); break;
case PIXEL_FORMAT_MONO16: glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY16, g_Image[1-g_iCounter].GetCols(), g_Image[1-g_iCounter].GetRows(), 0, GL_LUMINANCE16, GL_UNSIGNED_SHORT, g_Image[1-g_iCounter].GetData()); break;
default: break;
}
}
}

And works for every pixel format except MONO16.

Thanks a lot!

mobeen
02-17-2011, 05:20 AM
And works for every pixel format except MONO16.

If that's the case, I would look closely at the data array (g_Image[1-g_iCounter].GetData()) that is passed to glTexImage2D. Are you sure that the data is correct? Maybe it's padded with zeros? You could try adding the following call before the glTexImage2D call:


glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

See if it makes any difference.

SinisterMJ
02-17-2011, 06:45 AM
Nope, that did not help. The data itself is valid. I think there's something wrong with the way nVidia's OpenGL handles 16-bit (at least I read something to that effect when searching).

Since 16-bit image display is not an issue in the first place, I guess this one is resolved.

Thanks for all the input.

V-man
02-18-2011, 06:30 AM
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY16, width, height, 0, GL_LUMINANCE16, GL_UNSIGNED_SHORT, Data());

should generate GL_INVALID_VALUE or GL_INVALID_ENUM because GL_LUMINANCE16 is not in the book. You need

glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY16, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, Data());
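
A quick way to catch that kind of mistake is to check for errors right after the upload (a minimal sketch; 0x0500 is GL_INVALID_ENUM, 0x0501 is GL_INVALID_VALUE):

glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY16, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, Data());
GLenum err = glGetError();
if( err != GL_NO_ERROR )
    printf("glTexImage2D failed: 0x%04X\n", err);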

SinisterMJ
02-23-2011, 09:35 AM
Thanks for that; with GL_LUMINANCE it's working now for all three monochrome modes.
One last question, though: why does it work with MONO12? I specify GL_UNSIGNED_BYTE, so that would be too little, whereas GL_UNSIGNED_SHORT would be too much. As the internal format I set GL_INTENSITY12, so my guess is that this overrides the other value, but according to the documentation the third parameter describes how the texture should be used, not what the data looks like?
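
A minimal sketch (compatibility profile) for checking what the driver actually allocated for the internal format, since the format/type pair only describes the client data being read while the internal format is just a request for the GPU-side storage:

GLint bits = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTENSITY_SIZE, &bits);
printf("intensity bits allocated: %d\n", bits);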

Edit: it turns out an error in my code used the default conversion to BGR, so the display of MONO12 actually never worked.