Memory Leak on ATI card

Hi,

I’m currently working on software that displays some 3D models using OpenGL. I have a very strange leak that only appears when using an ATI card (I have only tested it with the Radeon 9600 series).

The leak is produced when a call to wglMakeCurrent is made.

Since I suspected my own program at first, I tested with the wxWindows OpenGL sample, and it leaks just like my program does.

Does anyone have the same problem?

I’m using the latest video driver from ATI (4.10).

I’m pretty sure I’ve run my OpenGL programs on an ATI Radeon card without any trouble. I have found, however, that ChoosePixelFormat doesn’t work correctly (at least it doesn’t work the way I expect it to), and that some systems will select a pixel format that results in a blue screen. I always enumerate all the available formats, assign a score to each, sort by score, and pick the best. It’s possible that the machine in question is selecting an inappropriate mode for you. I could send you a little code snippet that writes the pixel formats out to a file, if that would help.

Hi,

I’ve just reproduced the bug with the first NeHe sample; all I added was a call to wglMakeCurrent on every rendered frame.
The program leaks on the Radeon 9600 series, 9600 XT, and 9550.
I’ve emailed ATI, and they are currently looking into the problem.
Thanks for your answer :slight_smile: .

For your pixel format, use WGL_ARB_pixel_format instead; it gives you more control than the DescribePixelFormat approach, which is fairly deprecated (you cannot request multisampling with it, for example).
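A minimal sketch of choosing a format through WGL_ARB_pixel_format looks roughly like this (Win32-only, error handling omitted; it assumes windows.h, GL/gl.h, and wglext.h are included, and that wglChoosePixelFormatARB has already been loaded with wglGetProcAddress on a throwaway context, since WGL extension functions can only be fetched once a context exists):

```c
/* Attribute/value pairs describing the format we want;
   the list is zero-terminated. */
const int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_ACCELERATION_ARB,   WGL_FULL_ACCELERATION_ARB,
    WGL_COLOR_BITS_ARB,     32,
    WGL_DEPTH_BITS_ARB,     24,
    WGL_STENCIL_BITS_ARB,   8,
    WGL_SAMPLE_BUFFERS_ARB, 1,   /* multisampling: exactly what  */
    WGL_SAMPLES_ARB,        4,   /* the old PFD path cannot do   */
    0
};

int format;
UINT numFormats;
if (wglChoosePixelFormatARB(hdc, attribs, NULL, 1,
                            &format, &numFormats) && numFormats > 0) {
    /* SetPixelFormat still wants a PFD, so describe the
       chosen format into one first. */
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    SetPixelFormat(hdc, format, &pfd);
}
```

The sample counts above are illustration values; query WGL_ARB_multisample support before requesting them.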

As for wglMakeCurrent: it must be called only once between a GetDC / ReleaseDC pair.
Do that and you will have no problem.
If you call it more than once between a GetDC / ReleaseDC pair, you will run into problems.
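Following that advice, a per-frame render function would look roughly like this (a Win32-only sketch; hWnd and hRC are assumed to exist already, with hRC created earlier by wglCreateContext):

```c
/* One wglMakeCurrent per GetDC/ReleaseDC pair, as described above. */
void render_frame(HWND hWnd, HGLRC hRC)
{
    HDC hdc = GetDC(hWnd);       /* acquire the device context   */
    wglMakeCurrent(hdc, hRC);    /* exactly one call per pair    */

    /* ... issue GL draw calls for the scene here ... */

    SwapBuffers(hdc);            /* present the back buffer      */

    wglMakeCurrent(NULL, NULL);  /* detach before releasing      */
    ReleaseDC(hWnd, hdc);        /* release the DC every frame   */
}
```

Alternatively, many programs just call GetDC once at startup and keep the same DC and context current for the window's whole lifetime, which avoids the per-frame wglMakeCurrent entirely.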

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.