View Full Version : pbuffers problems, again.

06-18-2004, 11:28 AM
It looks like there is a problem in the nvidia Detonator drivers :

when I create more pbuffers than possible (when the card's memory is full, I guess), my app gets stuck for a few seconds while drawing into the last created pbuffer, and then I get an error when I call wglBindTexImageARB on it.
Normally it should fail in wglCreatePbufferARB, which should do something like SetLastError(ERROR_NO_SYSTEM_RESOURCES);
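Since the driver apparently does not fail where expected, one defensive workaround is to distrust the creation call: check the return value and GetLastError, then query the dimensions the driver actually granted before using the pbuffer. This is only a sketch; `CreatePbufferChecked` is a hypothetical helper, and it assumes the WGL_ARB_pbuffer entry points have already been fetched with wglGetProcAddress (as the test app does). Whether it catches this particular driver bug is exactly what is in question.

```c
#include <windows.h>
#include <stdio.h>
#include <GL/gl.h>
#include "wglext.h"  /* HPBUFFERARB, WGL_PBUFFER_*_ARB; entry points via wglGetProcAddress */

/* Hypothetical helper: create a pbuffer and refuse to use it unless the
 * driver both returned a handle and granted the full requested size. */
HPBUFFERARB CreatePbufferChecked(HDC hdc, int pixel_format,
                                 int width, int height, const int *attribs)
{
    SetLastError(0);
    HPBUFFERARB pbuf = wglCreatePbufferARB(hdc, pixel_format,
                                           width, height, attribs);
    if (!pbuf) {
        /* This is where ERROR_NO_SYSTEM_RESOURCES should show up. */
        fprintf(stderr, "wglCreatePbufferARB failed, Win32 error %lu\n",
                (unsigned long)GetLastError());
        return NULL;
    }

    /* The driver may silently grant a smaller buffer; query what we got. */
    int got_w = 0, got_h = 0;
    wglQueryPbufferARB(pbuf, WGL_PBUFFER_WIDTH_ARB, &got_w);
    wglQueryPbufferARB(pbuf, WGL_PBUFFER_HEIGHT_ARB, &got_h);
    if (got_w < width || got_h < height) {
        fprintf(stderr, "pbuffer truncated: asked %dx%d, got %dx%d\n",
                width, height, got_w, got_h);
        wglDestroyPbufferARB(pbuf);
        return NULL;
    }
    return pbuf;
}
```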

Anyway I did a short application to show that, here is the source :
http://thomasbesson.free.fr/rendertotexture_2004-06-18/ or http://thomasbesson.free.fr/rendertotexture_2004-06-18.zip (vc6 src + bin) if you want to try it.

when you run the app, you can press + / - to add/remove pbuffers nested inside each other, or use the arrow keys to grow/shrink the pbuffers.

Also, on my ATI Radeon 9600 card there is a big constant memory leak, something like 500 KB/second. I already mailed ATI about it but they are not very responsive... yeah, I know, I'm not from id Software :p

Can anyone see a mistake in the source, or suggest how I can work around the problem (e.g. how to know when there is no more memory) ?


06-19-2004, 02:18 AM
Same "wglBindTexImageARB: Win32 error : unknown error" for me. No apparent memory leak: when adding pbuffers, RAM usage increases; when removing pbuffers, RAM usage goes back down to its initial value.

Geforce3ti200, detonator 45.23

06-19-2004, 11:38 AM

I had a similar problem: the driver fails to allocate the largest possible buffer (WGL_PBUFFER_LARGEST_ARB) when there are not enough resources. As far as I know this works on a Radeon 7500 and fails on the 4 GeForces tested. From threads here, at gamedev.net, and on nvidia's devfeedback, the only response was at gamedev (the app was tested on an R7500 and GeForces), so I'm interested in whether this is a driver bug too...
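For reference, the pattern being discussed looks roughly like this (a sketch; the whole point of the thread is that some drivers do not honor it). You pass WGL_PBUFFER_LARGEST_ARB in the attribute list so the driver falls back to the biggest pbuffer it can allocate instead of failing outright, then query the size you actually received:

```c
/* Sketch, assuming the WGL_ARB_pbuffer entry points are already loaded
 * via wglGetProcAddress and hdc/pixel_format are set up as usual. */
int attribs[] = {
    WGL_PBUFFER_LARGEST_ARB, 1,   /* "give me the biggest you can" */
    0                             /* terminator */
};
HPBUFFERARB pbuf = wglCreatePbufferARB(hdc, pixel_format, 2048, 2048, attribs);
if (pbuf) {
    /* We may have received less than 2048x2048; always ask what we got. */
    int w = 0, h = 0;
    wglQueryPbufferARB(pbuf, WGL_PBUFFER_WIDTH_ARB, &w);
    wglQueryPbufferARB(pbuf, WGL_PBUFFER_HEIGHT_ARB, &h);
}
```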

07-24-2004, 07:53 AM
M/\dm/\n, this is not what I was talking about, but I had the same problem. To give more details :

To find the biggest size I can use for a pbuffer, I double-check with something like :

// The maximum texture size in the current context (I use pbuffers for rendering to texture)
int max_size;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_size);

// Check the maximum pbuffer size for the given pixel format.
// Do this after wglChoosePixelFormatARB().
int pf_attr[] = { WGL_MAX_PBUFFER_WIDTH_ARB, WGL_MAX_PBUFFER_HEIGHT_ARB };
int size[2];
wglGetPixelFormatAttribivARB(_old_hdc, pixel_format, 0, 2, pf_attr, size);
if (width > size[0])
    width = size[0];
if (height > size[1])
    height = size[1];

On ATI (R9600) this works fine, but on nvidia (FX5700) it reports 4096 for the maximum width and height while actually accepting only 2048. I guess this is a bug, but it is not very important for me.