View Full Version : No draw to pbuffer (?)



Andru
06-13-2003, 01:18 AM
Hello,

I'm trying to use a pixel buffer using WGL_ARB_pbuffer, running on a Radeon 9800 Pro.
It seems to have initialized fine, as I can clear the buffer with any color using glClear() and check the contents with glReadPixels() ... seems to work fine.

But: I cannot draw to it. No matter what I try, OpenGL doesn't draw to it.

The device context and rendering context are set up fine, i.e. switched to the pbuffer. (I take it that if they were not, glClear() would apply to the front window?)

I've used the example code from http://www.ati.com/developer/ATIpbuffer.pdf as a basis for my code.

The pbuffer is created, initialized and enabled. It *is* there but no draw!

If I pbufferDisable(), drawing to the front window works fine. With pbufferEnable()'d, nothing happens (except for calls like glClear(), which do work).


here's my pbuffer type (I've tried different formats too):

---------------------

int iRequiredPixelFormatAttribs[] = {
WGL_SUPPORT_OPENGL_ARB, TRUE, // will be used with OpenGL
WGL_DRAW_TO_PBUFFER_ARB, TRUE, // enable rendering to buffer
WGL_BIND_TO_TEXTURE_RGBA_ARB, TRUE, // will be used as a texture
WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,

WGL_RED_BITS_ARB, 8, // number of bits for RED channel
WGL_GREEN_BITS_ARB, 8, // number of bits for GREEN channel
WGL_BLUE_BITS_ARB, 8, // number of bits for BLUE channel
WGL_ALPHA_BITS_ARB, 8, // number of bits for ALPHA channel

WGL_DEPTH_BITS_ARB, 24, // number of bits for depth buffer
WGL_DOUBLE_BUFFER_ARB, FALSE, // double buffering needed or not
0 // zero terminates the list
};

// choose a suitable pixel format and check that at least one was found (count > 0)
unsigned int count = 0;
int iPixelFormat = 0;
wglChoosePixelFormatARB(hGLDC, iRequiredPixelFormatAttribs, NULL, 1, &iPixelFormat, &count);

---------------------
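For reference, the usual flow after choosing a format looks roughly like this. This is only a sketch, not Andru's actual code: `width`, `height` and `hGLDC` are assumed to come from the surrounding setup, and `hPbufferDC`/`hPbufferRC` are placeholder names. It assumes the WGL_ARB_pbuffer entry points have already been loaded via wglGetProcAddress().

```c
// Sketch: create the pbuffer, get its DC, and switch rendering to it,
// checking each step instead of assuming success.
if (count == 0)
    return FALSE; // wglChoosePixelFormatARB found no matching format

int pbufferAttribs[] = { 0 }; // no special pbuffer attributes, zero-terminated

HPBUFFERARB hPbuffer =
    wglCreatePbufferARB(hGLDC, iPixelFormat, width, height, pbufferAttribs);
if (!hPbuffer)
    return FALSE; // creation failed -- the chosen format is suspect

HDC   hPbufferDC = wglGetPbufferDCARB(hPbuffer);
HGLRC hPbufferRC = wglCreateContext(hPbufferDC);

// Note: the call that actually switches rendering to the pbuffer is the
// plain wglMakeCurrent() -- there is no wglMakeCurrentContext() in WGL.
if (!wglMakeCurrent(hPbufferDC, hPbufferRC))
    return FALSE; // GL calls after a failed switch won't reach the pbuffer
```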

What's wrong? glClear() works, rendering (drawing) does nothing!

Thanks for any help,

Andru

kansler
06-13-2003, 02:14 AM
Here's a simple pbuffer example with glReadPixels:

http://www.codesampler.com/source/ogl_offscreen_rendering_1.zip

Andru
06-13-2003, 02:26 AM
Thanks,

That one works fine; I wonder why.

The only major difference I see is wglMakeCurrent() versus wglMakeCurrentContext() ... not sure why, but the Context variant doesn't seem to work in my code.

Anyway, thanks for the help :)

Andru

roffe
06-13-2003, 10:29 AM
Originally posted by Andru:

wglChoosePixelFormatARB(hGLDC, (const int*)iRequiredPixelFormatAttribs, NULL, 1, &iPixelFormat, &count);


Make absolutely sure that you get the correct pixel format back. Personally I've had bad experiences with wglChoosePixelFormatARB, so I usually do my own "choosing" by iterating through all of the pixel formats. NVIDIA's NV Pixel Format program is great for checking pixel format capabilities; I'm sure it works on ATI hardware too. You do check the return value of wglCreatePbufferARB, right?
http://developer.nvidia.com/view.asp?IO=nvpixelformat
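Iterating through the formats yourself looks roughly like this. A sketch only, assuming the WGL_ARB_pixel_format entry points are already loaded and `hGLDC` is the window's DC; the acceptance criteria shown are just examples:

```c
// Sketch: query each pixel format's capabilities directly via
// WGL_ARB_pixel_format instead of trusting wglChoosePixelFormatARB.
int query[3]  = { WGL_DRAW_TO_PBUFFER_ARB,
                  WGL_ACCELERATION_ARB,
                  WGL_ALPHA_BITS_ARB };
int values[3] = { 0 };

// Ask how many pixel formats the driver exposes (format index 0 query).
int numFormats = 0;
int countQuery = WGL_NUMBER_PIXEL_FORMATS_ARB;
wglGetPixelFormatAttribivARB(hGLDC, 0, 0, 1, &countQuery, &numFormats);

for (int i = 1; i <= numFormats; ++i) { // format indices are 1-based
    if (!wglGetPixelFormatAttribivARB(hGLDC, i, 0, 3, query, values))
        continue;
    if (values[0] == TRUE &&
        values[1] == WGL_FULL_ACCELERATION_ARB &&
        values[2] >= 8) {
        // format i supports hardware-accelerated pbuffer rendering
        // with an 8-bit (or deeper) alpha channel -- a candidate
    }
}
```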


[This message has been edited by roffe (edited 06-13-2003).]

Andru
06-14-2003, 12:42 AM
Well, I only checked that the created pbuffer's dimensions matched what I asked for, and that seemed good enough. I should have checked the pixel format too. That said, glReadPixels() did return a buffer full of correct-looking data (I glClear()'ed the whole buffer with different values and read them back to see what happened). Strange that glClear() worked but drawing did not.

Anyway, I looked at the code kansler pointed me to and got mine working. It was something to do with setting the RC to the pbuffer... wglMakeCurrentContext() seemed not to work.
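For anyone hitting the same symptom: checking the return value of the context switch makes this kind of failure visible immediately. A sketch, with `hPbufferDC`/`hPbufferRC` as placeholder handles from the pbuffer setup:

```c
// wglMakeCurrent() returns FALSE on failure; a silent failure here means
// later GL drawing calls never reach the pbuffer.
if (!wglMakeCurrent(hPbufferDC, hPbufferRC)) {
    DWORD err = GetLastError(); // Windows error code for diagnosis
    // report err and abort instead of drawing into the void
}
```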

Many thanks to both of you :)

Andru