glDrawPixels, a fragment shader and OSX 10.5

Hi,
I’m using a fragment shader to modify the final pixel values. This works well, and as expected, on a variety of cards and OSes (Linux, Windows, and Macs), with one exception: a MacBook Pro with an NVIDIA 8600 GT running 10.5.8. The exact same hardware and code work fine under OSX 10.6.

The issue seems to be that my fragment shader is called with gl_Color set to (0, 0, 0, 0) for any fragments that come from glDrawPixels. Fragments from other drawing calls work just fine.
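For illustration, here is a minimal sketch of the kind of fragment shader involved (the actual shader isn’t shown in this post; the half-brightness tweak is just a placeholder for whatever modification is applied):

```glsl
// Hypothetical legacy-GLSL fragment shader: read the incoming primary
// color and modify it before output.  On the affected 10.5.8 machine,
// gl_Color arrives as (0, 0, 0, 0) for fragments generated by
// glDrawPixels, so those fragments come out black; fragments from other
// draw calls receive the expected color.
void main()
{
    vec4 c = gl_Color;                       // (0,0,0,0) for DrawPixels fragments here
    gl_FragColor = vec4(c.rgb * 0.5, c.a);   // placeholder "modification"
}
```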

Any hack for this platform (a fully updated Leopard) is accepted. I’m not currently installing any vertex shader, if that makes any difference.

Thanks in advance,

  • James

Hack: replace glDrawPixels with a textured quad.
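A minimal sketch of that workaround, assuming RGBA8 pixel data and an orthographic projection set up in window coordinates (the function name and the pixels/w/h parameters are illustrative, not from the original post):

```c
#include <OpenGL/gl.h>

/* Upload the pixel block into a texture and draw it as a screen-aligned
 * quad instead of calling glDrawPixels.  The fragment shader then gets a
 * well-defined gl_Color (the white set below) and can fetch the image
 * from the texture via gl_TexCoord[0]. */
static void drawPixelsAsTexturedQuad(const GLubyte *pixels, int w, int h)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    glEnable(GL_TEXTURE_2D);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);   /* gl_Color is now well defined */
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(0.0f,        0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f((GLfloat)w,  0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f((GLfloat)w,  (GLfloat)h);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(0.0f,        (GLfloat)h);
    glEnd();
    glDisable(GL_TEXTURE_2D);

    glDeleteTextures(1, &tex);
}
```

Note that with this path the fragment shader samples the image through a sampler2D (e.g. texture2D(tex, gl_TexCoord[0].st)) rather than relying on gl_Color, so the shader needs a corresponding sampler uniform; with no vertex shader installed, the fixed-function pipeline still delivers the quad’s texture coordinates to gl_TexCoord[0].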
