View Full Version : glDrawPixels, a fragment shader and OSX 10.5

James Burgess
02-21-2011, 03:11 PM
I'm using a fragment shader to modify the final pixel values. This works as expected on a variety of cards and OSes (Linux, Windows, and Mac) with one exception: a MacBook Pro with an NVIDIA 8600 GT running 10.5.8. The exact same hardware and code works under OS X 10.6.

The issue seems to be that my fragment shader is invoked with gl_Color set to (0, 0, 0, 0) for any fragments generated by glDrawPixels. Fragments produced by other drawing calls work just fine.
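For context, the shader involved is of the form below (a minimal illustrative sketch, not my actual code; the gamma adjustment is just an example modification). When a fragment shader is bound, fragments generated by glDrawPixels are supposed to deliver the pixel rectangle's color in gl_Color; on the affected driver that value arrives as all zeros.

```glsl
// Illustrative sketch of the kind of shader affected: it reads gl_Color,
// which the 10.5.8 driver delivers as (0, 0, 0, 0) for glDrawPixels
// fragments. Example modification: gamma-adjust the incoming color.
void main()
{
    gl_FragColor = vec4(pow(gl_Color.rgb, vec3(1.0 / 2.2)), gl_Color.a);
}
```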

Any hack for this platform (a fully updated Leopard) is welcome. I'm not currently installing any vertex shader, if that makes any difference.

Thanks in advance,
- James

02-28-2011, 07:14 AM
Hack: replace glDrawPixels with a textured quad.
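A minimal sketch of that workaround, assuming RGBA byte data and legacy (fixed-function-era) GL; the function name and parameters are hypothetical, adapt to your setup. The pixel buffer is uploaded to a texture and drawn as a screen-aligned quad, so fragments reach the shader through the normal rasterization path with a well-defined gl_Color:

```c
/* Hypothetical helper replacing a glDrawPixels(width, height, GL_RGBA,
 * GL_UNSIGNED_BYTE, rgba) call at raster position (x, y). Assumes an
 * orthographic projection in window coordinates is already set up. */
#include <OpenGL/gl.h>  /* <GL/gl.h> on Linux/Windows */

void draw_pixels_via_quad(GLint x, GLint y,
                          GLsizei width, GLsizei height,
                          const GLubyte *rgba)
{
    static GLuint tex = 0;
    if (tex == 0)
        glGenTextures(1, &tex);

    /* Upload the pixel buffer into a texture. */
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);

    /* Draw it as a screen-aligned quad; gl_Color is now well defined. */
    glEnable(GL_TEXTURE_2D);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2i(x,         y);
    glTexCoord2f(1.0f, 0.0f); glVertex2i(x + width, y);
    glTexCoord2f(1.0f, 1.0f); glVertex2i(x + width, y + height);
    glTexCoord2f(0.0f, 1.0f); glVertex2i(x,         y + height);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
```

Note that the fragment shader must then sample the texture itself (a sampler2D read via gl_TexCoord[0]) rather than relying on the glDrawPixels color path; non-power-of-two sizes are fine on an 8600-class card.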