ARGB buffers

Hello,

I have a color GWorld which I’m drawing into an OpenGL port via glDrawPixels(). The problem is that the pixels are stored in ARGB format, and GL wants them in RGBA. Is there any way to get the pixels to display correctly, besides shifting the channels manually? Changing the app to use RGBA natively is not an option.

Also, if anyone has done performance tests comparing glDrawPixels to using textures on the Mac ATI cards, I’d appreciate any info.

Thanks!

The only way to do this is to shift the color channels manually.
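For what it's worth, a minimal sketch of that kind of swizzle in C could look like the following. The names here (SwizzleARGBToRGBA, srcRowBytes, rgbaBuffer, etc.) are just placeholders, not from any particular sample, and it assumes a 32-bit ARGB source and a tightly packed RGBA scratch buffer that you then hand to glDrawPixels:

/* A minimal sketch of the manual swizzle, assuming a 32-bit ARGB source.
   srcRowBytes accounts for the row padding a GWorld's PixMap usually has
   (its rowBytes); dst is assumed to be tightly packed, width * height * 4. */
static void SwizzleARGBToRGBA(const unsigned char *src, long srcRowBytes,
                              unsigned char *dst, long width, long height)
{
    long x, y;
    for (y = 0; y < height; y++) {
        const unsigned char *s = src + y * srcRowBytes;
        for (x = 0; x < width; x++) {
            dst[0] = s[1];  /* R */
            dst[1] = s[2];  /* G */
            dst[2] = s[3];  /* B */
            dst[3] = s[0];  /* A */
            s   += 4;
            dst += 4;
        }
    }
}

/* ...then something like:
   glDrawPixels(width, height, GL_RGBA, GL_UNSIGNED_BYTE, rgbaBuffer); */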

I have written an application that draws different things with QuickDraw (including antialiased fonts) into about 25 separate GWorlds, then converts the GWorlds to textures and draws them on quads and triangle strips.
The GWorlds are redrawn and updated every frame, and I’m getting about 75 FPS on a 640x480x32 screen.
The GWorlds are all different sizes; the largest is 512x256 and the smallest 128x64.
I’m using a 500 MHz G4 with two video cards (an ATI Rage 128 and an ATI Nexus PCI).
My app draws into two different OpenGL windows at the same time, each window driven by one card, so I have to bind or replace each texture twice, switching the GL context in between.

In my tests I found that binding textures, or updating them with glTexSubImage from the changed GWorlds and then drawing them on screen with quads, is much faster than using glDrawPixels.

Also, with textures you have a lot more flexibility, because you can draw your GWorld at any size with bilinear filtering.
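To make the per-frame update concrete, here is a rough sketch in immediate-mode OpenGL of what that could look like. This is not the poster’s actual code: texID, texWidth, texHeight, rgbaPixels and the quad coordinates are placeholder names, and the texture is assumed to have been created once with glTexImage2D before the frame loop.

#include <gl.h>   /* or <OpenGL/gl.h>, depending on your headers */

/* Hypothetical per-frame update: upload the swizzled GWorld pixels into an
   existing texture and draw it on a quad with bilinear filtering. */
static void UpdateAndDrawTexture(GLuint texID, GLsizei texWidth, GLsizei texHeight,
                                 const GLubyte *rgbaPixels,
                                 GLfloat x, GLfloat y, GLfloat w, GLfloat h)
{
    glBindTexture(GL_TEXTURE_2D, texID);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texWidth, texHeight,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); /* bilinear */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Draw the GWorld contents at whatever size and position you like. */
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(x,     y);
        glTexCoord2f(1.0f, 0.0f); glVertex2f(x + w, y);
        glTexCoord2f(1.0f, 1.0f); glVertex2f(x + w, y + h);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(x,     y + h);
    glEnd();
}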

Hope this helps you.

It’s probably best to convert the pixel components manually. Dig for the “swizzle” method, which I believe is in this code sample: http://www.twics.com/~td/base3.03.sit.hqx
