Stupid pixel formats

OK, DescribePixelFormat gives me some formats that are 3D accelerated, and that use 16-bit color. However, as I try to create a rendering context that uses ANY of these, it fails (invalid parameter). Apparently my 3D card only supports 32-bit color. Why the hell did DescribePixelFormat give me these dumb formats, anyway? Or is it something else that’s being dumb?
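For reference, here's a minimal sketch of the kind of setup that fails (not my exact code; TryFormat and hWnd are just placeholder names, and the real thing has more error handling):

#include <windows.h>
#include <stdio.h>

/* Try one of the formats DescribePixelFormat reported. Note that Windows only
   lets you call SetPixelFormat once per window, so each attempt really needs a
   fresh window; this sketch shows a single attempt. */
static void TryFormat(HWND hWnd, int iFormat)
{
    HDC hDC = GetDC(hWnd);
    PIXELFORMATDESCRIPTOR pfd;

    /* Fill in the descriptor for the format the driver reported. */
    DescribePixelFormat(hDC, iFormat, sizeof(pfd), &pfd);

    if (!SetPixelFormat(hDC, iFormat, &pfd)) {
        printf("SetPixelFormat failed for format %d (error %lu)\n",
               iFormat, GetLastError());
    } else {
        HGLRC hRC = wglCreateContext(hDC);
        if (hRC == NULL)
            printf("wglCreateContext failed for format %d (error %lu)\n",
                   iFormat, GetLastError());
        else
            wglDeleteContext(hRC);
    }
    ReleaseDC(hWnd, hDC);
}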

Thanks for any clarifications.

-Evan

More possibly helpful information:

Looking at these “16-bit” pixel formats, I noticed this:

cRedBits = 8
cGreenBits = 8
cBlueBits = 8

cRedShift = 16
cGreenShift = 8
cBlueShift = 0

Doesn’t that look like 24-bit color to you? Oddly enough, cColorBits is 16. Perhaps I don’t quite understand what cColorBits and the other color variables are. Can someone explain that to me? The help files (and the book I have) have completely incomprehensible explanations for them.
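For what it's worth, those numbers came from reading the PIXELFORMATDESCRIPTOR fields in a loop roughly like this (a simplified sketch; the field names are the real struct members, but DumpFormats itself is just a placeholder name):

#include <windows.h>
#include <stdio.h>

static void DumpFormats(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    int i, count;

    /* With a NULL descriptor, DescribePixelFormat just returns how many
       pixel formats the device supports. */
    count = DescribePixelFormat(hDC, 1, sizeof(pfd), NULL);

    for (i = 1; i <= count; ++i) {
        DescribePixelFormat(hDC, i, sizeof(pfd), &pfd);
        if (!(pfd.dwFlags & PFD_SUPPORT_OPENGL))
            continue;
        printf("format %d: cColorBits=%d  R/G/B bits=%d/%d/%d  shifts=%d/%d/%d  %s\n",
               i, pfd.cColorBits,
               pfd.cRedBits, pfd.cGreenBits, pfd.cBlueBits,
               pfd.cRedShift, pfd.cGreenShift, pfd.cBlueShift,
               (pfd.dwFlags & PFD_GENERIC_FORMAT) ? "(generic)" : "(ICD, accelerated)");
    }
}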

Thanks again!

-Evan

I have never heard of a card that is only accelerated in 32-bit color. You could try posting some code here…

Originally posted by mango:
I have never heard of a card that is only accelerated in 32-bit color.

FireGL1/2/3
(well, actually these cards can only work in 32-bit)

Originally posted by Serge K:
FireGL1/2/3
(well, actually these cards can only work in 32-bit)

I have run into this kind of problem under Borland C++ Builder, and I realized that the system had not loaded opengl32.dll properly. I don’t know if that’s what’s happening here, but check that before trying anything else…
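One quick way to check that (just a diagnostic sketch; the function name is mine) is to see whether opengl32.dll is actually loaded into your process and where it came from:

#include <windows.h>
#include <stdio.h>

void CheckOpenGLDll(void)
{
    char path[MAX_PATH];
    HMODULE hMod = GetModuleHandleA("opengl32.dll");

    if (hMod == NULL)
        printf("opengl32.dll is not loaded in this process\n");
    else if (GetModuleFileNameA(hMod, path, sizeof(path)))
        printf("opengl32.dll loaded from: %s\n", path);
}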

Is your desktop in 32-bit color? Some graphics cards can’t switch between 16-bit and 32-bit color within an application.
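You can also check the desktop depth from code (just a sketch using GetDeviceCaps on the screen DC):

#include <windows.h>
#include <stdio.h>

void PrintDesktopDepth(void)
{
    HDC hScreen = GetDC(NULL);   /* device context for the whole screen */
    int bpp = GetDeviceCaps(hScreen, BITSPIXEL) * GetDeviceCaps(hScreen, PLANES);
    printf("desktop is running at %d bits per pixel\n", bpp);
    ReleaseDC(NULL, hScreen);
}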

Although I don’t know how much that has to do with the pixel format.

j