
32 bit colors with ATI



Thomas Krog
08-15-2003, 03:35 AM
I have rendered 32 bit colors with a GeForce Ti 4200, but when I use the same code on a Radeon 9600 Pro I only get 16 bit colors.

I have tested the following on the radeon card:
-glGetIntegerv(GL_RED_BITS,&foo);
returns foo==8
-pure GL_LIGHTING without texture results in 16 bit on the screen.
-pure texture without lighting results in 16 bit on the screen.

Any ideas how I can render 32 bit colors with the radeon card?
(I guess there is a rendering mode which has different default values on the two cards)
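
For reference, a minimal sketch of that query extended to the other channels (the helper name is made up; it assumes the GL context is current):

#include <GL/gl.h>
#include <stdio.h>

// Hypothetical helper: dump the bit depths of the current pixel format.
void dumpFramebufferBits(void)
{
    GLint r, g, b, a, depth;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    glGetIntegerv(GL_ALPHA_BITS, &a);
    glGetIntegerv(GL_DEPTH_BITS, &depth);
    printf("color R%d G%d B%d A%d, depth %d\n", r, g, b, a, depth);
}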

Humus
08-15-2003, 03:42 AM
How do you set the pixel format, and are you using fullscreen or windowed mode? Do you switch display modes if the desktop bpp is less than 32?

Thomas Krog
08-15-2003, 04:44 AM
I use fullscreen mode and I force the bpp to be 32 (my desktop is 32 bpp on both the Radeon and the GeForce).

Here is the source code I use to set the pixelformat (I call ChangeDisplaySettings before I set the pixelformat):

static PIXELFORMATDESCRIPTOR pfd={
	sizeof(PIXELFORMATDESCRIPTOR),  // size of this descriptor
	1,                              // version number
	PFD_DRAW_TO_WINDOW |            // format must support a window
	PFD_SUPPORT_OPENGL |            // format must support OpenGL
	PFD_DOUBLEBUFFER,               // format must support double buffering
	PFD_TYPE_RGBA,                  // request an RGBA format
	bits,                           // color depth (forced to 32 here)
	0, 0, 0, 0, 0, 0, 0, 0,         // color/alpha bits and shifts (driver's choice)
	0, 0, 0, 0, 0,                  // no accumulation buffer
	16,                             // 16-bit Z-buffer (depth buffer)
	0,                              // no stencil buffer
	0,                              // no auxiliary buffers
	PFD_MAIN_PLANE,                 // main drawing layer
	0,                              // reserved
	0, 0, 0                         // layer masks (ignored)
};

if (!(PixelFormat=ChoosePixelFormat(hDC,&pfd))){
	KillGLWindow(hWnd);
	MessageBox(NULL,"Can't Find A Suitable PixelFormat.","ERROR",MB_OK|MB_ICONEXCLAMATION);
	return FALSE;
}

if (!SetPixelFormat(hDC,PixelFormat,&pfd)){
	KillGLWindow(hWnd);
	MessageBox(NULL,"Can't Set The PixelFormat.","ERROR",MB_OK|MB_ICONEXCLAMATION);
	return FALSE;
}
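
A minimal sketch of a follow-up check, reusing the hDC and PixelFormat from above: ask Windows what the chosen format actually contains instead of trusting the request.

PIXELFORMATDESCRIPTOR actual;
DescribePixelFormat(hDC, PixelFormat, sizeof(PIXELFORMATDESCRIPTOR), &actual);
// actual.cColorBits and actual.cDepthBits hold what was really granted;
// if cColorBits comes back as 16, the pixel format is the culprit rather
// than the rendering state.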

zeckensack
08-15-2003, 05:56 AM
Originally posted by Thomas Krog:
I have tested the following on the radeon card:
-glGetIntegerv(GL_RED_BITS,&foo);
returns foo==8

That can't be a 16 bpp mode then.

Originally posted by Thomas Krog:
-pure GL_LIGHTING without texture results in 16 bit on the screen.
-pure texture without lighting results in 16 bit on the screen.

How? What makes you think it's 16 bpp?
Regarding textures, did you request explicitly sized formats (ie GL_RGB8 instead of GL_RGB)?
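
For illustration, a minimal sketch of the difference; width, height and pixels stand in for whatever the actual texture loader provides:

// GL_RGB leaves the internal precision to the driver, which may store
// the texture at 16 bpp depending on driver settings:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);

// GL_RGB8 explicitly asks for 8 bits per channel:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);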

mattc
08-15-2003, 06:07 AM
not 100% sure, but if you request a 16-bit z-buffer you also get 16-bit colour (depending on parameter weighting within ChoosePixelFormat(), it may override your RGBA value).

btw, why not always request a 32-bit z-buffer anyway? you'll always get the best supported z-buffer bit depth... the extra bandwidth requirements are more than worth it imo, plus current h/w is geared for 32-bit colour and a 32-bit z-buffer anyhow (actually, more like 24-bit z + 8-bit stencil).
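
A minimal sketch of what that looks like in the descriptor from the earlier post, with only the depth/stencil members changed (illustrative, not the original code):

static PIXELFORMATDESCRIPTOR pfd24 = {
    sizeof(PIXELFORMATDESCRIPTOR), 1,
    PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
    PFD_TYPE_RGBA,
    32,                      // 32-bit colour
    0, 0, 0, 0, 0, 0, 0, 0,  // colour/alpha bits and shifts (driver's choice)
    0, 0, 0, 0, 0,           // no accumulation buffer
    24,                      // 24-bit depth buffer
    8,                       // 8-bit stencil, packed with the depth buffer
    0,                       // no auxiliary buffers
    PFD_MAIN_PLANE,
    0, 0, 0, 0
};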

Thomas Krog
08-15-2003, 07:45 AM
Originally posted by zeckensack:

Originally posted by Thomas Krog:
I have tested the following on the radeon card:
-glGetIntegerv(GL_RED_BITS,&foo);
returns foo==8

That can't be a 16 bpp mode then.

Sorry, I was inexact - what I meant was that the rendering only looks like 16 bpp, even though the framebuffer itself is 32 bpp.



Originally posted by zeckensack:
How? What makes you think it's 16 bpp?

It was a guess based on the lack of colors on the screen.



Originally posted by zeckensack:
Regarding textures, did you request explicitly sized formats (ie GL_RGB8 instead of GL_RGB)?

Thanks a lot, that was the cause of the texture problem.

Do you have any idea what could be the problem with the pure lighting, using the basic materials shown below?

GLfloat specular[] = {0.2f,0.2f,0.2f,1.0f};
GLfloat diffuse[] = {0.15f,0.25f,0.4f,1.0f};
GLfloat ambient[] = {0.1f,0.1f,0.25f,1.0f};
glMaterialfv(GL_FRONT,GL_SPECULAR,specular);
glMaterialfv(GL_FRONT,GL_DIFFUSE,diffuse);
glMaterialfv(GL_FRONT,GL_AMBIENT,ambient);
glMaterialf(GL_FRONT,GL_SHININESS,500.0f);
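
As an aside, a cheap check to drop in while debugging fixed-function state like this (illustrative only):

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("GL error after material setup: 0x%x\n", err);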

Thomas Krog
08-15-2003, 07:56 AM
Originally posted by mattc:
not 100% sure, but if you request a 16-bit z-buffer you also get 16-bit colour (depending on parameter weighting within ChoosePixelFormat(), it may override your RGBA value).

A good guess, but it did not solve the problem.



Originally posted by mattc:
btw, why not always request a 32-bit z-buffer anyway? you'll always get the best supported z-buffer bit depth... the extra bandwidth requirements are more than worth it imo, plus current h/w is geared for 32-bit colour and a 32-bit z-buffer anyhow (actually, more like 24-bit z + 8-bit stencil).

Yes, it might be a good idea to switch to a 32-bit z-buffer - I just do not need the extra precision for my current project.

Thomas Krog
08-15-2003, 10:12 AM
I found the problem with the lighting. It was the shininess value, which was greater than 128, so I got different behavior on the two graphics cards.
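
For reference, a minimal sketch of the fix: the fixed-function pipeline only accepts GL_SHININESS values in the range [0, 128], and an out-of-range value generates GL_INVALID_VALUE, so what happens afterwards is up to the driver.

GLfloat shininess = 500.0f;
if (shininess > 128.0f)
    shininess = 128.0f;      // clamp into the range the spec allows
glMaterialf(GL_FRONT, GL_SHININESS, shininess);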