ATI drivers bug??

Hi!

I have developed a game on a GF2 MX & GF3.
When I tried it on a Radeon 7000 & 9000, the textures became ugly… forced to 16 bits?!

To be clear: it works on all NVIDIA cards and fails on all the ATI ones.

I use SDL to initialize the window. The desktop is 32-bit and textures are loaded with:
glTexImage2D(GL_TEXTURE_2D, 0, 4, clt.sdl_surface->w, clt.sdl_surface->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);

Window creation is:
SDL_Init(SDL_INIT_VIDEO);
SDL_GL_SetAttribute( SDL_GL_RED_SIZE, color32bits?8:5 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, color32bits?8:6);
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, color32bits?8:5 );
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, color32bits?8:0 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, color32bits?24:16 );
SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, color32bits?8:0 );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
SDL_GL_SetAttribute( SDL_GL_BUFFER_SIZE, color32bits?32:16);
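
The window itself is then created with something like this (sketch; width and height stand in for whatever I actually pass):

/* SDL 1.2: the SDL_OPENGL flag is what makes the GL attributes above take effect */
SDL_Surface *screen = SDL_SetVideoMode(width, height, color32bits ? 32 : 16, SDL_OPENGL);
if (screen == NULL) {
    /* creation failed; SDL_GetError() has the details */
}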

Can you see the problem or suggest a solution?
Cheers,
Bernard.

It's not a bug, it's a feature :slight_smile: For some strange reason ATI's drivers force OpenGL to do that by default… Go into the ATI OpenGL control panel and move the “Texture Quality” slider all the way to the right; that should fix your problem… I guess.

If you explicitly ask for a 32-bit texture by setting the internal format to GL_RGB8 (or GL_RGBA8 for textures with alpha), you’ll always get 32-bit textures, even if the texture quality slider isn’t set to maximum.
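
In your case the upload would look roughly like this (a sketch, assuming clt.sdl_surface really holds 32-bit RGBA pixel data):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,   /* sized internal format: 8 bits per channel */
             clt.sdl_surface->w, clt.sdl_surface->h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, data);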

>internal format to GL_RGB8
I thought we were allowed to just set the internal format to 1, 2, 3 or 4… am I wrong?

Originally posted by nanar77:
>internal format to GL_RGB8
I thought we were allowed to just set the internal format to 1, 2, 3 or 4… am I wrong?

3 maps to GL_RGB and 4 to GL_RGBA. Note that these tokens don’t request any particular bit depth; the driver is free to give you literally anything.

If you want a specific texture depth (and it appears you do), do tell the driver.
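
You can check what the driver actually gave you with glGetTexLevelParameteriv, roughly like this (a small sketch for a texture that is currently bound):

GLint internalFormat, redBits;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &redBits);
/* redBits of 4 or 5 means the driver downsampled; 8 means you really got a 32-bit texture */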

Ok!!!

Thanks to all