Hi!
I have developed a game on a GeForce2 MX and a GeForce3.
When I tried it on a Radeon 7000 and a Radeon 9000, the textures became ugly… forced to 16 bits?!
To be clear: it works on all NVIDIA cards but not on ATI ones.
I use SDL to initialize the window. The desktop is 32-bit, and textures are loaded with:
glTexImage2D(GL_TEXTURE_2D, 0, 4, clt.sdl_surface->w, clt.sdl_surface->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
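One thing I am not sure about: I pass the legacy unsized internal format `4`, which (as I understand it) lets the driver pick any RGBA precision it likes. Would a sized internal format force 8 bits per channel on the ATI cards? A sketch of what I mean (untested on the Radeons so far; `clt.sdl_surface` and `data` are from my loading code above):

```c
/* Same upload, but with the sized internal format GL_RGBA8 instead of the
 * legacy unsized "4", explicitly requesting 8 bits per channel. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,
             clt.sdl_surface->w, clt.sdl_surface->h,
             0, GL_RGBA, GL_UNSIGNED_BYTE, data);
```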
Window creation is:
SDL_Init(SDL_INIT_VIDEO);
SDL_GL_SetAttribute(SDL_GL_RED_SIZE,     color32bits ? 8 : 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,   color32bits ? 8 : 6);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,    color32bits ? 8 : 5);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE,   color32bits ? 8 : 0);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,   color32bits ? 24 : 16);
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, color32bits ? 8 : 0);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_BUFFER_SIZE,  color32bits ? 32 : 16);
…
Can you see the problem, or suggest a solution?
Cheers,
Bernard.