dumb problem

Hello
I’m trying to draw an image using textures, but somehow the image doesn’t get drawn.
I’ve got this init code:

glClearColor(0.0, 0.0, 0.0, 0.0);
glShadeModel(GL_FLAT);
QImage plaatje;
plaatje.load("plaatje.png");
cout << "Width: " << plaatje.width() << endl;
cout << "Height: " << plaatje.height() << endl;
cout << "Pixels: " << plaatje.bits() << endl;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, plaatje.width(), plaatje.height(), 0, GL_RGB, GL_UNSIGNED_BYTE, plaatje.bits());
glEnable(GL_TEXTURE_2D);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);

And this drawing code:

glBindTexture(GL_TEXTURE_2D, tex);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glBegin(GL_QUADS);
/*glTexCoord2f(0.0, 0.0);*/ glVertex2f(-2.5, -2.5);
/*glTexCoord2f(0.0, 1.0);*/ glVertex2f(-2.5, 2.5);
/*glTexCoord2f(1.0, 1.0);*/ glVertex2f(2.5, 2.5);
/*glTexCoord2f(1.0, 0.0);*/ glVertex2f(2.5, -2.5);
glEnd();

Qt seems to load the image perfectly (it shows the right width and height, and probably the pixels too).
The width and height of the image are both divisible by 2.
It’s probably some stupid line that I forgot or something similar, but I can’t find it.
So can somebody please help me and tell me what the problem is?
Thanx, Hylke

Put a glGetError() after each GL command and watch the returned value; it will help you.

And about image size: it must be a power of 2, like 64, 128, 256, 512, 1024, etc., but nothing in between. 196 is not valid.
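
A rough sketch of that first tip (the checkGLError helper name here is just something made up for the example, not a standard GL function):

void checkGLError(const char *where)
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        cout << "GL error 0x" << hex << err << dec << " after " << where << endl;
}

// then sprinkle it after the suspicious calls, for example:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, plaatje.width(), plaatje.height(), 0, GL_RGB, GL_UNSIGNED_BYTE, plaatje.bits());
checkGLError("glTexImage2D");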

Hylke > did your compiler report any errors?

Originally posted by ZbuffeR:
[b]Put a glGetError() after each GL command and watch the returned value; it will help you.

And about image size: it must be a power of 2, like 64, 128, 256, 512, 1024, etc., but nothing in between. 196 is not valid.[/b]
I guess 160 is not valid then; I’ll try to change it to 256 first.
EDIT:
After changing GL_RGB to GL_RGBA in glTexImage2D I now see the image, but it’s blue instead of red.
What could cause that problem?
And gollum, I didn’t get any errors (not even warnings :-)).
Thanx Hylke
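
EDIT 2: something like this is what I have in mind for the resize, just as a sketch (assuming QImage::scaled keeps the pixel data in a layout glTexImage2D can accept):

QImage scaled = plaatje.scaled(256, 256);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, scaled.width(), scaled.height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, scaled.bits());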

Needs alpha and is blue instead of red? It all depends on what the .bits() part of the Qt image returns.
The pixel data format is defined by the ‘format’ argument (which is different from the internal format).

Replace GL_RGBA as the format with GL_BGRA; the red and blue channels will be swapped.

If the texture still shows up even though the dimensions are not a power of two, you are lucky to have a card supporting GL_ARB_texture_non_power_of_two.
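
In code the call would become something like this (just a sketch; GL_BGRA needs OpenGL 1.2 or the EXT_bgra extension, and it assumes the QImage was loaded as 32-bit ARGB, which Qt stores as B,G,R,A bytes on little-endian machines):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, plaatje.width(), plaatje.height(), 0, GL_BGRA, GL_UNSIGNED_BYTE, plaatje.bits());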

Originally posted by ZbuffeR:
[b]Needs alpha and is blue instead of red? It all depends on what the .bits() part of the Qt image returns.
The pixel data format is defined by the ‘format’ argument (which is different from the internal format).

Replace GL_RGBA as the format with GL_BGRA; the red and blue channels will be swapped.[/b]
Thanx, GL_BGRA fixed it.