Distorted image



Darthganesh
09-19-2010, 01:48 PM
Hi, I am trying to display a bitmap image in OpenGL using the GLUT library. This is the main code:
#include <iostream>
#include <string>
#include <GL/glut.h>
#include "BMPClass.h" // header providing BMPClass/BMPLoad (header name assumed)

void OnSize(int x,int y);
void OnDraw();

int main(int argc,char** argv)
{
glutInit(&argc,argv);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
glutCreateWindow("yrBMP");
glutReshapeFunc(OnSize);
glutDisplayFunc(OnDraw);

std::cout<<"Enter filename:";
std::string fname;
std::cin>>fname;
BMPClass bmp;
BMPLoad(fname,bmp);

// Upload the loaded BMP as a 2D texture (bound to the default texture object)
glEnable(GL_TEXTURE_2D);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,bmp.width,bmp.height,0,GL_RGB,GL_UNSIGNED_BYTE,bmp.bytes);

glutReshapeWindow(bmp.width,bmp.height);

glutMainLoop();
return 0;
}

// Reshape callback: keep the viewport in sync with the window and set up a
// unit orthographic projection with (0,0) at the top-left corner
void OnSize(int x,int y)
{
glViewport(0,0,x,y);
glDrawBuffer(GL_BACK);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0,1,1,0);
glMatrixMode(GL_MODELVIEW);
}

// Display callback: draw a full-viewport textured quad and swap buffers
void OnDraw()
{
glClear(GL_COLOR_BUFFER_BIT);

glBegin(GL_QUADS);
glTexCoord2d(0,1); glVertex2d(0,0);
glTexCoord2d(1,1); glVertex2d(1,0);
glTexCoord2d(1,0); glVertex2d(1,1);
glTexCoord2d(0,0); glVertex2d(0,1);
glEnd();

glutSwapBuffers();
}
The code works fine with 24-bit bitmap images.
But when I try it with 8-bit bitmap images (images using a colour palette), I sometimes see distorted images and sometimes correct ones. I have attached the original image and a distorted one captured with print screen: the original is 1.png and the distorted one is 2.png. I am sure that I read the image data correctly; I think the problem is most likely in the display functions used in main.cpp. Please help me. Sorry for my English.

strattonbrazil
09-19-2010, 03:22 PM
First, your English is fine.

Usually when I see that kind of distortion, it means you're reading the data in with the wrong spacing. As you said, when you use a 24-bit image you use GL_UNSIGNED_BYTE, which tells OpenGL you have RGB data where each channel is an unsigned byte. When you load an 8-bit image, that is no longer true, is it? You are loading RGB data where each channel is smaller than a byte. You might want to try something like GL_UNSIGNED_BYTE_3_3_2, I believe, if you have an 8-bit RGB value.
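
Something along these lines, as a rough sketch, assuming each byte really packs 3-3-2 RGB rather than a palette index (GL_UNSIGNED_BYTE_3_3_2 needs OpenGL 1.2 or later; bmp.width/bmp.height/bmp.bytes are taken from the original post):
// Sketch only: upload the 8-bit data as packed 3-3-2 RGB. This is only valid
// if each byte encodes 3 bits red, 3 bits green, 2 bits blue -- not if it is
// a palette index.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // one-byte-per-pixel rows may not be 4-byte aligned
glTexImage2D(GL_TEXTURE_2D, 0, GL_R3_G3_B2, bmp.width, bmp.height, 0,
             GL_RGB, GL_UNSIGNED_BYTE_3_3_2, bmp.bytes);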

absence
09-20-2010, 02:31 AM
GL_UNSIGNED_BYTE_3_3_2 won't help with paletted textures. For the hardware to draw the texture correctly, you also need to set the palette. I don't think OpenGL supports paletted textures directly, but there's the EXT_paletted_texture extension. Not sure if modern hardware supports this anymore, or if the driver just converts the data to 24-bit before uploading. If the latter is the case, there's no point in using paletted textures from a video memory consumption point of view.
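
If you did want to try the extension anyway, it would look roughly like this. This is a hypothetical sketch that assumes the driver actually advertises GL_EXT_paletted_texture and that your loader exposes the raw palette and index data; the 'palette' and 'indices' names are made up:
// Hypothetical sketch -- only works if GL_EXT_paletted_texture is advertised.
// 'palette' (256 RGB triples) and 'indices' (one byte per pixel) are assumed
// to come from the BMP loader. glColorTableEXT has to be fetched through your
// extension loader (e.g. wglGetProcAddress/glXGetProcAddress).
if (strstr((const char*)glGetString(GL_EXTENSIONS), "GL_EXT_paletted_texture"))
{
    // Attach the 256-entry palette to the texture target.
    glColorTableEXT(GL_TEXTURE_2D, GL_RGB8, 256,
                    GL_RGB, GL_UNSIGNED_BYTE, palette);

    // Upload the raw 8-bit indices instead of expanded RGB data.
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT,
                 bmp.width, bmp.height, 0,
                 GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
}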

absence
09-20-2010, 02:39 AM
According to the extension text, paletted textures haven't been supported on NVIDIA hardware for quite a few generations. If you really wanted to, you could simulate the effect in a fragment shader, but you'd also have to do the texture interpolation in the shader and lose some performance. It's probably best to just forget about paletted textures and go with 24-bit.
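
Going with 24-bit just means expanding the indices through the palette on the CPU before uploading. A rough sketch, assuming the loader exposes the palette and index data under these made-up names:
// Sketch: expand an 8-bit paletted image to 24-bit RGB on the CPU and upload
// that instead. 'indices', 'palette' (256 RGB triples), 'width' and 'height'
// are assumed to come from the BMP loader. Needs <vector>.
std::vector<unsigned char> rgb(width * height * 3);
for (int i = 0; i < width * height; ++i)
{
    const unsigned char* entry = &palette[indices[i] * 3];
    rgb[i * 3 + 0] = entry[0];
    rgb[i * 3 + 1] = entry[1];
    rgb[i * 3 + 2] = entry[2];
}
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // the expanded rows are tightly packed
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, &rgb[0]);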