glDrawPixels problem

Hi everyone,

I have a very strange problem. I am using glDrawPixels in my application to draw bitmaps to the screen. Nothing special. I have an Intel graphics card and everything works fine.

I switched to another machine that has an ATI Radeon 9000 card. Now all the bitmaps work fine, except one! All of them are 24-bit bitmaps and they display correctly, except this one. I found nothing strange about the bitmap, and I have absolutely no clue why it should display differently!

I am attaching links to the images (the original image and how it displays on screen). If anyone has ever come across anything like this, please help me! I would be eternally grateful!

The original bitmap is:

It gets displayed as:

Another strange thing I noticed is that it displays correctly when I turn the 3D acceleration down a bit on the ATI card! I have no idea what is going on!

If someone could help, I would be grateful. Is there an alternative to the glDrawPixels function that I could try?

Thanks and cheers,
xargy

I can send the original bitmap if someone wants. Just email me at deluded.soul@gmail.com

hm… there is no link. when i press the reload button, i see 2 image symbols, but they disappear quickly.

Sorry, Yahoo deleted my images. Here they are again!

Sorry again! Boy, it is hard to keep images somewhere on the web!

My webpage

The top post on my blog has the images!

the original image seems to be ok. maybe you should post some code, especially the piece around the glDrawPixels call. do you use GL_RGB or maybe GL_RGBA? since the image is a jpg, how do you load it, and how do you store it in memory?

Hi,

The code does not do anything special. The format I use is GL_BGR_EXT, as the bitmap is created as a DIB.

The funky thing is that it draws fine on my card, and it draws fine on the other card if I do not have 3D acceleration on.

Unfortunately, I do not have access to the other PC now.

Do you think that a call to glPixelStorei might help?

The original image is a bitmap. I had to post it online as a JPG because the website would not allow me to host a bitmap. I can send you the original image if you email me at deluded.soul@gmail.com

Cheers,
xargy

Unfortunately, I do not have access to the machines to test this.

However, I read that a Windows bitmap must start each row on a 4-byte boundary, so padding is added to the bitmap data. Now, this bitmap is 98 pixels wide (294 bytes per row at 24 bits per pixel), so it probably adds 2 bytes of padding at the end of each row.

I do not call glPixelStorei in my code. However, I read that most systems have GL_UNPACK_ALIGNMENT set to 4 by default. I have to check this on Monday though.

Do you think this explanation could make some sense?

Cheers,
xargy

Hi guys,

I tried a lot of things but to no avail. My bitmap is 24 bits per pixel, 98 pixels wide and 25 pixels high…

I tried:

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glDrawPixels(bmp.bmWidth, bmp.bmHeight,
             GL_BGR_EXT, GL_UNSIGNED_BYTE, bmp.bmBits);

But the bitmap still does not align properly.

I also tried converting it to a 32-bit-per-pixel bitmap in code and then displaying it:

for (int i = 0; i < bmp.bmHeight * bmp.bmWidth; i++)
{
	memcpy(start, source, 3);  // copy B, G, R

	start  += 3;
	source += 3;

	memset(start, 0, 1);       // 4th (alpha) byte
	start += 1;
}

and then drawing it like:

glDrawPixels(bmp.bmWidth, bmp.bmHeight,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, paddedData);

Still it does not render correctly. If I scale the image down with MS Paint from 98 pixels to 96 pixels wide, it displays OK.

Please help!

Cheers,
xargy

Screenshot at the top post on my blog:

http://superkampfer.blogspot.com/

Cheers,
xargy

This is a bit of a guess, but could the way the card handles images be causing the issue? Some OpenGL implementations ‘like’ image data padded to 4 or 16 byte boundaries, which seems to correlate with your comment about reducing from 98 to 96 pixels.

It’s a complete guess, but I’ve noticed similar issues in other code.

Ok, so here is the story so far:

I have the bitmaps displaying correctly if I do the following:

Go through the bitmap data and pad each pixel with another byte. When I hit the end of a line, I skip the last 2 bytes (the bitmap is 98 pixels wide, so each line is padded with 2 bytes). This is the padding that makes the entire line length divisible by 4.

The only trick left to figure out is how to tell OpenGL to do the same on all cards without me having to do it by hand!

xargy