
View Full Version : Textures do not show up properly on my computer



shishka55
02-17-2005, 08:45 AM
I am having a problem with textures in an OpenGL class I am taking. When the textures show up, they are distorted and unrecognizable. I know the textures should be working, since they do on other computers. I think it might be related to my graphics card. I am using Win2k, GLUT, and an ATI Rage Mobility 128 graphics card (older laptop).

plasmonster
02-19-2005, 04:32 AM
Hi Shishka55,

I would suggest posting the code, so we can see what's going on and discount the possibility of programmer error ;)

Silkut
02-19-2005, 06:25 AM
Graham is right, post the code here, because maybe this is simply an error in your vertex texture coordinates.
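For reference, a correctly textured quad in immediate mode looks something like this (just a sketch with made-up vertex positions; it assumes the texture is already bound and GL_TEXTURE_2D is enabled):

/* one textured quad; texture coordinates must line up with the corners */
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();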

shishka55
02-19-2005, 12:04 PM
Ok, but this isn't my code... it was written by a teaching assistant for my class, so please do not steal it!!


[code deleted since it's not mine]

The thing is, it seems to work on other computers, just not mine. That's why I wonder if it's a graphics card compatibility issue.

02-19-2005, 07:39 PM
For the texture environment, try this:

glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );

plasmonster
02-19-2005, 08:47 PM
Well, I don't see anything wrong with the code you posted. Though, without seeing all of the code, it's difficult to make a determination.

You might try setting glPixelStorei( GL_UNPACK_ALIGNMENT, 1 ). The wrong alignment can lead to trash in your textures.
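For instance (just a sketch; pixels, width, and height stand in for your own image data):

/* the default unpack alignment is 4, and 24-bit rows are often
   not a multiple of 4 bytes wide, so relax it before the upload */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
             GL_BGR_EXT, GL_UNSIGNED_BYTE, pixels);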

Otherwise, take a simple texture demo that you know will work, and go from there. Try to get the texture demo in the RedBook to work, for example.

http://fly.cc.fer.hr/~unreal/theredbook/chapter09.html

If you discover that basic texturing is working, you then know that the problem is elsewhere.

I hope this helps.

Honk
02-20-2005, 02:56 AM
Originally posted by shishka55:


//copy image data to texture
if(bpp == 32)
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, &filedata[18]);
else
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_BGR_EXT, GL_UNSIGNED_BYTE, &filedata[18]);

1.) You should check if GL_BGR_EXT and/or GL_BGRA_EXT is supported by the Rage 128. I think GL_BGRA_EXT is supported, but I am unsure about GL_BGR_EXT.

2.) In case it's a 24-bit image, the internal format should be set to GL_RGB8 and not GL_RGBA8.

plasmonster
02-20-2005, 09:34 AM
Hi Honk,

It is a good idea to check for extension support. Here's a great site for this very thing:
http://www.delphi3d.net/hardware/listreports.php

And, as always, be sure to check your extension string.
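If it helps, here's a rough token-wise check (hasExtension is just a name I made up; it assumes a current GL context):

#include <string.h>
#include <GL/gl.h>

/* returns 1 if "name" (e.g. "GL_EXT_bgra") appears as a whole token
   in the extension string; plain strstr alone can match prefixes */
int hasExtension(const char *name)
{
    const char *start = (const char *) glGetString(GL_EXTENSIONS);
    const char *ext = start;
    size_t len = strlen(name);
    const char *pos;

    if (!start)
        return 0;

    while ((pos = strstr(ext, name)) != NULL)
    {
        if ((pos == start || pos[-1] == ' ') &&
            (pos[len] == ' ' || pos[len] == '\0'))
            return 1;
        ext = pos + len;
    }
    return 0;
}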

In the case of EXT_bgra, both the BGR_EXT and BGRA_EXT enumerants are included.

There's no requirement that an RGB source format must be matched by an RGB internal format. Granted, the symmetry is pleasing :)

[edit: format]

shishka55
02-20-2005, 01:00 PM
Thanks for all the help so far... I tried a few of the solutions, but nothing has worked yet. However, I did figure out a little more about the problem. It's occurring when I use textures that are larger than the surface and have to be shrunk. If I stretch a smaller texture it works, and if I only use a fraction of the larger texture so it does not have to be shrunk down, it works fine. I do still get some weird blurry texturing on the sides that are almost out of view on a rotating object.

Overmind
02-20-2005, 02:10 PM
I just noticed that the code you posted uses the SGIS_generate_mipmap extension. This extension is supported only on Radeon or GeForce cards, but not on a Rage.

You can either replace the glTexImage call with a gluBuild2DMipmaps call, or turn mipmapping off completely (set the MIN_FILTER to GL_LINEAR instead of GL_LINEAR_MIPMAP_LINEAR).
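Roughly, both options look like this (a sketch reusing width, height, and &filedata[18] from the code quoted above; gluBuild2DMipmaps needs <GL/glu.h>):

/* Option 1: build the whole mipmap chain with GLU,
   replacing the glTexImage2D call from the quoted code */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, width, height,
                  GL_BGR_EXT, GL_UNSIGNED_BYTE, &filedata[18]);

/* Option 2: keep glTexImage2D as-is, but don't ask for
   mipmapped minification at all */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);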

shishka55
02-20-2005, 04:08 PM
Thank you so much, Overmind... that fixed it! I am so glad you guys were able to help me. I am still new to this, and this texture issue has been bothering me for a while. Thanks.