Displaying a tiled bitmap in OpenGL

I have broken up a very large bitmap into tiles to get around the OpenGL bitmap size limitation. When I display the tiled bitmap, thin black lines appear around many of the tile borders on some machines, even though I am sure the tiles tightly abut each other. Different lines appear and disappear as I zoom in or out of the scene. This only happens on a few of our latest machines (none of which share the same graphics card).

What’s up with these ugly lines? Can someone please help me?

Maybe you are using the wrong texture wrapping mode. Try this:

::glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);
::glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);
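For context, a minimal per-tile setup might look roughly like this (tile_id, tile_w, tile_h and tile_pixels are placeholder names, not from the original post); note that glTexParameteri takes the texture target, so the tile's texture must be bound first:

::glBindTexture(GL_TEXTURE_2D, tile_id);                        // bind this tile's texture object
::glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, tile_w, tile_h, 0,   // upload one tile's pixels
               GL_RGBA, GL_UNSIGNED_BYTE, tile_pixels);
::glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
::glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
::glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
::glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);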

Originally posted by Inquisitor:
[b]Maybe you are using the wrong texture wrapping mode. Try this:

::glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);
::glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);

[/b]
GL_CLAMP_TO_EDGE is the wrong mode for tiling if he is using bilinear filtering. He wants GL_CLAMP and textures with a border, where the border is the neighbouring row/column from the adjacent tile.
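For reference, a rough sketch of that setup in legacy OpenGL (texture borders were removed from the core profile in GL 3.x); tile_id, tile_w, tile_h and bordered_pixels are placeholder names, and bordered_pixels is assumed to already hold the tile plus its 1-texel border copied from the neighbouring tiles:

::glBindTexture(GL_TEXTURE_2D, tile_id);
::glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,
               tile_w + 2, tile_h + 2,    // dimensions include the 1-texel border on each side
               1,                         // border = 1
               GL_RGBA, GL_UNSIGNED_BYTE, bordered_pixels);
::glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
::glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);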

Do graphics cards still have problems like that? You would think by now things like that would have been taken care of. I think trivial things like non-power-of-2 textures and seamless textures should be basic stuff covered on all cards these days.

to evanGLizr:

He said he broke up a large texture into tiles. That sounds to me as if every tile is one texture, so there is no border, etc. In that case GL_CLAMP_TO_EDGE should be the better choice.

If you put more than one tile into a texture, you are right of course.

Thanks guys. The problem has been fixed with GL_CLAMP_TO_EDGE as suggested by Inquisitor.

Originally posted by Inquisitor:
[b]to evanGLizr:

He said he broke up a large texture into tiles. That sounds to me as if every tile is one texture, so there is no border, etc. In that case GL_CLAMP_TO_EDGE should be the better choice.

If you put more than one tile into a texture, you are right of course.[/b]
He said he broke a “very large bitmap” into tiles to “get around the OpenGL bitmap size limitation”. That sounds exactly like having one big image and tiling it in such a way that each tile is a texture.

If you split your image into tiles and every tile is one texture, then when bilinear filtering samples near the edge between neighbouring tiles, the wrong texels get filtered unless the textures have borders and you use GL_CLAMP. That's why the right border of a texture/tile has to contain the first column of the right neighbouring tile (and the left border of a texture/tile has to contain the last column of the left neighbouring tile).

That's exactly why textures with border and GL_CLAMP exist: so you can split a big image into tiles and store each tile in a texture.

He may think it's working, but it actually isn't: the bilinear filtering across neighbouring tiles is going wrong. This may or may not matter for him, depending on the image, but it's still wrong.
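To illustrate (this is a sketch, not code from the thread): building one tile's pixel block with a 1-texel border taken from the surrounding pixels of the big image, assuming a tightly packed RGBA bitmap; all names are placeholders.

#include <algorithm>
#include <cstdint>
#include <vector>

// Returns a (tile_w + 2) x (tile_h + 2) RGBA block whose outer ring of texels
// comes from the pixels surrounding the tile in the full image (clamped at the
// image edges), ready for glTexImage2D with border = 1.
std::vector<uint8_t> build_bordered_tile(const uint8_t* image, int img_w, int img_h,
                                         int tile_x, int tile_y, int tile_w, int tile_h)
{
    const int bw = tile_w + 2, bh = tile_h + 2;
    std::vector<uint8_t> tile(bw * bh * 4);
    for (int y = 0; y < bh; ++y) {
        for (int x = 0; x < bw; ++x) {
            // (x, y) == (1, 1) maps to the tile's top-left interior texel;
            // the border ring reads one texel outside the tile.
            const int sx = std::clamp(tile_x + x - 1, 0, img_w - 1);
            const int sy = std::clamp(tile_y + y - 1, 0, img_h - 1);
            std::copy_n(image + (sy * img_w + sx) * 4, 4, &tile[(y * bw + x) * 4]);
        }
    }
    return tile;
}

Each such block would then be uploaded with glTexImage2D(..., border = 1, ...) and GL_CLAMP wrapping as sketched above, so bilinear filtering at a tile edge blends with the neighbouring tile's texels instead of producing seams.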
