
View Full Version : can display image, but cant use it as a texture



outRider
08-15-2001, 08:21 AM
The problem I have is that I'm trying to use a bitmap I've created through some GDI functions as a texture, but all I get is a white quad...

When I use glDrawPixels as such:
glDrawPixels(TextureWidth, TextureHeight, GL_RGB, GL_UNSIGNED_BYTE, pTexture);

it works fine, but using glTexImage2D to get a texture object like so:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TextureWidth, TextureHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, pTexture);

gives me a white quad...
I've enabled texture mapping, but the problem seems to be that I can't use the image as a texture...

It was created using CreateDIBSection() and GetDIBits(), and is 384x144x24. I'm using a GeForce, so it does support textures larger than 256.

Any ideas anyone?


[This message has been edited by outRider (edited 08-15-2001).]

richardve
08-15-2001, 10:01 AM
Texture dimensions should always be powers of two, so that's 64x64, 128x128, 256x256 and up..
(or 64x256, 128x32, etc.)

Your texture is 384x144, so that's not valid.
Resize the texture to 512x256 or 256x128 and try it again..

btw. AFAIK you cannot use textures larger than 256x256 on a Voodoo card...

outRider
08-15-2001, 10:04 AM
I thought that a texture's dimensions had to be multiples of 16, not powers of 2. Is this not the case?

jxruan
08-15-2001, 10:14 AM
um...
I think the size of the texture image should be:
width --- the width of the texture image; must be 2^n + 2(border) for some integer n.
height --- the height of the texture image; must be 2^m + 2(border) for some integer m.

Taking your code as an example, where you want to display an image of size 384x144, the texture buffer should be 512x256. That is, in order to texture-map the image you should allocate a buffer of size 512x256 and put the image in it. Obviously, your image will not cover the whole buffer, because it is much smaller. So you must use glTexCoord to map only the part of the buffer that contains your image.



Originally posted by outRider:
The problem I have is that I'm trying to use a bitmap I've created through some GDI functions as a texture, but all I get is a white quad...

When I use glDrawPixels as such:
glDrawPixels(TextureWidth, TextureHeight, GL_RGB, GL_UNSIGNED_BYTE, pTexture);

it works fine, but using glTexImage2D to get a texture object like so:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TextureWidth, TextureHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, pTexture);

gives me a white quad...
I've enabled texture mapping, but the problem seems to be that I can't use the image as a texture...

It was created using CreateDIBSection() and GetDIBits(), and is 384x144x24.

Any ideas anyone?
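A minimal sketch of the padding jxruan describes, assuming a tightly packed 24-bit RGB source (the 384x144 and 512x256 sizes come from the thread; the helper names and the row-by-row copy are my own illustration):

```cpp
#include <cstring>
#include <vector>

// Copy a w x h RGB image into the top-left corner of a padW x padH
// buffer, and report the texture coordinates that cover just the image.
struct PaddedImage {
    std::vector<unsigned char> pixels; // padW * padH * 3 bytes
    float uMax, vMax;                  // pass these to glTexCoord2f
};

PaddedImage PadToTextureSize(const unsigned char* src, int w, int h,
                             int padW, int padH)
{
    PaddedImage out;
    out.pixels.assign(padW * padH * 3, 0);     // padding stays black
    for (int y = 0; y < h; ++y)                // copy one scanline at a time
        std::memcpy(&out.pixels[y * padW * 3], &src[y * w * 3], w * 3);
    out.uMax = float(w) / float(padW);         // 384/512 = 0.75
    out.vMax = float(h) / float(padH);         // 144/256 = 0.5625
    return out;
}
```

You would then upload `pixels` with glTexImage2D at 512x256 and draw the quad with texcoords running from (0, 0) to (uMax, vMax) so only the image region is mapped.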

outRider
08-15-2001, 10:59 AM
Thanks for the info. The size of the bitmap depends on the point size of the font I'm drawing on it, so I don't know the dimensions beforehand. How would I round up to the nearest acceptable texture size?

DFrey
08-15-2001, 11:17 AM
Take the base-2 log (or a quick approximation by repeated right shifting), take the whole part of the result, and raise 2 by that power. Then check whether the value is the same as the original. If not, multiply it by 2. Do that for each dimension to find the closest powers of 2.

DFrey
08-15-2001, 11:20 AM
Here's some old code of mine:




DWORD CTexture::ClosestPowerOfTwo(DWORD x)
{
    if(!x)
        return 0;
    DWORD n=x;
    int c=0;
    int s=-1;
    while(n)
    {
        if(n&1)
            c++;
        n>>=1;
        s++;
    }
    if(c==1)
        return x;
    DWORD low=PowerOfTwo(s);
    DWORD high=low<<1;
    DWORD dl=x-low;
    DWORD dh=high-x;
    return dl<dh ? low : high;
}

DWORD CTexture::PowerOfTwo(DWORD x)
{
    DWORD s=1;
    while(x && s)
    {
        s<<=1;
        x--;
    }
    return s;
}


I can't remember why I didn't just use 1<<x instead of that PowerOfTwo function.

[This message has been edited by DFrey (edited 08-15-2001).]

DFrey
08-15-2001, 11:30 AM
Oh wait, I just noticed something about that old code of mine: it finds the closest power of two, whether that is less than, equal to, or greater than the input. You just want greater than or equal, so it should be as simple as:



DWORD CTexture::NextGreaterOrEqualPowerOfTwo(DWORD x)
{
    DWORD n=x;
    int c=0;
    int s=0;
    while(n)
    {
        if(n&1)
            c++;
        n>>=1;
        s++;
    }
    if(c==1)
        return x;
    return 1<<s;
}
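A quick sanity check of that rounding with the dimensions from this thread (the free-function wrapper and comments are mine; the bit-counting logic is DFrey's):

```cpp
typedef unsigned long DWORD;

// c counts set bits, s counts total bit positions; a power of two
// has exactly one set bit, so c==1 means x already qualifies.
DWORD NextGreaterOrEqualPowerOfTwo(DWORD x)
{
    DWORD n = x;
    int c = 0, s = 0;
    while (n)
    {
        if (n & 1)
            c++;
        n >>= 1;
        s++;
    }
    if (c == 1)
        return x;         // already a power of two
    return DWORD(1) << s; // next power of two above x
}
```

For the 384x144 bitmap in this thread, that yields 512 and 256.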





[This message has been edited by DFrey (edited 08-15-2001).]

outRider
08-15-2001, 12:36 PM
Thanks to everyone for their input, it seems to work fine right now.

Now all I have to do is figure out the proper texcoords for each of the characters, thanks though.
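One way the per-character texcoords could be worked out, assuming the glyphs sit in a fixed grid of equal-size cells on the bitmap (the thread doesn't say how the glyphs are arranged, so the grid layout and every name here are illustrative assumptions):

```cpp
// Texture coordinates for one character cell in a font texture.
struct GlyphQuad { float u0, v0, u1, v1; };

GlyphQuad GlyphTexCoords(int index, int cellW, int cellH,
                         int texW, int texH, int columns)
{
    int col = index % columns;  // cell position in the grid
    int row = index / columns;
    GlyphQuad q;
    q.u0 = float(col * cellW) / float(texW);
    q.v0 = float(row * cellH) / float(texH);
    q.u1 = float((col + 1) * cellW) / float(texW);
    q.v1 = float((row + 1) * cellH) / float(texH);
    return q;
}
```

Each character's quad would then be drawn with glTexCoord2f(q.u0, q.v0) through glTexCoord2f(q.u1, q.v1) at its corners.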

[This message has been edited by outRider (edited 08-15-2001).]