question about StretchDIBits and textures



Alessandro_dup1
04-26-2004, 02:48 PM
I'm rendering Perlin noise directly to the screen using the following call:

StretchDIBits(hDC, 0, 0, 256, 256, 0, 0, 256, 256, map, (BITMAPINFO*)&bmih, DIB_RGB_COLORS, SRCCOPY);
Instead, I'd like to render to a texture, so I replaced the previous code with the following:


glGenTextures(1, &textures[30].texID);
glBindTexture(GL_TEXTURE_2D, textures[30].texID);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, map);
The resulting texture is completely white. What mistake am I making here? :confused:

map is defined as follows:
static DWORD map[256*256];

ZbuffeR
04-27-2004, 12:53 AM
I see two problems here:

1) glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
This line tells OpenGL to use mipmaps, but then you have to define all the mipmap levels. Try glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); and see if it helps.

2) static DWORD map[256*256];
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, map);
These two lines do not match: the array must hold height*width*colorComponents*sizeOfOneComponent bytes.

That is: 256*256*3*1 bytes.
The following line would be better:
static GLubyte map[256*256*3];
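
Putting both fixes together, a minimal sketch (reusing the textures[30].texID and map names from your code; note that map must now hold 3 bytes per texel):

static GLubyte map[256*256*3];  // 3 bytes (R, G, B) per texel

glGenTextures(1, &textures[30].texID);
glBindTexture(GL_TEXTURE_2D, textures[30].texID);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// plain GL_LINEAR: no mipmap levels are uploaded, so don't ask for a mipmapping filter
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, map);

If you do want mipmapping, gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, 256, 256, GL_RGB, GL_UNSIGNED_BYTE, map) generates and uploads the whole mipmap chain from the base image.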

Alessandro_dup1
04-27-2004, 11:56 AM
Sorry ZbuffeR, I'm a little confused. This is what my Perlin code does:


void RenderNoise(HWND hWnd, CPerlin *pPerlin, float z)
{
    // Fill the bitmap with the noise
    for( int iy = 0; iy < perlin_size; iy++ )
    {
        // Compute the starting position from the y and z coordinate
        float y = iy/16.0f;
        float p[3] = { y*Mtx[1] + z*Mtx[2],
                       y*Mtx[4] + z*Mtx[5],
                       y*Mtx[7] + z*Mtx[8] };

        // This represents movements along the x axis
        float x = 1/16.0f;
        float d[3] = { x*Mtx[0], x*Mtx[3], x*Mtx[6] };

        for( int ix = 0; ix < perlin_size; ix++ )
        {
            BYTE n = BYTE(255*0.5f*(pPerlin->Noise3(p[0], p[1], p[2]) + 1));

            map[ix+iy*perlin_size] = (n<<16) | (n<<8) | n;

            p[0] += d[0];
            p[1] += d[1];
            p[2] += d[2];
        }
    }

    // Render the bitmap to the DC
    StretchDIBits(hDC, 0, 0, perlin_size, perlin_size, 0, 0, perlin_size, perlin_size, map, (BITMAPINFO*)&bmih, DIB_RGB_COLORS, SRCCOPY);
}
This procedure shows a gray-scale Perlin noise image on the screen, so it already puts the "color" information into the map array. If I change it to a [128*128*3] array, how do I provide the color information?

Also, since I'm a newbie, can someone explain the difference between DWORD and BYTE data?

ZbuffeR
04-27-2004, 01:11 PM
First, did you try setting GL_TEXTURE_MIN_FILTER to GL_LINEAR instead of GL_LINEAR_MIPMAP_LINEAR?

From what I recall:
1 byte / 1 C char = 8 bits
1 word = 2 bytes
1 double word (DWORD) / 1 int = 4 bytes
1 float = 4 bytes
1 double = 8 bytes
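
If you want to verify these on your own compiler (sizes are platform-dependent; the values above hold for 32-bit Windows), a quick sketch:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    printf("BYTE  : %u\n", (unsigned)sizeof(BYTE));   /* 1 */
    printf("WORD  : %u\n", (unsigned)sizeof(WORD));   /* 2 */
    printf("DWORD : %u\n", (unsigned)sizeof(DWORD));  /* 4 */
    printf("float : %u\n", (unsigned)sizeof(float));  /* 4 */
    printf("double: %u\n", (unsigned)sizeof(double)); /* 8 */
    return 0;
}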

So if you fill one DWORD per texel, it is like an RGBA texel (8 bits per color channel plus alpha). Or you can just fill an array of bytes: replace the line

map[ix+iy*perlin_size] = (n<<16) | (n<<8) | n;

with

map[ix+iy*perlin_size] = n;

and use the GL_LUMINANCE format.
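
As a minimal sketch, the luminance version would look like this (assuming perlin_size is 256 and the texture setup from the earlier post):

static GLubyte map[256*256];  // one byte per texel now

// inside the noise loop, store the gray value directly:
map[ix+iy*perlin_size] = n;

// upload as a single-channel luminance texture:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 256, 256, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, map);

Keeping the DWORD array and uploading it with format GL_RGBA would also work, but then set the top byte so alpha is opaque, e.g. map[ix+iy*perlin_size] = 0xFF000000 | (n<<16) | (n<<8) | n; (the channel order only matters for colored data; for gray values all channels are equal anyway).
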
See the doc for glTexImage2D:

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/opengl/glfunc03_16jo.asp