
Creating texture from bitmap (in jogl 1.1.1)



HelenaM
07-06-2010, 01:36 PM
I'm using JOGL 1.1.1 and I have created a texture with data from a monochrome bitmap of 2560x2560 pixels.
The bitmap is stored as a byte[] containing 1 pixel per bit, i.e. the byte[] has 819200 entries (2560*2560/8).

I have a problem when defining the texture size in the glTexImage2D method. As a work-around, I copied the byte[] into another byte[] with 2560*2560 entries and used this much larger byte[] as the pixel data in glTexImage2D. This hack displays the bitmap, but the resolution is low and part of the bitmap is never displayed.

Also, the external data type in glTexImage2D is set to GL_UNSIGNED_BYTE, but isn't that misleading, since a byte is always signed in Java?
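(For what it's worth: GL_UNSIGNED_BYTE only tells GL how to interpret the 8 bits, and a Java byte carries the same bit pattern; masking with 0xFF recovers the unsigned value. A quick illustration:)

```java
public class UnsignedByteDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xFF;    // Java prints this as -1 (signed)...
        int unsigned = b & 0xFF; // ...but the same bit pattern is 255 read as unsigned
        System.out.println(b + " -> " + unsigned); // -1 -> 255
    }
}
```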

How can I tell OpenGL that my byte[] contains one pixel per bit while also defining the bitmap's actual size? (I'm not interested in reducing resolution by scaling my bitmap.)

Here's my code:


ByteBuffer bitmap = ... // The original byte[] of size 2560*2560/8

gl.glTexEnvi(GL.GL_TEXTURE_ENV, GL.GL_TEXTURE_ENV_MODE, GL.GL_MODULATE);

int w = 2560;
int h = 2560;

if (w > aMaxTextureSize) {
    w = aMaxTextureSize;
}

if (h > aMaxTextureSize) {
    h = aMaxTextureSize;
}

// Copy the packed bitmap into the start of a byte[] of size 2560*2560
// and wrap it in a ByteBuffer
byte[] originalMap = new byte[w * h];
bitmap.get(originalMap, 0, bitmap.capacity());
ByteBuffer originalMapBuffer = ByteBuffer.wrap(originalMap);

gl.glBindTexture(GL.GL_TEXTURE_2D, textureName);

gl.glPixelStorei(GL.GL_UNPACK_ALIGNMENT, 1);

gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_NEAREST);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_NEAREST);

// Map the bitmap image to the texture:
try {
    gl.glTexImage2D(GL.GL_TEXTURE_2D,
                    0,
                    GL.GL_LUMINANCE8,
                    w,
                    h,
                    0,
                    GL.GL_LUMINANCE,
                    GL.GL_UNSIGNED_BYTE,
                    originalMapBuffer);
}
catch (Exception e) {
    aLog.error("Exception when trying to map the bitmap image to the texture. " + e);
}

// Enable texture mapping
gl.glEnable(GL.GL_TEXTURE_2D);
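(Note on the work-around above: the copy fills only the first eighth of the larger array with packed bytes, without unpacking individual bits. A per-bit expansion to one luminance byte per pixel would look roughly like this sketch, assuming MSB-first packing and no row padding:)

```java
public class BitmapExpand {
    // Expand a 1-bit-per-pixel bitmap (MSB first within each byte) into
    // one byte per pixel, suitable for GL_LUMINANCE / GL_UNSIGNED_BYTE.
    // Assumes width * height is a multiple of 8 and rows have no padding.
    static byte[] expandBitmap(byte[] packed, int width, int height) {
        byte[] out = new byte[width * height];
        for (int i = 0; i < out.length; i++) {
            int bit = (packed[i >> 3] >> (7 - (i & 7))) & 1;
            out[i] = (byte) (bit == 1 ? 0xFF : 0x00); // set bits become white
        }
        return out;
    }

    public static void main(String[] args) {
        // 0xA0 = 1010 0000 -> pixels FF, 00, FF, 00, 00, 00, 00, 00
        byte[] row = expandBitmap(new byte[]{(byte) 0xA0}, 8, 1);
        System.out.println(java.util.Arrays.toString(row));
    }
}
```

The expanded array can then be uploaded at the full 2560x2560 size as GL_LUMINANCE / GL_UNSIGNED_BYTE.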

ZbuffeR
07-06-2010, 02:10 PM
Try reading the docs:
http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml
It sounds like you want something like this:

gl.glTexImage2D(GL.GL_TEXTURE_2D,
                0,
                GL.GL_LUMINANCE8,
                w,
                h,
                0,
                GL.GL_COLOR_INDEX,
                GL.GL_BITMAP,
                bitmap);


If type is GL_BITMAP, the data is considered as a string of unsigned bytes (and format must be GL_COLOR_INDEX). Each data byte is treated as eight 1-bit elements, with bit ordering determined by GL_UNPACK_LSB_FIRST (see glPixelStore).



GL_COLOR_INDEX

Each element is a single value, a color index. The GL converts it to fixed point (with an unspecified number of zero bits to the right of the binary point), shifted left or right depending on the value and sign of GL_INDEX_SHIFT, and added to GL_INDEX_OFFSET (see glPixelTransfer). The resulting index is converted to a set of color components using the GL_PIXEL_MAP_I_TO_R, GL_PIXEL_MAP_I_TO_G, GL_PIXEL_MAP_I_TO_B, and GL_PIXEL_MAP_I_TO_A tables, and clamped to the range [0,1].


Read this too, to map index 0 and 1 to black and white (for example):
http://www.opengl.org/sdk/docs/man/xhtml/glPixelMap.xml
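(A plain-Java illustration of the bit ordering that GL_UNPACK_LSB_FIRST controls; with the default, GL_FALSE, the most significant bit of each byte is the leftmost pixel:)

```java
public class BitOrderDemo {
    // Value of pixel x (0..7) within one packed byte.
    // lsbFirst = false corresponds to GL's default (GL_UNPACK_LSB_FIRST = GL_FALSE).
    static int pixelAt(byte packed, int x, boolean lsbFirst) {
        int shift = lsbFirst ? x : 7 - x;
        return (packed >> shift) & 1;
    }

    public static void main(String[] args) {
        // 0x80 = 1000 0000: leftmost pixel set only under MSB-first ordering
        System.out.println(pixelAt((byte) 0x80, 0, false)); // 1
        System.out.println(pixelAt((byte) 0x80, 0, true));  // 0
    }
}
```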

HelenaM
07-09-2010, 01:34 AM
Thank you for the answer! I will read the suggested links!
/Helena

atteal
08-30-2011, 03:59 AM
Did you get any good results in the end?
I have a related issue. I attempted to create a texture from binary data using:

unsigned char bin_data[] = {0xff, 0xff, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff};
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, 8, 8, 0, GL_COLOR_INDEX, GL_BITMAP, bin_data);

so that it is an 8x8 texture, with each byte in bin_data representing a row of the texture (since GL_BITMAP as the type treats each byte as eight 1-bit pixels). However, when I map the texture I get a solid colored rectangle in the color last set by glColor().

Is the above correct? Are there any specific parameters that need to be set?

ZbuffeR
08-30-2011, 04:38 AM
Check glGetError() after each GL call, and post here what errors you got.

knackered
08-31-2011, 04:09 AM
http://www.khronos.org/opengles/sdk/1.1/docs/man/glPixelStorei.xml
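(The detail on that page that matters here: each row of a GL_BITMAP image occupies ceil(width/8) bytes, rounded up to GL_UNPACK_ALIGNMENT. A small sketch of that arithmetic:)

```java
public class RowBytesDemo {
    // Bytes per row of a 1-bit-per-pixel image under a given unpack alignment.
    static int bitmapRowBytes(int widthPixels, int alignment) {
        int raw = (widthPixels + 7) / 8;                        // ceil(width / 8)
        return ((raw + alignment - 1) / alignment) * alignment; // round up to alignment
    }

    public static void main(String[] args) {
        System.out.println(bitmapRowBytes(8, 1));    // 1
        System.out.println(bitmapRowBytes(8, 4));    // 4
        System.out.println(bitmapRowBytes(2560, 1)); // 320
    }
}
```

With an 8-pixel-wide bitmap and the default alignment of 4, GL expects each source row to occupy 4 bytes rather than 1, which is why glPixelStorei(GL_UNPACK_ALIGNMENT, 1) matters for tightly packed rows.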

atteal
08-31-2011, 06:15 AM
I used glGetError() but there were no errors. I am using the code below:

glEnable(GL_TEXTURE_2D); // Enable texture mapping

GLubyte ds_pattern2[] = {0x00, 0xFF, 0x00, 0xFF,
                         0x00, 0xFF, 0x00, 0xFF};
glShadeModel(GL_FLAT);

glPixelTransferi(GL_MAP_COLOR, GL_TRUE);
GLuint map[] = {0, 255};
glPixelMapuiv(GL_PIXEL_MAP_I_TO_I, 2, map);
glPixelMapuiv(GL_PIXEL_MAP_I_TO_R, 2, map);
glPixelMapuiv(GL_PIXEL_MAP_I_TO_G, 2, map);
glPixelMapuiv(GL_PIXEL_MAP_I_TO_B, 2, map);
glPixelMapuiv(GL_PIXEL_MAP_I_TO_A, 2, map);

// Unpack the pixel data from memory one byte at a time (no row padding)
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

// Generate a texture name
glGenTextures(1, &texbin2);

// Create a texture object for the name assigned by glGenTextures()
glBindTexture(GL_TEXTURE_2D, texbin2);

// Set parameters on the texture object
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

// Define the texture
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, 8, 8, 0, GL_COLOR_INDEX, GL_BITMAP, ds_pattern2);

glEnable(GL_TEXTURE_2D);
//glColor4ub(255, 0, 0, 255);
glBegin(GL_QUADS);
glTexCoord2i(0, 1); glVertex2i(0, 150);
glTexCoord2i(0, 0); glVertex2i(0, 250);
glTexCoord2i(1, 0); glVertex2i(100, 250);
glTexCoord2i(1, 1); glVertex2i(100, 150);
glEnd();

glFlush();
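(One thing worth checking in the map values above: glPixelMapuiv converts each unsigned-integer entry of a component map to [0,1] by dividing by 2^32 - 1, so an entry of 255 ends up effectively 0, i.e. black components, which could contribute to the solid-colored result. Illustrative arithmetic:)

```java
public class PixelMapScaleDemo {
    // glPixelMapuiv scales component-map entries by 1 / (2^32 - 1).
    static double mapped(long entry) {
        return entry / (double) 0xFFFFFFFFL;
    }

    public static void main(String[] args) {
        System.out.println(mapped(255L));        // ~5.9e-8, essentially 0
        System.out.println(mapped(0xFFFFFFFFL)); // 1.0
    }
}
```

So a component map of {0, 0xFFFFFFFF} (or glPixelMapfv with {0.0f, 1.0f}) would give black/white; the GL_PIXEL_MAP_I_TO_I map, by contrast, takes raw indices and is not scaled.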