I’m using JOGL 1.1.1, and I have created a texture from a monochrome bitmap of 2560x2560 pixels.
The bitmap is stored as a byte[] with one pixel per bit, i.e. the byte[] has 819200 entries (2560*2560/8).
I have a problem defining the texture size in glTexImage2D. As a work-around, I copied the byte[] into another byte[] with 2560*2560 entries and used that much larger array as the pixel data for glTexImage2D. This hack displays the bitmap, but the resolution is low and part of the bitmap is never displayed.
Also, the external data type in glTexImage2D is set to GL_UNSIGNED_BYTE, but isn’t that misleading, since a byte is always signed in Java?
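If I understand correctly, OpenGL never sees the Java type at all: it only sees the raw 8 bits, and GL_UNSIGNED_BYTE just tells it to interpret those bits as 0–255. A minimal illustration of the same bit pattern read both ways (plain Java, no JOGL needed; class name is mine):

```java
public class SignedByteDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xFF;          // bit pattern 11111111
        System.out.println(b);         // Java reads it as signed: -1
        System.out.println(b & 0xFF);  // the same bits read unsigned: 255
    }
}
```

So passing a Java byte[] with GL_UNSIGNED_BYTE should be harmless, if confusingly named.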
How can I tell OpenGL that my byte[] contains one pixel per bit while still specifying the bitmap’s actual size? (I’m not interested in reducing resolution by scaling my bitmap.)
Here’s my code:
ByteBuffer bitmap = ... // the original data, 2560*2560/8 bytes (one pixel per bit)

gl.glTexEnvi(GL.GL_TEXTURE_ENV, GL.GL_TEXTURE_ENV_MODE, GL.GL_MODULATE);

int w = 2560;
int h = 2560;
if (w > aMaxTextureSize) {
    w = aMaxTextureSize;
}
if (h > aMaxTextureSize) {
    h = aMaxTextureSize;
}

// Work-around: copy the packed data into a w*h byte array.
// Note: only the first bitmap.capacity() (= w*h/8) bytes are filled;
// the remaining 7/8 of the array stay zero.
byte[] originalMap = new byte[w * h];
bitmap.get(originalMap, 0, bitmap.capacity());
ByteBuffer originalMapBuffer = ByteBuffer.wrap(originalMap);

gl.glBindTexture(GL.GL_TEXTURE_2D, textureName);
gl.glPixelStorei(GL.GL_UNPACK_ALIGNMENT, 1);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_NEAREST);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_NEAREST);

// Map the bitmap image to the texture:
try {
    gl.glTexImage2D(GL.GL_TEXTURE_2D,
                    0,                   // mipmap level
                    GL.GL_LUMINANCE8,    // internal format
                    w,
                    h,
                    0,                   // border
                    GL.GL_LUMINANCE,     // external format
                    GL.GL_UNSIGNED_BYTE, // external data type
                    originalMapBuffer);
}
catch (Exception e) {
    aLog.error("Exception when trying to map the bitmap image to the texture. " + e);
}

// Enable texture mapping
gl.glEnable(GL.GL_TEXTURE_2D);
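If there is no way to hand OpenGL the packed bits directly, one fallback I’m considering is expanding each bit into its own luminance byte before the upload, so the data actually matches GL_LUMINANCE/GL_UNSIGNED_BYTE. A rough sketch (class and method names are mine, and I’m assuming MSB-first bit order within each byte):

```java
public class BitmapExpander {
    /** Expand a 1-bit-per-pixel bitmap (MSB first) into one luminance byte per pixel. */
    public static byte[] expand(byte[] packed, int width, int height) {
        byte[] out = new byte[width * height];
        for (int i = 0; i < out.length; i++) {
            // Pixel i lives in byte i/8, at bit position 7 - (i % 8).
            int bit = (packed[i >> 3] >> (7 - (i & 7))) & 1;
            out[i] = (byte) (bit == 1 ? 0xFF : 0x00); // white or black
        }
        return out;
    }
}
```

The resulting w*h array could then be wrapped in a ByteBuffer and passed to glTexImage2D exactly as above. But that costs eight times the memory, so a direct packed-bit upload would still be preferable.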