glTexImage2D question

Hello,

I am quite confused with the usage of glTexImage2D. I was using it successfully with some of my bitmaps, but then, in order to test something, I tried something as simple as creating a buffer full of RGB values, feeding it to glTexImage2D, and seeing if it renders correctly.


glTexImage2D(GL_TEXTURE_2D, 0, 3, 2, 2, 0, GL_RGB, GL_UNSIGNED_BYTE, d);

The above should load an RGB image from a data buffer called d. The width and height of the image should be 2. How should the data be laid out inside the buffer? GL_RGB states that each group of three consecutive bytes represents one RGB pixel.

So I tried this:


// fill all four pixels (2x2, 3 bytes each) with red
unsigned char* d = (unsigned char*) malloc(12);
for(int i = 0; i < 3*4; i += 3)
{
    d[i] = 255; d[i+1] = 0; d[i+2] = 0;
}

I expected to see a fully red texture, since we have 2x2 red pixels. Instead I saw 4 differently colored pixels. What am I misunderstanding? This was just a test case to figure out why some of my bitmaps loaded wrong. Any help would be appreciated.

EDIT: I figured it out. I had completely disregarded how OpenGL unpacks the pixels by default. By using glPixelStorei(GL_UNPACK_ALIGNMENT, 1); I corrected it.
So by doing that I told OpenGL that each new row starts at the very next byte after the previous one ends? So no padding exists and it can just read the data correctly?
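For completeness, the corrected test now looks roughly like this (same 2x2 red buffer d as above):


// tell OpenGL the rows in the client buffer are tightly packed, no padding
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, 3, 2, 2, 0, GL_RGB, GL_UNSIGNED_BYTE, d);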

Yes, you found the correct solution.
The default alignment is 4, which works transparently for 4-byte-per-pixel data such as GL_RGBA8. For example, a GL_RGBA8 internal format fed with GL_BGRA pixel data is often preferred when one needs faster data uploads.
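For instance, something along these lines (just a sketch; the d4 buffer name and the red fill are assumptions on my part) keeps every row a multiple of 4 bytes, so the default alignment works without any glPixelStorei call:


// 2x2 texture, 4 bytes per pixel in BGRA order; each row is 8 bytes, already 4-byte aligned
unsigned char d4[2 * 2 * 4];
for (int i = 0; i < 2 * 2 * 4; i += 4)
{
    d4[i]     = 0;   // blue
    d4[i + 1] = 0;   // green
    d4[i + 2] = 255; // red
    d4[i + 3] = 255; // alpha
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 2, 2, 0, GL_BGRA, GL_UNSIGNED_BYTE, d4);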

By the way, it is better to specify the internalFormat explicitly, like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 2, 2, 0, GL_RGB, GL_UNSIGNED_BYTE, d);

As detailed here:
http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml

Thank you very much for the reply, zbuffer. It seems there is a lot to optimizing textures; I really like that. I also found this useful link which explains all the other values of internal format. Hope other people find it useful.

I am sorry for making another reply in my topic, but I think it is better than starting a new one. If this does not get noticed I guess I will make a new topic after all.

So I tried to load some monochrome bitmaps into a texture.

I read the references on glPixelStore(), glTexImage2D, glPixelTransfer and glPixelMap. I thought I should use a buffer object bound to GL_PIXEL_UNPACK_BUFFER to do things the “right” way. But I can't seem to make it work correctly and I have no idea what the problem might be.

So basically here is the code.


//t is the data buffer: alternating 1 and 0 bits (0xAA = 10101010) to see if anything renders
unsigned short t[2] = {0xAA, 0xAA};
//this is the pixel map: index 0 maps to 0, index 1 maps to 255
GLushort monoPixelMap[2] = { 0, 255 };

GLuint bName[1];
glGenBuffers(1, bName);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER,bName[0]);
glBufferData(GL_PIXEL_UNPACK_BUFFER, 4, monoPixelMap, GL_STATIC_DRAW);

//needed when we use GL_BITMAP, to say on which end the LSB is
glPixelStorei(GL_PACK_LSB_FIRST,true);

//mapsize here needs to be a power of two or else we get GL_INVALID_VALUE
glPixelMapusv(GL_PIXEL_MAP_I_TO_R,2,0);
glPixelMapusv(GL_PIXEL_MAP_I_TO_G,2,0);
glPixelMapusv(GL_PIXEL_MAP_I_TO_B,2,0);
//I don't think this is needed acording to reference of glPixelTransfer
//glPixelTransferi(GL_MAP_COLOR,true);
//tells how many bits OR bytes? to add when the offset changes (if it's like GL_INDEX_SHIFT) it should be bits.
glPixelTransferi(GL_INDEX_OFFSET,16);
glBindTexture(GL_TEXTURE_2D, texture[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D, 0, 1, 4, 4, 0, GL_COLOR_INDEX, GL_BITMAP, t);

It does not work; it renders a completely white texture. I tried putting glGetError() after every GL call and checking it, and the only place where I get an error is at glTexImage2D, and that’s a GL_INVALID_OPERATION.

I guess it must have to do with the fact that I have a GL_PIXEL_UNPACK_BUFFER bound and I am not using it correctly somehow.
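For comparison, this is roughly how I understand a plain (non-indexed) texture upload through a pixel unpack buffer is supposed to look, with the pixel data stored in the buffer object and the last argument of glTexImage2D being a byte offset into it rather than a client pointer (just a sketch with made-up RGBA data, not my monochrome case):


// hypothetical 2x2 RGBA image that will live in the pixel unpack buffer
unsigned char pixels[2 * 2 * 4] = { 0 }; // ...filled with RGBA bytes...

GLuint pbo;
glGenBuffers(1, &pbo);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
glBufferData(GL_PIXEL_UNPACK_BUFFER, sizeof(pixels), pixels, GL_STATIC_DRAW);

// with a pixel unpack buffer bound, the last argument is a byte offset into the buffer, not a pointer
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, (void*)0);

// unbind so later pixel transfer calls read from client memory again
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);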

Could be one of the following:

GL_INVALID_OPERATION is generated if a non-zero buffer object name is bound to the GL_PIXEL_UNPACK_BUFFER target and the buffer object’s data store is currently mapped.

GL_INVALID_OPERATION is generated if a non-zero buffer object name is bound to the GL_PIXEL_UNPACK_BUFFER target and the data would be unpacked from the buffer object such that the memory reads required would exceed the data store size.

GL_INVALID_OPERATION is generated if a non-zero buffer object name is bound to the GL_PIXEL_UNPACK_BUFFER target and data is not evenly divisible into the number of bytes needed to store in memory a datum indicated by type.
That’s all. Any assistance would be greatly appreciated.

Do you still get the invalid operation when you try this teximage call instead?
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 4, 4, 0, GL_COLOR_INDEX, GL_BITMAP, t);

I have no experience with 1-bit formats in GL, and they are quite rare, so I am not sure if this is well supported.

What is your video card?

Yeah, it still does not work, zbuffer; thanks for the suggestion.
Is my use of the other functions correct in principle? I did not manage to find a tutorial; I just picked these functions after reading the reference pages.

Anyway, worst case, if the user uploads a monochrome bitmap I will just unpack the pixels manually; it is not hard. Of course it would be better if OpenGL could do it.
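Something along these lines is roughly what I have in mind for the manual route (just a sketch, assuming 1 bit per pixel, most significant bit first, and each row starting on a byte boundary):


// expand a 1-bit-per-pixel bitmap into one luminance byte per pixel
void expandMono(const unsigned char* src, unsigned char* dst, int width, int height)
{
    int bytesPerRow = (width + 7) / 8; // each source row starts on a byte boundary
    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            int bit = (src[y * bytesPerRow + x / 8] >> (7 - (x % 8))) & 1;
            dst[y * width + x] = bit ? 255 : 0;
        }
    }
}

// then upload the expanded buffer, e.g.:
// glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
// glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, dst);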

As for my video card, it is an NVIDIA GeForce GT 230M.

The thing is, I noticed GL_BITMAP is deprecated in OpenGL 4.1, so I guess I should find a totally different way. I just wanted to see if the deprecated way would work.