Creating a white texture.

Hello,
I’m trying to create an empty texture (white, actually), and this is what I’m doing:

        data = (unsigned int*)new GLuint[(width * height) * 4 * sizeof(unsigned int)];

        for(int i = 0; i < (int)(width * height * sizeof(unsigned int) * 4); i++)
        {
            data[i] = 256;
        }

        // Generate white OpenGL texture.
        GLuint whiteTextureID;
        glGenTextures(1, &whiteTextureID);
        glBindTexture(GL_TEXTURE_2D, whiteTextureID);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

What I don’t understand is why the quad does not render when I texture it with this texture. Is there anything I’m overlooking? I’d like to get into creating my own textures by setting each pixel myself, so I figured making a white texture would be a good start. Any help is really appreciated.

Thanks,

-r

You’re writing integers with the value 0x00000100 into an array of width*height*16 elements, and telling OpenGL your data is GL_UNSIGNED_BYTE in an array 4 times smaller.
You’re sending an RGBA value of {0, 1/255, 0, 0}.
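To spell it out, here’s roughly what those bytes look like (a sketch assuming little-endian memory, which is what you’ll have on x86):

unsigned int value = 256;   // 0x00000100
// In little-endian memory this int is the four bytes 0x00 0x01 0x00 0x00.
// Read as GL_RGBA / GL_UNSIGNED_BYTE that gives R=0, G=1, B=0, A=0:
// an almost-black, fully transparent pixel, not white.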

You need to do this instead:

data = new unsigned char[width * height * 4];
for(int i = 0; i < width * height * 4; i++) data[i] = 255;

Here is the revised code, but I still can’t see my quad. This issue is really bothering me.


        unsigned char* data;

        // Allocate the needed space.
        int width;
        int height;
        width = height = 128;

        data = new unsigned char[width * height * sizeof(unsigned char)];

        for(int i = 0; i < (int)(width * height * sizeof(unsigned char)); i++)
        {
            data[i] = 255;
        }

        // Generate white OpenGL texture.
        GLuint whiteTextureID;
        glGenTextures(1, &whiteTextureID);
        glBindTexture(GL_TEXTURE_2D, whiteTextureID);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);

I don’t get it.

Try internalFormat = GL_RGBA8:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);

And your array size should not be width*height*1, it should be width*height*4.

I usually use (for now):
gluBuild2DMipmaps(GL_TEXTURE_2D,4,wid,hei,GL_RGBA,GL_UNSIGNED_BYTE,data);
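Putting those fixes together, a minimal sketch of the whole thing would look something like this (assuming a valid GL context and a 128x128 texture):

        int width = 128, height = 128;

        // Four bytes per pixel (RGBA), all set to 255 -> opaque white.
        unsigned char* data = new unsigned char[width * height * 4];
        for(int i = 0; i < width * height * 4; i++)
            data[i] = 255;

        GLuint whiteTextureID;
        glGenTextures(1, &whiteTextureID);
        glBindTexture(GL_TEXTURE_2D, whiteTextureID);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, data);

        delete[] data;   // GL keeps its own copy after glTexImage2D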

OK, I’m very sorry. I really hate that the problem turned out to be somewhere else. I’m using a special function just to render textures, and there was a conflict between that function and another one: the states set in one were affecting the other, so if one was called before the other, whatever the second one tried to do didn’t work. Thanks for the help, and sorry for wasting your time.

P.S. The code above was one of my random attempts, but I did try your first solution and it works just fine. Thanks!

-r

np, it happens :)

Just for the record: another reason it was failing was that I was creating the texture before OpenGL was completely initialized. Noob mistakes :o.
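In other words, the window/context has to exist before any gl* calls. A rough sketch of the right order (GLUT is used here purely as an example; your windowing setup may differ):

#include <GL/glut.h>

int main(int argc, char** argv)
{
    // Create the window/context first (GLUT is just an illustrative choice).
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
    glutCreateWindow("white texture test");

    // Only now is it safe to create GL objects.
    GLuint whiteTextureID;
    glGenTextures(1, &whiteTextureID);

    // ... upload the texture data, set up rendering, enter the main loop, etc.
    return 0;
}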

-r