Part of the Khronos Group
OpenGL.org


Thread: How do I make my texture 16/32 bit in OGL?

  1. #1
    Junior Member Regular Contributor
    Join Date
    Oct 2002
    Posts
    164

    How do I make my texture 16/32 bit in OGL?

    Hello.
    I was wondering how to express the difference between 32-bit and 16-bit textures in OpenGL.
    I only load TGAs, and I think all of my textures are 24 bit. How could I make them 16 bit for OpenGL at load time?
    Does it have to do with the way I read the data, or with the way I use constants like GL_RGB (personally I think those only control the channels)?

    What I want to do in the end is to give the user the ability to force textures into 16bit if they want to.

    Thanks for any help!

    [This message has been edited by B_old (edited 03-15-2003).]
    I may not be good looking but I sure am dumb.

  2. #2
    Senior Member OpenGL Pro
    Join Date
    Jun 2000
    Location
    Shreveport, LA, USA
    Posts
    1,505

    Re: How do I make my texture 16/32 bit in OGL?

    Select a 16-bit internal format in your glTexImage2D call, like GL_RGBA4, GL_RGB5, or GL_RGB5_A1.
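    A minimal sketch of what that might look like, assuming the texture data is already loaded as 24-bit RGB. The helper function and its name are hypothetical, and the sized-format enum values are copied from the GL headers only so the sketch compiles without an OpenGL context:

    ```c
    #include <assert.h>

    /* Sized internal-format enums as defined in GL/gl.h; duplicated
     * here only so this sketch builds without the OpenGL headers. */
    #define GL_RGB5     0x8050
    #define GL_RGB8     0x8051
    #define GL_RGBA4    0x8056
    #define GL_RGB5_A1  0x8057
    #define GL_RGBA8    0x8058

    /* Hypothetical helper: map a user-chosen bit depth to a sized
     * internal format to pass as glTexImage2D's third argument. */
    static unsigned int choose_internal_format(int bits, int has_alpha)
    {
        if (bits == 16)
            return has_alpha ? GL_RGB5_A1 : GL_RGB5;
        return has_alpha ? GL_RGBA8 : GL_RGB8;
    }

    /* With a valid GL context, the upload itself would then be:
     *
     * glTexImage2D(GL_TEXTURE_2D, 0, choose_internal_format(16, 0),
     *              width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);
     */
    int main(void)
    {
        assert(choose_internal_format(16, 0) == GL_RGB5);
        assert(choose_internal_format(16, 1) == GL_RGB5_A1);
        assert(choose_internal_format(32, 1) == GL_RGBA8);
        return 0;
    }
    ```

    Note that the internal format is only a request; the driver may substitute the closest format the hardware actually supports.
    
    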

  3. #3
    Junior Member Regular Contributor
    Join Date
    Oct 2002
    Posts
    164

    Re: How do I make my texture 16/32 bit in OGL?

    Ah, OK thank you!
    Do you know any good resource to read about this? I'll try the specs now.
    I may not be good looking but I sure am dumb.

  4. #4
    Junior Member Regular Contributor
    Join Date
    Oct 2002
    Posts
    164

    Re: How do I make my texture 16/32 bit in OGL?

    Hello.
    Could it be that the textures are 16 bit by default? I did not notice any difference when forcing them to be 16 bit, but performance did drop when I forced them to be 24 bit.

    BTW, do you have a truly 24/32-bit TGA file?
    I find it very hard to see any difference between my textures at 16 vs. 24 bit.
    Thanks for the help!
    I may not be good looking but I sure am dumb.

  5. #5
    Senior Member OpenGL Guru
    Join Date
    Feb 2000
    Location
    Sweden
    Posts
    2,982

    Re: How do I make my texture 16/32 bit in OGL?

    If the internal format of the texture differs from the format of the frame buffer, a conversion is performed to match the frame buffer's format. When you don't specify any bit depth for the internal format, that is, you use GL_RGB, GL_RGBA and so on, the driver generally uses the same internal format as the frame buffer and no conversion is needed. If your source is 24 bit and the frame buffer is 16 bit, the texture data is generally converted to 16 bit, and vice versa.
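    The 24-to-16-bit conversion described above essentially amounts to dropping the low bits of each channel. A hedged sketch, assuming a common 5:6:5 packing (real drivers may also dither or pick a different layout):

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Pack an 8:8:8 RGB texel into a 5:6:5 16-bit word by discarding
     * the low bits of each channel (no dithering). */
    static uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    int main(void)
    {
        /* Pure white keeps all bits set in every field. */
        assert(rgb888_to_rgb565(255, 255, 255) == 0xFFFF);
        /* Pure red occupies the top 5 bits. */
        assert(rgb888_to_rgb565(255, 0, 0) == 0xF800);
        return 0;
    }
    ```

    The low 3 bits of red and blue (and 2 bits of green) are simply lost, which is why 16-bit textures mainly show banding in smooth gradients rather than in busy detail.
    
    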

    Unless you have a really good reason to explicitly specify the internal format, you should let the driver choose the exact format. This means you should stick to the generic formats like GL_RGBA, and not GL_RGBA4, GL_RGB5_A1 or GL_RGBA8.

  6. #6
    Junior Member Regular Contributor
    Join Date
    Oct 2002
    Posts
    164

    Re: How do I make my texture 16/32 bit in OGL?

    Ah, OK thanks for the hint.
    It's just that I think I saw an option in some games that lets you choose the bit depth for textures. Why would they do that?
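    One likely reason (my speculation, not stated in the thread): 16-bit textures halve texture-memory use, which mattered a great deal on cards of that era. A quick back-of-envelope calculation for a single mipmapped texture:

    ```c
    #include <assert.h>
    #include <stddef.h>

    /* Bytes used by one w x h texture at the given bytes per texel,
     * summed over a full mip chain (roughly 4/3 of the base level). */
    static size_t texture_bytes(size_t w, size_t h, size_t bytes_per_texel)
    {
        size_t total = 0;
        for (;;) {
            total += w * h * bytes_per_texel;
            if (w == 1 && h == 1)
                break;
            if (w > 1) w /= 2;
            if (h > 1) h /= 2;
        }
        return total;
    }

    int main(void)
    {
        /* A 256x256 texture: 32-bit RGBA vs a 16-bit format. */
        size_t full = texture_bytes(256, 256, 4);  /* 349524 bytes */
        size_t half = texture_bytes(256, 256, 2);  /* 174762 bytes */
        assert(half * 2 == full);
        return 0;
    }
    ```

    Across the dozens of textures in a typical level, that factor of two could be the difference between fitting in video memory and swapping over the bus every frame.
    
    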
    I may not be good looking but I sure am dumb.
