Part of the Khronos Group
OpenGL.org



Thread: 8-bits textures

  1. #1
    Intern Contributor
    Join Date
    Feb 2000
    Location
    France
    Posts
    55

    8-bits textures

Do 256-colour textures increase rendering speed (maybe the cards are designed for them instead of 32-bit textures)? And how can I set the palette of a texture? I know how to get it, but I can't manage to give the palette to OpenGL. I saw a glGetPaletteEXT or something like that, but I don't know how to use extensions. Can anyone help me?
    Thanks,
    Antoche

  2. #2
    Intern Contributor
    Join Date
    Feb 2000
    Location
    Breda, Netherlands
    Posts
    58

    Re: 8-bits textures

You can only load RGB/RGBA formatted pictures with OpenGL.
So if you have an 8-bit picture with a palette, a pixel with value 13 takes colour 13 from your palette, which gives you three values, probably in the range 0..63. Scale them up to 0..255, multiplying before dividing so integer arithmetic doesn't truncate to zero:
red   = pal[13].red   * 255 / 63;
green = pal[13].green * 255 / 63;
blue  = pal[13].blue  * 255 / 63;

Hope this helps (and that I got it correct, too),
    John

    [This message has been edited by Sjonny (edited 03-22-2000).]

  3. #3
    Junior Member Regular Contributor fenris's Avatar
    Join Date
    Mar 2000
    Location
    Cincinnati, Ohio USA
    Posts
    129

    Re: 8-bits textures

Another good way to upload 8-bit textures is to expand the texture to 24 bits (RGB triplets) when you read the texture file.
You'll want to read the palette data from the file and store it in a lookup table (unsigned char [256][3]), then for each pixel read from the texture, use the lookup table to get the RGB values and store those in the texture buffer. Then just upload the texture buffer as usual to OpenGL.
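A minimal sketch of that expansion in C (the file-reading side is assumed; `expand_palette` is a name I've made up, with `palette` being the 256×3 table read from the file and `indices` the 8-bit pixel data):

```c
#include <stddef.h>

/* Expand 8-bit palettized pixels into a tightly packed RGB buffer.
 * palette: 256 entries of 3 bytes (R, G, B), already scaled to 0..255.
 * indices: pixel_count palette indices read from the texture file.
 * rgb_out: caller-allocated buffer of pixel_count * 3 bytes. */
void expand_palette(const unsigned char palette[256][3],
                    const unsigned char *indices,
                    unsigned char *rgb_out,
                    size_t pixel_count)
{
    size_t i;
    for (i = 0; i < pixel_count; ++i) {
        const unsigned char *entry = palette[indices[i]];
        rgb_out[i * 3 + 0] = entry[0]; /* red   */
        rgb_out[i * 3 + 1] = entry[1]; /* green */
        rgb_out[i * 3 + 2] = entry[2]; /* blue  */
    }
}

/* Then upload the expanded buffer as usual, e.g.:
 * glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
 *              GL_RGB, GL_UNSIGNED_BYTE, rgb_out); */
```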

  4. #4
    Intern Contributor
    Join Date
    Feb 2000
    Location
    France
    Posts
    55

    Re: 8-bits textures

So 8-bit textures are just a waste of time and color depth. I thought they were faster for 3D cards than 24bpp textures...

  5. #5
    Guest

    Re: 8-bits textures

My perception was that 8-bit textures were good for saving memory and weren't that much slower to display; probably a negligible difference, because the color table is loaded into the GPU's memory. So I guess it depends on how the card handles that.

  6. #6
    Advanced Member Frequent Contributor arekkusu's Avatar
    Join Date
    Nov 2003
    Posts
    782

    Re: 8-bits textures

    There is some bad information above.

OpenGL does support 8-bit textures natively: you can use GL_ALPHA or GL_LUMINANCE for greyscale textures (masks or height maps, for example).

But you are asking about 8-bit paletted colour textures. These are (inexplicably) not supported by core OpenGL. You can convert your 8-bit texture to 24-bit RGB or 32-bit RGBA when you upload it to the GPU with glTexImage2D(), by specifying the appropriate format and internalformat parameters. Go read the man page.

Now, since they are a very useful idea, there are extensions for real paletted textures: GL_EXT_paletted_texture and GL_EXT_shared_texture_palette. However, they only seem to be implemented by NVIDIA, not ATI, so they aren't of much use in a real-world application.

You can also use glPixelMap for some on-the-fly palette conversion, but this is terribly slow. Or, on very recent cards, you can write a fragment program to do texture lookups with colorization.

  7. #7
    Guest

    Re: 8-bits textures

    I have an nvidia card - how do I use these extensions in my code?

  8. #8
    Guest

    Re: 8-bits textures

As with every extension, read the official docs at the OpenGL Extension Registry:
    http://oss.sgi.com/projects/ogl-samp...ed_texture.txt
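Before calling into either extension, you should also check that the driver actually exports it, by searching the string returned by glGetString(GL_EXTENSIONS). A sketch of that check (the helper name `has_extension` is mine; note that a plain strstr() is not enough, since one extension name can be a prefix of another):

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in
 * `extlist` (the string from glGetString(GL_EXTENSIONS)), else 0. */
int has_extension(const char *extlist, const char *name)
{
    const char *p = extlist;
    size_t len = strlen(name);
    while ((p = strstr(p, name)) != NULL) {
        const char *end = p + len;
        int starts_ok = (p == extlist) || (p[-1] == ' ');
        int ends_ok   = (*end == ' ') || (*end == '\0');
        if (starts_ok && ends_ok)
            return 1;
        p = end; /* partial match; keep scanning */
    }
    return 0;
}

/* Usage sketch:
 * if (has_extension((const char *)glGetString(GL_EXTENSIONS),
 *                   "GL_EXT_paletted_texture")) {
 *     // fetch the entry points with wglGetProcAddress (Windows)
 *     // or glXGetProcAddressARB (Linux) before calling them
 * }
 */
```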

  9. #9
    Super Moderator OpenGL Guru dorbie's Avatar
    Join Date
    Jul 2000
    Location
    Bay Area, CA, USA
    Posts
    3,947

    Re: 8-bits textures

8-bit textures used to be commonly supported via the palette extension to save bandwidth and memory, but this has been deprecated. DO NOT USE IT. Some of the best new hardware does not support it.

    To save bandwidth and space today the best approach is to use the compressed texture formats supported in hardware.

    Read this for info on how to do this:
    http://developer.nvidia.com/attach/1506

  10. #10
    Senior Member OpenGL Pro
    Join Date
    Oct 2000
    Location
    Fargo, ND
    Posts
    1,755

    Re: 8-bits textures

You're forgetting another good reason for paletted textures... the age-old art of palette animation... that is, animating an image simply by changing the color of an entry in the palette.
    Deiussum
    Software Engineer and OpenGL enthusiast
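The trick Deiussum describes can be emulated without the extension, too: mutate your CPU-side palette each frame, re-expand the texture, and re-upload it. A small sketch of the classic colour-cycling step, assuming the 256×3 palette layout from the earlier posts (`rotate_palette` is a name I've made up):

```c
#include <string.h>

/* Rotate palette entries [first, first + count) up by one position:
 * entry first+count-1 wraps around to slot first.  Calling this once
 * per frame produces the classic "flowing water" cycling effect. */
void rotate_palette(unsigned char palette[256][3], int first, int count)
{
    unsigned char last[3];
    memcpy(last, palette[first + count - 1], 3);
    /* Shift the run up by one entry; regions overlap, so memmove. */
    memmove(palette[first + 1], palette[first], (size_t)(count - 1) * 3);
    memcpy(palette[first], last, 3);
}
```

With the real GL_EXT_paletted_texture you would instead re-upload just the 256-entry table through the extension's glColorTableEXT entry point, which is far cheaper than re-expanding and re-uploading the whole texture.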
