
Thread: using GL_LUMINANCE16 without loss of data

  1. #1
    Junior Member Newbie
    Join Date: Nov 2002
    Location: Bratislava, Slovakia
    Posts: 18

    using GL_LUMINANCE16 without loss of data

    I'm using a 3D texture to calculate a section of volume data.
    I do something like:
    glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE16, tWidth, tHeight, tDepth, 0, GL_LUMINANCE, GL_SHORT, pImage);

    Then I map this texture onto a single quad and do glReadPixels(0, 0, Vwidth, Vheight, GL_LUMINANCE, GL_SHORT, (GLshort*)result_image);

    The result looks as if the 16-bit data were converted to 8 bits, processed, and then converted back to short.
    Does anyone have an idea how to get the correct data back? I can't use color index mode, because I'm using a pbuffer.

  2. #2
    Senior Member OpenGL Guru
    Join Date: Feb 2000
    Location: Sweden
    Posts: 2,982

    Re: using GL_LUMINANCE16 without loss of data

    Do you have a frame buffer with 16 bits per channel? And are you sure the texture is actually uploaded as a 16-bit luminance texture, not as an 8-bit one? You see, the internal format of a texture is just a hint; you can't be sure you get what you ask for. The only thing you can be sure of is that you get a luminance format, but you have little control over the number of bits.
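
    A minimal sketch of the frame buffer check, assuming a current legacy GL context (the function name here is just for illustration):

        #include <GL/gl.h>
        #include <stdio.h>

        /* Print how many bits per channel the current frame buffer really has.
           With a typical 8-bit-per-channel pbuffer this prints 8, which is
           where the precision is lost even if the texture kept all 16 bits. */
        void printFramebufferDepth(void)
        {
            GLint red, green, blue;
            glGetIntegerv(GL_RED_BITS,   &red);
            glGetIntegerv(GL_GREEN_BITS, &green);
            glGetIntegerv(GL_BLUE_BITS,  &blue);
            printf("frame buffer bits: R%d G%d B%d\n", red, green, blue);
        }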

  3. #3
    Senior Member OpenGL Guru zed's Avatar
    Join Date: Jul 2000
    Location: S41.16.25 E173.16.21
    Posts: 2,407

    Re: using GL_LUMINANCE16 without loss of data

    The internal format you ask for is not necessarily what you get given (if it's important, check afterwards what you actually got).
    IIRC, NVIDIA cards only support about 5 of the 50 or so internal formats, and LUMINANCE16 isn't one of them.
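
    To check afterwards, something along these lines should work (legacy GL; the texture must still be bound to GL_TEXTURE_3D, and the function name is just for illustration):

        #include <GL/gl.h>
        #include <stdio.h>

        /* Ask the driver what it actually allocated for level 0 of the
           currently bound 3D texture. */
        void printTextureFormat(void)
        {
            GLint internalFormat = 0, lumBits = 0;
            glGetTexLevelParameteriv(GL_TEXTURE_3D, 0,
                                     GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);
            glGetTexLevelParameteriv(GL_TEXTURE_3D, 0,
                                     GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
            /* lumBits == 8 means LUMINANCE16 was silently downgraded */
            printf("internal format: 0x%x, luminance bits: %d\n",
                   internalFormat, lumBits);
        }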

  4. #4
    Junior Member Newbie
    Join Date: Nov 2002
    Location: Bratislava, Slovakia
    Posts: 18

    Re: using GL_LUMINANCE16 without loss of data

    And how can I check which internal formats are supported by my card?
    In the PixelFormatDescriptor I can only use 8 bits per color, so my frame buffer cannot use 16 bits.
    The result looks like it has some overflow.
    I tried distributing the data into 2 bytes, like 8 bits in R, 8 bits in G, and B = 0, using the 3D texture in RGB mode and then rebuilding the 16-bit data from the result. It was strange, but still better than LUMINANCE16. So I really don't know what to do, because I just need to set correct 16-bit data, interpolate it, and get 16-bit data back.
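
    For reference, a sketch of that byte-split approach (the helper names are hypothetical). One caveat: the texture filter interpolates R and G independently, so the low byte wraps around between samples instead of carrying into the high byte, which would explain the strange results.

        /* Split 16-bit samples: R = high byte, G = low byte, B unused. */
        void split16(const unsigned short *src, unsigned char *dst, int n)
        {
            int i;
            for (i = 0; i < n; ++i) {
                dst[3*i + 0] = (unsigned char)(src[i] >> 8);   /* R: high byte */
                dst[3*i + 1] = (unsigned char)(src[i] & 0xFF); /* G: low byte  */
                dst[3*i + 2] = 0;                              /* B: unused    */
            }
        }

        /* Recombine after glReadPixels(..., GL_RGB, GL_UNSIGNED_BYTE, ...). */
        void merge16(const unsigned char *src, unsigned short *dst, int n)
        {
            int i;
            for (i = 0; i < n; ++i)
                dst[i] = (unsigned short)((src[3*i + 0] << 8) | src[3*i + 1]);
        }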
