
Thread: 64 bit unsigned integer image buffer

  1. #1
    Newbie
    Join Date: Feb 2018
    Posts: 1

    64 bit unsigned integer image buffer

    Is there a way to create a 64-bit unsigned integer image buffer? For a 32-bit uint image buffer, we do the following:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_R32UI, WIDTH, HEIGHT, 0, GL_RED_INTEGER, GL_UNSIGNED_INT, NULL);

    I want to use something like GL_R64UI (internal format) with GL_RED_INTEGER (format). I believe there is a way using extensions.
    But so far I've had no success.

    Any help is appreciated.

    TIA

  2. #2
    Senior Member OpenGL Lord
    Join Date: May 2009
    Posts: 6,008
    There are many OpenGL extensions that add 64-bit support for various parts of the pipeline. Not one of them adds support for a 64-bit image format. There is no `GL_R64UI` enumerator in OpenGL.

    You can use 64-bit types with UBOs and SSBOs, but that's about it.
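
    For reference, a minimal compute-shader sketch of the SSBO route (assuming the driver exposes GL_ARB_gpu_shader_int64; the block name, binding and workgroup size here are arbitrary):
    Code :
    #version 450
    #extension GL_ARB_gpu_shader_int64 : require

    layout(local_size_x = 64) in;

    // 64-bit integer types are allowed in buffer-backed blocks with this
    // extension, even though no 64-bit texture/image format exists.
    layout(std430, binding = 0) buffer Values {
        uint64_t data[];
    };

    void main() {
        uint i = gl_GlobalInvocationID.x;
        if (i >= uint(data.length()))
            return;
        data[i] += uint64_t(1u);   // plain 64-bit arithmetic on the SSBO element
    }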

  3. #3
    Senior Member OpenGL Guru
    Join Date: Jun 2013
    Posts: 2,829
    Quote Originally Posted by AjinkyaGavane
    I want to use something like GL_R64UI (internal format) with GL_RED_INTEGER (format). I believe there is a way using extensions.
    As Alfonse says, none of the extensions provide a 64-bit integer texture format.

    The closest thing would be to split the integer into two 32-bit halves and use GL_RG32UI. Provided that the implementation supports the ARB_gpu_shader_int64 extension, you can then reconstruct the 64-bit value in the shader with e.g.
    Code :
    // tex is assumed to be a usampler2D bound to the GL_RG32UI texture
    uvec2 pair = texture(tex, uv).xy;
    // .x holds the low 32 bits, .y the high 32 bits
    uint64_t value = (uint64_t(pair.y) << 32) | pair.x;
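
    On the application side, the matching upload could look roughly like this (untested sketch; WIDTH, HEIGHT and the GLuint64 array "values" are placeholders for your own data):
    Code :
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Integer textures are only complete with NEAREST-style filtering. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RG32UI, WIDTH, HEIGHT, 0,
                 GL_RG_INTEGER, GL_UNSIGNED_INT, NULL);

    /* Split each 64-bit value so .x holds the low 32 bits and .y the high
       32 bits, matching the reconstruction in the shader above. */
    GLuint *pixels = malloc((size_t)WIDTH * HEIGHT * 2 * sizeof(GLuint));
    for (size_t i = 0; i < (size_t)WIDTH * HEIGHT; ++i) {
        pixels[2 * i + 0] = (GLuint)(values[i] & 0xFFFFFFFFu); /* low word  */
        pixels[2 * i + 1] = (GLuint)(values[i] >> 32);         /* high word */
    }
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, WIDTH, HEIGHT,
                    GL_RG_INTEGER, GL_UNSIGNED_INT, pixels);
    free(pixels);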
