
Thread: Cooperating glTextureStorage3D, glTextureSubImage3D and usampler2DArray

  1. #1
     Junior Member Newbie

    Cooperating glTextureStorage3D, glTextureSubImage3D and usampler2DArray

    Hi! I am trying to upload a 2D texture array to the GPU, but I get GL_INVALID_OPERATION from the glTextureSubImage3D call in the code below.

    The fragment shader has this sampler definition:

    Code :
    layout (binding = 0) uniform usampler2DArray tex_object_x;

    This is the OpenGL code:

    Code :
    const GLsizei levels = 1, width = sampler.width, height = sampler.height, depth = sampler.depth;
     
    glTextureStorage3D(texture_array_buffers[0], levels, GL_RGBA8UI, width, height, depth);
     
    // Fails with GL_INVALID_OPERATION (but not with GL_RGBA instead of GL_RGBA8UI in the operation above)
    for (int sliceindex = 0; sliceindex < depth; sliceindex++)
      glTextureSubImage3D(
        /* GLenum target         */ texture_array_buffers[0],
        /* GLint level           */ 0,
        /* GLint xoffset         */ 0,
        /* GLint yoffset         */ 0,
        /* GLint zoffset         */ sliceindex,
        /* GLsizei width         */ width,
        /* GLsizei height        */ height,
        /* GLsizei depth         */ 1,
        /* GLenum format         */ GL_RGBA,
        /* GLenum type           */ GL_UNSIGNED_BYTE,
        /* const GLvoid * pixels */ &(sampler.slice_z(sliceindex)[0]));

    If I use GL_RGBA instead of GL_RGBA8UI, there is no error, but the data is transferred incorrectly. I suppose this is because usampler2DArray is an unsigned integer sampler, while GL_RGBA8 in the call to glTextureStorage3D is a normalized (non-integer) format. That's why I am trying to use GL_RGBA8UI.

    Please help.

  2. #2
     Alfonse Reinheart, Senior Member OpenGL Lord
    If I use GL_RGBA instead of GL_RGBA8UI, there is no error
    That's not possible. All of the `Storage` functions explicitly require sized internal formats. You will get `GL_INVALID_ENUM` otherwise.

    Also:

    Code :
    /* GLenum target         */ texture_array_buffers[0],
    That's a `GLuint`, not a `GLenum`.

    BTW, you're probably misidentifying the source of the `GL_INVALID_OPERATION` error. It's likely due to:

    Code :
    /* GLenum format         */ GL_RGBA,
    That's wrong: when uploading to integer textures, you must use the `_INTEGER` pixel transfer formats, so you would need `GL_RGBA_INTEGER` here.
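
    In other words (just a sketch, reusing the variables from your snippet), the upload would look like:

    Code :
    // Same variables as in the original post; the only change is the pixel
    // transfer format. GL_RGBA_INTEGER marks the client data as raw integer
    // components (unsigned here because of GL_UNSIGNED_BYTE), which matches
    // the GL_RGBA8UI storage.
    glTextureSubImage3D(
      /* GLuint texture         */ texture_array_buffers[0],
      /* GLint level            */ 0,
      /* GLint xoffset          */ 0,
      /* GLint yoffset          */ 0,
      /* GLint zoffset          */ sliceindex,
      /* GLsizei width          */ width,
      /* GLsizei height         */ height,
      /* GLsizei depth          */ 1,
      /* GLenum format          */ GL_RGBA_INTEGER,
      /* GLenum type            */ GL_UNSIGNED_BYTE,
      /* const GLvoid * pixels */ &(sampler.slice_z(sliceindex)[0]));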

  3. #3
     Junior Member Newbie
    Quote Originally Posted by Alfonse Reinheart
    That's not possible. All of the `Storage` functions explicitly require sized internal formats. You will get `GL_INVALID_ENUM` otherwise.
    My bad, I meant to write GL_RGBA8 there.

    Quote Originally Posted by Alfonse Reinheart
    When uploading to integer textures, you must use `_INTEGER` pixel transfer formats. So you would need `GL_RGBA_INTEGER`.
    That solved the problem :) Makes me wonder why _INTEGER is not listed or mentioned here, but the pixel transfer page you provided is very helpful. Thanks a lot, Alfonse!
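
    For completeness, the working combination now looks roughly like this (same texture name and sampler object as in my first post; the glBindTextureUnit call is only there to show which unit the shader's binding = 0 refers to):

    Code :
    // Immutable storage with an unsigned integer sized internal format.
    glTextureStorage3D(texture_array_buffers[0], levels, GL_RGBA8UI, width, height, depth);

    // Per-slice uploads with the matching _INTEGER pixel transfer format.
    for (int sliceindex = 0; sliceindex < depth; sliceindex++)
      glTextureSubImage3D(texture_array_buffers[0], 0, 0, 0, sliceindex,
                          width, height, 1,
                          GL_RGBA_INTEGER, GL_UNSIGNED_BYTE,
                          &(sampler.slice_z(sliceindex)[0]));

    // Bind to texture unit 0 so the usampler2DArray declared with
    // layout (binding = 0) reads from this texture.
    glBindTextureUnit(0, texture_array_buffers[0]);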
