Thread: GL_BYTE vertex attribute giving wrong data

  1. #1
    Regular Contributor (Kyoto, joined Apr 2006, 129 posts)


    I have a debug text rendering setup which feeds char data to a vertex/geometry/fragment shader.

    On my NVIDIA GTX 460, feeding the char* data using this attribute description:
    Code :
    glVertexAttribIPointer(TEXT_LOCATION, 1, GL_BYTE, 1, 0);
    with the shader input declared as
    Code :
    layout(location = TEXT_LOCATION) in int inCharacter;
    works as expected, but on my ATI card the contents of inCharacter always seem to be 0.
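
    Roughly, the full setup looks like this (the buffer, VAO, and count names here are placeholders, not my exact code):
    Code :
    // Sketch of the byte path: upload raw chars and expose them as a single integer attribute.
    GLuint textVbo, textVao;                                   // placeholder names
    glGenVertexArrays(1, &textVao);
    glGenBuffers(1, &textVbo);
    glBindVertexArray(textVao);
    glBindBuffer(GL_ARRAY_BUFFER, textVbo);
    glBufferData(GL_ARRAY_BUFFER, charCount, charData, GL_DYNAMIC_DRAW); // charData is const char*
    glVertexAttribIPointer(TEXT_LOCATION, 1, GL_BYTE, 1, 0);  // one signed byte per vertex, tightly packed
    glEnableVertexAttribArray(TEXT_LOCATION);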

    As a test I changed the vertex attribute to
    Code :
    glVertexAttribIPointer(TEXT_LOCATION, 1, GL_INT, 4, 0);

    and manually converted the const char* data to const int* data each time I update the VBO, and that works as expected.
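
    The conversion is just a widening copy before the buffer update, roughly like this (the temporary array and its size are placeholders):
    Code :
    // Widen each char to a full int before uploading.
    int widened[MAX_TEXT_CHARS];                     // MAX_TEXT_CHARS is a placeholder bound
    for (int i = 0; i < charCount; ++i)
        widened[i] = (int)charData[i];
    glBindBuffer(GL_ARRAY_BUFFER, textVbo);
    glBufferSubData(GL_ARRAY_BUFFER, 0, charCount * sizeof(int), widened);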

    Is it OK to feed a vertex shader data in the way I originally intended?
    Is there likely to be some small detail I have missed that would make it not give the correct data on an ATI card?
    This is with GL and GLSL 4.2 on both machines; the machine with the problem has an AMD A10 APU.

    Thank you for any help, James

  2. #2
    Nowhere-01, Regular Contributor (Novosibirsk, joined Feb 2011, 251 posts)
    I'm not sure, but according to the spec, GL_BYTE/GL_UNSIGNED_BYTE are valid types for functions like glVertexAttribPointer and glDrawElements. However, I recently ran into a very similar issue: they don't seem to work on AMD cards. I was trying to fill a GL_ELEMENT_ARRAY_BUFFER (for a simple quad of two triangles) with GL_UNSIGNED_BYTE indices and draw it with glDrawElements; it failed with no error. Maybe there is a special trick to make it work, or maybe AMD is actually following the spec here and I was doing something wrong. I didn't look into it much, because I didn't actually need to use bytes.
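
    From memory it was roughly this (the names are placeholders, not my exact code):
    Code :
    // Two-triangle quad drawn with GL_UNSIGNED_BYTE indices; drew nothing on my AMD card, with no GL error.
    const unsigned char indices[6] = { 0, 1, 2, 2, 1, 3 };
    GLuint ibo;
    glGenBuffers(1, &ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, 0);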

  3. #3
    Regular Contributor (Kyoto, joined Apr 2006, 129 posts)
    So I am not alone; thanks, Nowhere-01.

    I had a look through the spec and couldn't see any problems. Maybe this is a driver issue?

  4. #4
    Regular Contributor (Kyoto, joined Apr 2006, 129 posts)
    I think I have got closer to the cause of the problem.
    I wrote this rather silly vertex shader to test things out:

    Code :
    //in the shader
    #version 420 core
    layout(location = 5) in uint inByte; // per-vertex byte, read as an unsigned integer
    out vec3 colour;
    void main()
    {
        if(inByte==0)
        {
            colour = vec3(1,0,0);
            gl_Position = vec4(-0.5, -0.5, 0 ,1.0);
        }
        else if(inByte==1)
        {
            colour = vec3(0,1,0);
            gl_Position = vec4(0.5, -0.5, 0 ,1.0);
        }
        else
        {
            colour = vec3(0,0,1);
            gl_Position = vec4(0.0, 0.5, 0 ,1.0);
        }
    }
    ...
    //in the C program
    glBindVertexArray(vao);
    glBindBuffer(GL_ARRAY_BUFFER, ab_byte);
    glVertexAttribIPointer(5, 1, GL_UNSIGNED_BYTE, sizeof(unsigned char), BUFFER_OFFSET(0));
    glEnableVertexAttribArray(5);

    I call glDrawArrays(GL_TRIANGLES, 0, 3); with a single VBO filled with const unsigned char [] = {0, 1, 2}; and it renders as I expect on NVIDIA, but I get nothing on ATI.
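
    For completeness, the test buffer is filled and drawn roughly like this (ab_byte and vao are the same objects as above):
    Code :
    // Upload the three test bytes and draw them as one triangle.
    const unsigned char verts[3] = { 0, 1, 2 };
    glBindBuffer(GL_ARRAY_BUFFER, ab_byte);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);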

    However, if I create a second dummy VBO filled with const float [] = {0, 0, 0} and set up the VAO like this:

    Code :
    glBindVertexArray(vao);
    glBindBuffer(GL_ARRAY_BUFFER, ab_byte);
    glVertexAttribIPointer(5, 1, GL_UNSIGNED_BYTE, sizeof(unsigned char), BUFFER_OFFSET(0));
    //dummy buffer
    glBindBuffer(GL_ARRAY_BUFFER, ab);
    glVertexAttribPointer(0, 1, GL_FLOAT, GL_FALSE, sizeof(float), BUFFER_OFFSET(0));
    glEnableVertexAttribArray(5);
    glEnableVertexAttribArray(0);
    it then also renders on the ATI card, even though I am not using the dummy VBO data in the shader at all.
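
    The dummy buffer itself is created and filled just once, something like this:
    Code :
    // Dummy float buffer; its contents are never read by the shader.
    const float dummy[3] = { 0.0f, 0.0f, 0.0f };
    glGenBuffers(1, &ab);
    glBindBuffer(GL_ARRAY_BUFFER, ab);
    glBufferData(GL_ARRAY_BUFFER, sizeof(dummy), dummy, GL_STATIC_DRAW);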

    So it seems like my ATI card doesn't like a VAO whose only enabled attribute is byte data.
