OpenGL.org

Thread: Vertex shader input not getting passed out correctly

  1. #1
    Junior Member Newbie
    Join Date
    Mar 2018
    Posts
    4

    Vertex shader input not getting passed out correctly

    I'm trying to use texture arrays in my shaders, and I'm having this really weird issue.
    Here are my shaders:

    Vertex Shader:
    Code :
    #version 430 core
     
    layout (location = 0) in vec3 aPos;
    layout (location = 1) in vec2 aTexCoord;
    layout (location = 2) in mat4 aModel;
    layout (location = 6) in int aTexIndex;
     
    out vec2 texCoord;
    flat out int texIndex;
     
    uniform int useInputModel;
    uniform mat4 model;
    uniform mat4 view;
    uniform mat4 projection;
     
    void main() {
    	mat4 transform = mat4(0.0); // must be initialized; += on an uninitialized local is undefined in GLSL
    	transform += aModel * useInputModel;
    	transform += model * (1 - useInputModel);
    	gl_Position = projection * view * transform * vec4(aPos, 1.0);
    	texCoord = aTexCoord;
    	texIndex = aTexIndex;
    }

    Fragment Shader:
    Code :
    #version 430 core
     
    out vec4 fragColor;
     
    in vec2 texCoord;
    flat in int texIndex;
     
    uniform int useArray;
     
    layout (binding = 0) uniform sampler2D useTexture;
    layout (binding = 1) uniform sampler2DArray textures;
     
    uniform vec2 texOffset;
    uniform vec2 texScale;
     
    void main() {
    	fragColor = texture(useTexture, (texCoord * texScale) + texOffset) * (1-useArray)
    			  + texture(textures, vec3((texCoord * texScale) + texOffset, texIndex)) * useArray;
    }

    For some reason, the aTexIndex attribute is not getting passed to the fragment shader correctly. I've tested a few values; here are the results:

    aTexIndex    texIndex
    1            1065353216
    2            1073741824
    3            1077936128
    4            1082130432

    I am checking the values with RenderDoc.

    Edit:
    I believe those numbers are actually the binary floating point representation of the values, forced to output as integers.
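    That guess is easy to check outside OpenGL: reinterpreting the bit patterns of the floats 1.0 through 4.0 as 32-bit integers reproduces exactly the numbers in the table above. A minimal C sketch (not from the thread, just an illustration of the reinterpretation):

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Reinterpret a float's IEEE 754 bit pattern as a 32-bit integer --
       effectively what the shader's `in int` receives when the attribute
       data has been converted to float on upload. */
    static int32_t float_bits(float f) {
        int32_t bits;
        memcpy(&bits, &f, sizeof bits); /* type-pun safely via memcpy */
        return bits;
    }

    int main(void) {
        for (int i = 1; i <= 4; i++)
            printf("%d -> %d\n", i, float_bits((float)i));
        /* 1 -> 1065353216, 2 -> 1073741824,
           3 -> 1077936128, 4 -> 1082130432 */
        return 0;
    }
    ```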
    Last edited by Saftur; 04-03-2018 at 06:54 PM.

  2. #2
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    3,007
    Quote Originally Posted by Saftur View Post
    I believe those numbers are actually the binary floating point representation of the values, forced to output as integers.
    That is correct. My first guess is that you're using glVertexAttribPointer() when you should be using glVertexAttribIPointer() for an integer attribute.
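    For reference, the difference looks roughly like this in host code. This is a sketch, not the poster's actual code: attribute index 6 matches aTexIndex's layout location in the shader, but `stride` and `offset` are placeholders for whatever the real vertex layout uses.

    ```c
    /* Placeholder layout values; substitute the actual vertex format. */
    GLsizei stride = 0;          /* bytes per vertex */
    const void *offset = 0;      /* byte offset of aTexIndex */

    /* Wrong for an int input: this path converts the data to floats,
       so the shader's `in int` receives raw float bit patterns. */
    glVertexAttribPointer(6, 1, GL_INT, GL_FALSE, stride, offset);

    /* Correct: the "I" variant delivers the data to the shader as integers. */
    glVertexAttribIPointer(6, 1, GL_INT, stride, offset);
    ```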

  3. #3
    Junior Member Newbie
    Join Date
    Mar 2018
    Posts
    4
    Quote Originally Posted by GClements View Post
    That is correct. My first guess is that you're using glVertexAttribPointer() when you should be using glVertexAttribIPointer() for an integer attribute.
    I just got this answer from someone I know IRL. I didn't even know there was a special function for integers; since glVertexAttribPointer takes a type parameter, I assumed it was the only function needed. I'm completely new to OpenGL.
