Hello all,
I’ve been using OpenGL since 1.0, a long time ago, but last year I started porting my OGL 2.0 code to OGL 3.2.
I have an Athlon X2 4800+ and a GeForce 8600 GT, with the latest driver installed.
Everything seems to be working fine (FBOs, vertex buffers), but there is one strange thing that makes my app crash every time:
I have a stream vertex buffer that I upload the 2D interface data to every frame, in a CPU-friendly format to avoid a lot of conversions to float in my code. The vertex format is:
typedef struct {
    unsigned int posx, posy;             /* offset 0, 8 bytes */
    unsigned short texCoordx, texCoordy; /* offset 8, 4 bytes */
    unsigned char color[4];              /* offset 12, 4 bytes */
} vertex2d;                              /* 16 bytes per vertex */
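The buffer itself is created once at init, roughly like this (a simplified sketch of my setup code):

GLuint vbo;
glGenBuffers( 1, &vbo );
glBindBuffer( GL_ARRAY_BUFFER, vbo );
// Fixed-size stream buffer, refilled with fresh data every frame.
glBufferData( GL_ARRAY_BUFFER, 2048 * sizeof( vertex2d ), NULL, GL_STREAM_DRAW );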
The vertex buffer is 2048 * sizeof( vertex2d ) bytes, and I have a loop that draws in blocks whenever the number of vertices exceeds that 2048 limit (which I consider a very big buffer, since I’m only using 100 to 200 vertices for now).
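The block loop is basically this sketch (names simplified; GL_TRIANGLES stands in for whatever primitive the widget actually uses):

// vertices = CPU-side vertex2d array, totalVertices = its length
const int kMaxVertices = 2048;
for( int first = 0; first < totalVertices; first += kMaxVertices ) {
    int count = totalVertices - first;
    if( count > kMaxVertices )
        count = kMaxVertices;
    // Upload this block into the stream buffer, then draw it.
    glBufferSubData( GL_ARRAY_BUFFER, 0, count * sizeof( vertex2d ), vertices + first );
    glDrawArrays( GL_TRIANGLES, 0, count );
}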
I use a very simple vertex/pixel shader pair: the vertex shader multiplies the position by an ortho matrix, and the pixel shader fetches the texel and modulates it with the vertex color.
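From memory, the shaders are essentially this (attribute/uniform names may differ slightly from my real code; I bind kVertexIndex etc. with glBindAttribLocation before linking):

static const char* vs =
    "#version 150\n"
    "uniform mat4 ortho;\n"
    "in vec2 position;\n"
    "in vec2 texCoord;\n"
    "in vec4 color;\n"
    "out vec2 vTexCoord;\n"
    "out vec4 vColor;\n"
    "void main() {\n"
    "    gl_Position = ortho * vec4( position, 0.0, 1.0 );\n"
    "    vTexCoord = texCoord;\n"
    "    vColor = color;\n"
    "}\n";

static const char* fs =
    "#version 150\n"
    "uniform sampler2D tex;\n"
    "in vec2 vTexCoord;\n"
    "in vec4 vColor;\n"
    "out vec4 fragColor;\n"
    "void main() {\n"
    "    fragColor = texture( tex, vTexCoord ) * vColor;\n"
    "}\n";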
This is what I use to render:
// vbo = vertex buffer
// tex = texture for the widget
// kVertexIndex = 0
// kTexCoordIndex = 2
// kColorIndex = 3
glBindBuffer( GL_ARRAY_BUFFER, vbo );
//glEnableVertexAttribArray( kVertexIndex );
glVertexAttribPointer( kVertexIndex, 2, GL_UNSIGNED_INT, GL_FALSE, sizeof( vertex2d ), nullptr ); // posx/posy at offset 0
//glEnableVertexAttribArray( kColorIndex );
glVertexAttribPointer( kColorIndex, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof( vertex2d ), ( void* )12 ); // color at offset 12
//glEnableVertexAttribArray( kTexCoordIndex );
//glActiveTexture( GL_TEXTURE0 );
glVertexAttribPointer( kTexCoordIndex, 2, GL_UNSIGNED_SHORT, GL_TRUE, sizeof( vertex2d ), ( void* )8 ); // texCoordx/y at offset 8
There is no index buffer bound, and none of these functions returns an error.
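By "returns no error" I mean I drain the error queue after each call, roughly:

GLenum err;
while( ( err = glGetError() ) != GL_NO_ERROR )
    printf( "GL error: 0x%x\n", err ); /* with <stdio.h> */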
If I draw the vertices with glDrawArrays as above (enables commented out), everything works except that the color is not modulated (only the texture color is shown).
If I instead enable the vertex attrib arrays, the kColorIndex call crashes my app.
The color attribute is used in the shader, and glGetAttribLocation returns a valid index for it.
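The check is nothing fancy, roughly this ("color" being the attribute name in my shader, program my linked program object):

GLint loc = glGetAttribLocation( program, "color" );
// -1 would mean the attribute is inactive or optimized out; I get a valid index here.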
I think I’ve covered all the points where there could be a problem or a wrong usage of the API; I don’t know where else to look…
Another thing that confuses me is glEnableVertexAttribArray(). The spec says we need to enable and disable the attribute indices, but I can render fine without ever calling this function; maybe the driver knows which indices are enabled for a given shader at any time.
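For reference, my understanding of the pattern the spec asks for is this sketch (GL_TRIANGLES and vertexCount are placeholders):

glBindBuffer( GL_ARRAY_BUFFER, vbo );
glEnableVertexAttribArray( kVertexIndex );
glEnableVertexAttribArray( kTexCoordIndex );
glEnableVertexAttribArray( kColorIndex ); // this is the call that crashes for me
glVertexAttribPointer( kVertexIndex, 2, GL_UNSIGNED_INT, GL_FALSE, sizeof( vertex2d ), nullptr );
glVertexAttribPointer( kTexCoordIndex, 2, GL_UNSIGNED_SHORT, GL_TRUE, sizeof( vertex2d ), ( void* )8 );
glVertexAttribPointer( kColorIndex, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof( vertex2d ), ( void* )12 );
glDrawArrays( GL_TRIANGLES, 0, vertexCount );
glDisableVertexAttribArray( kVertexIndex );
glDisableVertexAttribArray( kTexCoordIndex );
glDisableVertexAttribArray( kColorIndex );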
Any ideas?
Thank you