Boreal

03-05-2013, 02:53 AM

So I have recently been reading through the Wikibook on Modern OpenGL, and am looking to update my OpenGL code to conform to the more modern standards. Everything goes well until I hit the part where I need to send three matrices (projection, view, model) to the shader; my test object (a square, rather than a triangle) disappears.

My vertex shader looks like this:

attribute vec3 coord3d;
attribute vec3 v_color;
varying vec3 f_color;

uniform mat4 projection;
uniform mat4 model;
uniform mat4 view;

void main(void)
{
    gl_Position = projection * view * model * vec4(coord3d, 1.0);
    f_color = v_color;
}

And my fragment shader like this:

varying vec3 f_color;
uniform float fade = 0.1;

void main(void)
{
    gl_FragColor = vec4(f_color.x, f_color.y, f_color.z, fade);
}

Note that I have also tried P*M*V, since sources conflict on which order is correct. Here is how I render my test square. m_Orientation is a quaternion that stores the orientation. Upon translation, I simply set the matrix's elements 12, 13 and 14 to the X, Y, and Z coordinates of the object's position, respectively.

void TestRect::Cycle()
{
    m_Orientation.RotationMatrix(m_RotationMatrix);
    glUniformMatrix4fv(ModelUniform, 1, GL_FALSE, m_RotationMatrix);

    // Enable alpha blending
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glUniform1f(FadeUniform, 0.5);

    glEnableVertexAttribArray(CoordinateAttribute);
    glEnableVertexAttribArray(ColorAttribute);
    glBindBuffer(GL_ARRAY_BUFFER, m_VertexBuffer);
    glVertexAttribPointer(CoordinateAttribute, 3, GL_FLOAT, GL_FALSE,
                          6 * sizeof(GLfloat), 0);
    glVertexAttribPointer(ColorAttribute, 3, GL_FLOAT, GL_FALSE,
                          6 * sizeof(GLfloat),
                          (GLvoid*) (3 * sizeof(GLfloat)));

    glDrawArrays(GL_TRIANGLES, 0, 6);

    glDisableVertexAttribArray(ColorAttribute);
    glDisableVertexAttribArray(CoordinateAttribute);
    glDisable(GL_BLEND);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
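As a sanity check on the convention above (translation in elements 12, 13 and 14 implies column-major storage and matrix-times-column-vector multiplication), here is a minimal CPU-side sketch with no GL calls; MulPoint is just an illustrative helper, not part of my actual code:

```cpp
#include <cassert>

// Multiply a column-major 4x4 matrix by a column vector (x, y, z, w).
// In this layout, m[12..14] is the translation, which matches how I
// write the object's position into the model matrix.
void MulPoint(const float* m, const float* p, float* out)
{
    for (int row = 0; row < 4; ++row)
        out[row] = m[0*4 + row] * p[0] + m[1*4 + row] * p[1]
                 + m[2*4 + row] * p[2] + m[3*4 + row] * p[3];
}
```

With an identity matrix carrying a translation of (2, 3, 4) in elements 12-14, the point (1, 1, 1, 1) should come out as (3, 4, 5, 1).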

Now my previous system, which uses the apparently deprecated ftransform() as well as the definitely deprecated glEnableClientState(), works fine. I know I am loading and linking the shaders properly, since I have done so before in the same program.

Using only the model matrix, and not multiplying by projection or view, things work fine; I can rotate the square along any axis and translate it along the X and Y axes (Z-axis translations make it disappear, presumably because there is no projection calculated). So I can only assume the model matrix is correct. I update it every time the model is drawn, like this:

glUniformMatrix4fv(ModelUniform, 1, GL_FALSE, TransformMatrix);

The view matrix comes from my camera class, which works fine under the old fixed-function pipeline when I call glMultMatrixf(Camera.GetInvertedMatrix()). I assume that if the matrix works with glMultMatrixf(), the same matrix should also work as the view matrix passed to the shader, right? In the new approach I update the view-matrix uniform once per frame, like this:

glUniformMatrix4fv(m_ViewUniform, 1, GL_FALSE, Camera.GetInvertedMatrix());
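For context, GetInvertedMatrix() is supposed to return the inverse of the camera's world transform. Assuming that transform is rigid (rotation plus translation, column-major), the inverse reduces to transposing the rotation block and rotating-and-negating the translation. A stand-alone sketch of that idea (InvertRigid is a hypothetical stand-in, not my actual camera code):

```cpp
// Invert a column-major rigid transform (rotation + translation).
// The inverse rotation is the transpose R^T; the inverse translation
// is -R^T * t. This only holds for rigid transforms (no scale/shear).
void InvertRigid(const float* m, float* out)
{
    // Transpose the upper-left 3x3 rotation block.
    for (int c = 0; c < 3; ++c)
        for (int r = 0; r < 3; ++r)
            out[c*4 + r] = m[r*4 + c];
    // New translation: -R^T * t, where t = m[12..14].
    for (int r = 0; r < 3; ++r)
        out[12 + r] = -(out[0*4 + r] * m[12] + out[1*4 + r] * m[13]
                      + out[2*4 + r] * m[14]);
    out[3] = out[7] = out[11] = 0.0f;
    out[15] = 1.0f;
}
```

For a camera sitting at (5, -2, 1) with no rotation, the inverted matrix should carry the translation (-5, 2, -1).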

Finally, I used to use gluPerspective() for the projection, but since I am now supposed to supply my own matrices, I needed my own equivalent. I implemented the projection matrix like this:

void Game::SetPerspective(float FieldOfView, float Aspect, float zNear, float zFar, GLfloat* Matrix)
{
    float xyMax  = zNear * tan(FieldOfView * 0.5 * PI/180);
    float yMin   = -xyMax;
    float xMin   = -xyMax;
    float Width  = xyMax - xMin;
    float Height = xyMax - xMin;
    float Depth  = zFar - zNear;

    float q  = -(zFar + zNear) / Depth;
    float qn = -2 * (zFar * zNear) / Depth;
    float w  = 2 * zNear / Width;
    w = w / Aspect;
    float h  = 2 * zNear / Height;

    Matrix[0]  = w; Matrix[1]  = 0; Matrix[2]  = 0;  Matrix[3]  = 0;
    Matrix[4]  = 0; Matrix[5]  = h; Matrix[6]  = 0;  Matrix[7]  = 0;
    Matrix[8]  = 0; Matrix[9]  = 0; Matrix[10] = q;  Matrix[11] = -1;
    Matrix[12] = 0; Matrix[13] = 0; Matrix[14] = qn; Matrix[15] = 0;
}
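For what it's worth, this should be algebraically identical to gluPerspective(): since Width == Height == 2 * xyMax, h works out to 1 / tan(FieldOfView / 2) and w to h / Aspect. A stand-alone, simplified copy of the function (BuildPerspective is just a free-function rename so it can be checked outside the Game class):

```cpp
#include <cmath>

const float PI = 3.14159265358979f;

// Free-function, algebraically simplified copy of SetPerspective,
// for checking against the standard gluPerspective formulation.
// Output is column-major, suitable for glUniformMatrix4fv with
// transpose == GL_FALSE.
void BuildPerspective(float FieldOfView, float Aspect,
                      float zNear, float zFar, float* Matrix)
{
    float xyMax = zNear * std::tan(FieldOfView * 0.5f * PI / 180.0f);
    float Depth = zFar - zNear;
    float h     = zNear / xyMax;   // == 1 / tan(fov/2), as in gluPerspective
    float w     = h / Aspect;

    for (int i = 0; i < 16; ++i) Matrix[i] = 0.0f;
    Matrix[0]  = w;
    Matrix[5]  = h;
    Matrix[10] = -(zFar + zNear) / Depth;
    Matrix[11] = -1.0f;
    Matrix[14] = -2.0f * zFar * zNear / Depth;
}
```

With a 60-degree field of view, Matrix[5] should come out as 1/tan(30 degrees), roughly 1.732.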

And I call that exactly once, during program setup:

SetPerspective(60.f, 1.33f, 0.1f, 512.f, m_Projection);
// Used to be gluPerspective(60.f, 1.33f, 0.1f, 512.f);
glUniformMatrix4fv(m_ProjectionUniform, 1, GL_FALSE, ProjectionMatrix);

Now somewhere I must be doing something wrong, because multiplying the projection and view matrices into the coordinates makes my square disappear entirely. But I can't figure out what's wrong.

I also have two old-style shaders and the matrix stacks working in the background, so I am fairly sure I am not misreading the attribute or uniform locations.

So what might I be doing wrong?
