
OpenGL Shading Problem



winEr
02-21-2011, 03:16 AM
Hi,

For a project I'm working on, I have to work with an OpenGL engine someone built himself. We tried the engine on several laptops and it works fine everywhere. However, for some unknown reason, I can't see any models on my screen. I believe the problem lies in the vertex shaders we use, but I have little to no knowledge of shaders, so I can't be 100% sure about it.

I have an ATI Mobility Radeon HD 5600/5700 series graphics card with the latest drivers (driver date 2011-01-26, version 8.821.0.0).

I attached a screenshot of how it should look and a screenshot of how it is displayed on my screen.

These are the shaders we use:

vertex.glsl

#version 140

precision highp float;

uniform mat4 camera_matrix;
uniform mat4 model_matrix;
uniform mat4 projection_matrix;

uniform vec3 model_scale;

in vec3 in_Vertex;
in vec3 in_Normal;
in vec2 in_UV;

out vec3 out_Vertex;
out vec3 out_Normal;
out vec2 out_UV;

void main()
{
    vec4 pos = model_matrix * vec4(in_Vertex.x * model_scale.x, in_Vertex.y * model_scale.y, in_Vertex.z * model_scale.z, 1.0);

    out_Vertex = pos.xyz;
    out_Normal = mat3(model_matrix) * in_Normal;
    out_UV = in_UV;

    gl_Position = projection_matrix * mat4(mat3(camera_matrix)) * (pos + vec4(camera_matrix[3][0], camera_matrix[3][1], camera_matrix[3][2], 0.0));
}

fragment.glsl

#version 140

precision highp float;

uniform struct
{
    int m_Active;
    vec3 m_Ambient;
    vec3 m_Diffuse;
    vec3 m_Specular;
    float m_Shine;
} Material;

uniform sampler2D texture0;

in vec3 out_Vertex;
in vec3 out_Normal;
in vec2 out_UV;

void main(void)
{
    if (Material.m_Active == 1)
    {
        gl_FragColor = texture2D(texture0, out_UV.st);
    }
    else
    {
        gl_FragColor = vec4(Material.m_Diffuse, 1.0);
    }

    //gl_FragColor = vec4(out_Vertex.x*0.2+0.5, out_Vertex.y*0.2+0.5, out_Vertex.z*0.2+0.5, 1.0);
}

Does anyone know what the cause could be? I'm really stuck if I can't fix this :(

mobeen
02-21-2011, 04:41 AM
The way you are calculating the clipspace position seems wrong. Instead of this


gl_Position = projection_matrix * mat4(mat3(camera_matrix)) * (pos + vec4(camera_matrix[3][0], camera_matrix[3][1], camera_matrix[3][2], 0));

do this,


gl_Position=(projection_matrix*camera_matrix)*pos;

winEr
02-21-2011, 05:02 AM
That does not work.
I used the other way exactly for that reason.
If I do it your way, the rotation and translation are applied in the wrong order.

Also, this could not in any way cause the deformed models.

mobeen
02-21-2011, 05:34 AM
Two things:
1) Check your model, view and projection matrices.
2) Check the incoming vertex positions. How are you submitting them: immediate mode or vertex arrays/VBOs? Post the code snippet where you upload the vertices.

winEr
02-21-2011, 05:52 AM
As you can see in the first example screenshot, the models and matrices are correct. I also checked the incoming vertex positions using gDEBugger, and the values were correct. They were nothing like the planes that get rendered.

This is how I load them:

glGenBuffers(m_PI->m_SubmeshList.size(),m_PI->m_VertexBuffer._Myfirst);
glGenBuffers(m_PI->m_SubmeshList.size(),m_PI->m_NormalBuffer._Myfirst);
glGenBuffers(m_PI->m_SubmeshList.size(),m_PI->m_UVBuffer._Myfirst);

for (unsigned int i = 0; i < m_PI->m_SubmeshList.size(); i++)
{
    glBindBuffer(GL_ARRAY_BUFFER, m_PI->m_VertexBuffer[i]);
    glBufferData(GL_ARRAY_BUFFER, m_PI->m_SubmeshList[i]->m_Vertex.size() * sizeof(vector3f), m_PI->m_SubmeshList[i]->m_Vertex._Myfirst, GL_STATIC_DRAW);

    glBindBuffer(GL_ARRAY_BUFFER, m_PI->m_NormalBuffer[i]);
    glBufferData(GL_ARRAY_BUFFER, m_PI->m_SubmeshList[i]->m_Normal.size() * sizeof(vector3f), m_PI->m_SubmeshList[i]->m_Normal._Myfirst, GL_STATIC_DRAW);

    glBindBuffer(GL_ARRAY_BUFFER, m_PI->m_UVBuffer[i]);
    glBufferData(GL_ARRAY_BUFFER, m_PI->m_SubmeshList[i]->m_UV.size() * sizeof(vector2f), m_PI->m_SubmeshList[i]->m_UV._Myfirst, GL_STATIC_DRAW);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

This is how I render them:

glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glEnableVertexAttribArray(2);

glBindBuffer(GL_ARRAY_BUFFER,m_PI->m_VertexBuffer[i]);
glVertexAttribPointer(0,3,GL_FLOAT,GL_FALSE,0,0);

glBindBuffer(GL_ARRAY_BUFFER,m_PI->m_NormalBuffer[i]);
glVertexAttribPointer(1,3,GL_FLOAT,GL_FALSE,0,0);

glBindBuffer(GL_ARRAY_BUFFER,m_PI->m_UVBuffer[i]);
glVertexAttribPointer(2,2,GL_FLOAT,GL_FALSE,0,0);

glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_TRIANGLES,0,m_PI->m_SubmeshList[i]->m_Vertex.size());
OpenGLError();
glDisableClientState(GL_VERTEX_ARRAY);

glDisableVertexAttribArray(2);
glDisableVertexAttribArray(1);
glDisableVertexAttribArray(0);

mobeen
02-21-2011, 06:53 AM
Where was the first image taken? I mean, on which hardware?
Is the OpenGL state machine fine? You can check by adding an assertion like this at the start of the render function:
void render() {
    assert(glGetError() == GL_NO_ERROR);
    ...
}
To check further, create a minimal GLUT project, render a teapot, and see if the problem still exists.

winEr
02-23-2011, 05:57 AM
The problem is solved. It was due to the fact that we linked the shader program before we bound the attribute locations. Turning this around made everything show on screen as it should :)
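For reference, the fix amounts to calling glBindAttribLocation before glLinkProgram. A sketch (program creation and shader attachment elided; the attribute names are taken from the shaders above, and the indices match the glVertexAttribPointer calls in the render code):

// Must happen BEFORE glLinkProgram. Otherwise the driver is free to
// assign attribute locations that don't match the fixed indices
// (0, 1, 2) used with glVertexAttribPointer.
glBindAttribLocation(program, 0, "in_Vertex");
glBindAttribLocation(program, 1, "in_Normal");
glBindAttribLocation(program, 2, "in_UV");
glLinkProgram(program);

An alternative is to leave the linking order alone and query the driver-assigned locations afterwards with glGetAttribLocation.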

malexander
02-23-2011, 06:40 AM
I highly recommend implementing a simple transform feedback debug class (or set of functions) so that you can see the output of vertex or geometry shaders. They're invaluable when you've spent hours coding up a new shader, everything compiles and no errors are thrown, and then nothing appears. At least the position output can give you a clue.
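The core of such a debug helper is a transform feedback capture. A rough sketch, assuming a GL 3.x context and an already-compiled program (buffer names and sizes here are illustrative, not from the engine):

// Varyings to capture must be declared before the program is linked.
const char* varyings[] = { "out_Vertex" };
glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
glLinkProgram(program);

glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, feedbackBuffer);
glEnable(GL_RASTERIZER_DISCARD);          // capture only, skip rasterization
glBeginTransformFeedback(GL_TRIANGLES);
glDrawArrays(GL_TRIANGLES, 0, vertexCount);
glEndTransformFeedback();
glDisable(GL_RASTERIZER_DISCARD);

// Read the captured vertex-shader outputs back for inspection.
glGetBufferSubData(GL_TRANSFORM_FEEDBACK_BUFFER, 0, bufferSize, cpuData);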