Problem with trivial shader on Radeon



Bojan P.
11-26-2010, 01:53 AM
I'm having a problem writing a shader. I first wrote this one:


#version 120
attribute vec3 position;
attribute vec3 normal;
uniform mat4 matWorldViewProjection;
void main()
{
    gl_Position = matWorldViewProjection * vec4(position, 1.0);
}


And it worked. Then I wanted to start adding other stuff, but immediately ran into a problem. This one doesn't work:


#version 120
attribute vec3 position;
attribute vec3 normal;
uniform mat4 matWorldViewProjection;
varying vec3 n;
void main()
{
    gl_Position = matWorldViewProjection * vec4(position, 1.0);
    n = normal;
}


It compiles and links without errors, but nothing gets rendered.
I narrowed the problem down to the vertex shader in my app and then switched to RenderMonkey to work on this specific problem. It all started yesterday when I replaced my Radeon X550 with a Radeon 2600XT and installed new drivers, Catalyst 10.11. I had a working diffuse shader with texturing, but it stopped working with the new card/driver, so I started writing it from scratch to see what's really happening there.

The fragment shader is trivial too:


#version 120
void main()
{
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

Any ideas? I have run out of them.

ZbuffeR
11-26-2010, 02:22 AM
Does it work if you keep gl_Position as the last line? Something like:


#version 120
attribute vec3 position;
attribute vec3 normal;
uniform mat4 matWorldViewProjection;
varying vec3 n;
void main()
{
    n = normal;
    gl_Position = matWorldViewProjection * vec4(position, 1.0);
}

Bojan P.
11-26-2010, 02:38 AM
Thanks for the interest. I did try that, but unfortunately it doesn't change anything. Another thing I tried, as someone on the gamedev forum suggested, was to add 'varying vec3 n;' to the fragment shader.

The shader does work when I use ftransform() (which transforms the built-in gl_Vertex, so it bypasses the generic 'position' attribute entirely). But I need more flexibility than that.

skynet
11-26-2010, 06:55 AM
...Shader does work when I use ftransform()...

I _bet_ you did not set up your 'position' attribute properly. Did you set the attribute location to 0 (glBindAttribLocation), re-link the program, then use glVertexAttribPointer(0, ...) and glEnableVertexAttribArray(0)?
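
Roughly this order, as a sketch ('prog', 'vs', 'fs' and 'vbo' are placeholder handles, not your actual variables):


GLuint prog = glCreateProgram(); // the program must exist before you bind locations
glAttachShader(prog, vs);        // vs, fs: already-compiled shader objects
glAttachShader(prog, fs);

// glBindAttribLocation only takes effect at the next glLinkProgram,
// which is why you have to (re-)link after changing a binding.
glBindAttribLocation(prog, 0, "position");
glBindAttribLocation(prog, 1, "normal");
glLinkProgram(prog);

glUseProgram(prog);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);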

Bojan P.
11-26-2010, 08:33 AM
I checked glGetError after glBindAttribLocation and got GL_INVALID_VALUE. Why would 0 be an invalid value?

Not sure how RenderMonkey implements attribute setup, but in my app I did it this way:

- initialize the vertex buffer
- compile the shaders
- glBindAttribLocation(program, 0, "position");
- link the shader program
- bind the buffer
- glEnableVertexAttribArray(0);
- glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(0));
- glDrawElements

Is that the right sequence? Why did you mention relinking the program? Is it really necessary to relink?

Bojan P.
11-26-2010, 02:41 PM
I've just tried it on an NVIDIA GeForce 310M and it works just fine.

Bojan P.
11-28-2010, 07:58 AM
And yes sir, you were right. I was binding the attribute location before calling glCreateProgram, so glBindAttribLocation was being handed an invalid program handle.
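
In other words, what I had was effectively this (a sketch; 'prog' is a placeholder name):


GLuint prog = 0;
// Broken: prog is still 0 here, which is not a program object,
// so this call raises GL_INVALID_VALUE and the binding is lost.
glBindAttribLocation(prog, 0, "position");
prog = glCreateProgram();

// Fixed: create the program first, then bind, then link.
prog = glCreateProgram();
// ...attach the compiled shaders here...
glBindAttribLocation(prog, 0, "position");
glLinkProgram(prog);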

Now I just need to figure out how I managed to replicate the problem in RenderMonkey.