I have read through some of the forums, but I think the best thing would be to just post my code so you can see what the problems are. I have basically followed the Lighthouse3D code to load my shader program, but when I call glGetUniformLocation it returns -1 (which is bad). I am sure I am just missing one thing, because it all looks correct.
Please see my code and help me. I am new both to these forums and to OpenGL and GLSL coding. Thank you for any help.
//Load Vertex Shader
//Create Shader Handler
vertex_Handle=glCreateShader(GL_VERTEX_SHADER);
//Load Code from file
char *VCtempcode=textFileRead("find_Coef.vert");
const char *VSpointer=VCtempcode;
//Init Shader Program
glShaderSource(vertex_Handle,1,&VSpointer,NULL);
//free Temp Code
free(VCtempcode);
//Compile Shader
glCompileShader(vertex_Handle);
//Load Fragment Shader
//Create Shader Handler
fragment_Handle=glCreateShader(GL_FRAGMENT_SHADER);
//Load Code From File
char *FCtempcode= textFileRead("find_Coef.frag");
const char *FSpointer=FCtempcode;
//Init Shader Program
glShaderSource(fragment_Handle,1,&FSpointer,NULL);
//free Temp Code
free(FCtempcode);
//Compile Shader
glCompileShader(fragment_Handle);
//Bind Together
//Create Program Handler
shader_Program=glCreateProgram();
//Attach shaders to program
glAttachShader(shader_Program,vertex_Handle);
glAttachShader(shader_Program,fragment_Handle);
//Link Program (NB Shaders need to be compiled)
glLinkProgram(shader_Program);
//Get Pointers to Shader Variables
glGetUniformLocation(ShaderVariableInput_Image,"input_A");
My fragment shader is below:
uniform sampler2D input_A;
varying vec2 texCoord;
void main(void)
{
//get the values on the main diagonal
float Value=texture2D(input_A, 1.0-vec2(1.0-texCoord.y,texCoord.y)).x;
//Colour the fragment with Value
gl_FragColor = vec4(Value);
}
Well, first: are you checking that the shader has compiled and linked properly?
Second, I'm pretty sure you need to 'use' the program before you can get its uniforms.
Third is that you give glGetUniformLocation the program object, in your case it should be:
ShaderVariableInput_Image = glGetUniformLocation(shader_Program, "input_A");
if (ShaderVariableInput_Image != -1)
etc.
Since I'm not sure what you're following (you do have access to API docs, right?), it's worth noting that your next step is to bind the sampler to a texture unit, not a texture object. My code looks like this:
uniformLoc = glGetUniformLocation(progObj, "imgTex");
if (uniformLoc != -1)
glUniform1i (uniformLoc, 0);
And then I bind my texture object to texture unit 0.
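For completeness, the whole sequence might look roughly like this. This is only a sketch: progObj and texObj are placeholder names (not from the code above), and it assumes a current GL context and a successfully linked program.

```c
/* Sketch: route the sampler uniform to texture unit 0, then bind the
 * texture object to that unit. progObj and texObj are placeholders. */
glUseProgram(progObj);

GLint uniformLoc = glGetUniformLocation(progObj, "imgTex");
if (uniformLoc != -1)
    glUniform1i(uniformLoc, 0);       /* sampler reads from unit 0 */

glActiveTexture(GL_TEXTURE0);         /* select texture unit 0...    */
glBindTexture(GL_TEXTURE_2D, texObj); /* ...and attach the texture   */
```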
I highly recommend the SuperBible examples, if you’re looking to use cut and paste to leapfrog. Worked for me ; )
Bruce
Cool.
The shader does indeed compile (I use RenderMonkey for that).
I hadn't called glUseProgram before getting the location of the uniform, but I will do that. I did actually change the code to
ShaderVariableInput_Image = glGetUniformLocation(shader_Program, "input_A"); but still ran into the same problems. I just forgot to include that here.
I tried it and it still didn't work. I placed glGetUniformLocation right after glUseProgram(shader_Program).
Your shader is pretty simple and should compile. The link stage is the trickier part and a common source of problems. Check the link state as they describe on Lighthouse3D.
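A minimal link-state check (a sketch, assuming the shader_Program handle from the code above and that glLinkProgram has already been called) would be something like:

```c
/* Sketch: verify the link stage succeeded before querying uniform
 * locations; glGetUniformLocation returns -1 on an unlinked program. */
GLint linked = GL_FALSE;
glGetProgramiv(shader_Program, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE)
    printf("shader program failed to link\n");
```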
dletozeun, I used the tutorial as is.
I copied it below:
void printShaderInfoLog(GLuint obj)
{
int infologLength = 0;
int charsWritten = 0;
char *infoLog;
glGetShaderiv(obj, GL_INFO_LOG_LENGTH,&infologLength);
if (infologLength > 0)
{
infoLog = (char *)malloc(infologLength);
glGetShaderInfoLog(obj, infologLength, &charsWritten, infoLog);
printf("%s\n",infoLog);
free(infoLog);
}
}
void printProgramInfoLog(GLuint obj)
{
int infologLength = 0;
int charsWritten = 0;
char *infoLog;
glGetProgramiv(obj, GL_INFO_LOG_LENGTH,&infologLength);
if (infologLength > 0)
{
infoLog = (char *)malloc(infologLength);
glGetProgramInfoLog(obj, infologLength, &charsWritten, infoLog);
printf("%s\n",infoLog);
free(infoLog);
}
}
For the obj variable, I passed my shader_Program. But it doesn't show any errors, as it doesn't print anything to the screen. I added an else branch that basically prints "nothing wrong" when infologLength == 0.
Is this right?
I still don't really know what is wrong.
I'm using an ATI X1600 card in my MBP, and I am programming on my Windows partition. The program works on my friend's computer, which has an nVidia 8800.
Actually, you don't need to use the program to get uniform or attribute locations; it just needs to be successfully linked (since uniform and attribute locations won't change unless the program is re-linked).
But this is not the problem here and your code looks valid to me. The problem may be somewhere else.
glGetUniformLocation returns -1 if the specified uniform name does not exist in the program, or if the program has not been successfully linked. But in your code the uniform name is correct and the program seems to link successfully.
The only thing I see that would cause this problem is a wrong program handle given to glGetUniformLocation.
I also suggest you check whether OpenGL is throwing any errors; if so, simplify your code as much as you can to isolate the cause, and then correct it. This might be linked to your problem with the shaders.
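For example, a small helper that drains the GL error queue after a suspect block of calls could be sketched like this (the name checkGLErrors is made up; glGetError returns queued errors one at a time until GL_NO_ERROR):

```c
/* Sketch: report every pending OpenGL error at a given call site.
 * Requires a current GL context. */
void checkGLErrors(const char *where)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        printf("GL error 0x%04X at %s\n", (unsigned)err, where);
}
```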
After trying different things, I ported it over to OS X using the GLUT API, and there it links successfully. I wasn't able to use LUMINANCE though, so I think that could have been the problem. Now I just use RGB.
Thank you for all the help. It seems like that could have been the problem all along.
Sweet!