I have the following GLSL shader that works on a Mac with an NVIDIA GeForce GT 330M, on a different Mac with an ATI Radeon HD 5750, and in an Ubuntu VM inside that second Mac, but not in an Ubuntu VM on a Windows machine with a GeForce GTX 780 (all drivers up to date). The shader is pretty basic, so I’m looking for some help with what might be wrong! The fragment shader looks like this (I’m using the cocos2d-x game engine, which is where all of the CC_{x} variables are defined):
varying vec4 v_fragmentColor;

void main() {
    gl_FragColor = texture2D(CC_Texture0, gl_PointCoord) * v_fragmentColor; // Can't see anything
    // gl_FragColor = texture2D(CC_Texture0, gl_PointCoord); // Produces the texture as expected, no problems!
    // gl_FragColor = v_fragmentColor; // Produces a white box as expected, no problems!
}
As you can see, I’m getting very strange behavior: both the sampler, CC_Texture0, and the varying vec4, v_fragmentColor, seem to work properly on their own, but multiplying them causes problems. I’m reasonably confident everything else is set up right because it works on the other systems, so it seems to be related to the graphics card, or to some undefined behavior I’m not aware of. Also, I’m using #version 120 (which is needed for gl_PointCoord). Thanks for any help!
[QUOTE=parrotintheoven;1280045]I’m using the cocos2d-x game engine, which is where all of the CC_{x} variables are defined):
[/QUOTE]
So cocos2d-x is pre-processing your shaders to add the declarations for its own variables?
Have you tried retrieving the shader source code generated by cocos2d-x, e.g. using glGetAttachedShaders() and glGetShaderSource()?
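In case it helps, here’s a rough sketch of that dump. It’s hedged: the typedefs and function pointers stand in for your real GL headers/loader so the snippet is self-contained, and how you reach the GLuint program handle from cocos2d-x (e.g. some getProgram() accessor) is an assumption on my part.

```c
/* Sketch: dump the shader source that cocos2d-x actually handed to the
 * driver. Assumes a valid GL context and a linked program handle.
 * The typedefs and function pointers below mimic a GL loader so the
 * sketch is self-contained; in a real app just include your GL headers
 * (GLEW, glad, ...) and call the functions directly. */
#include <stdio.h>
#include <stdlib.h>

typedef unsigned int GLuint;
typedef unsigned int GLenum;
typedef int GLint;
typedef int GLsizei;

#define GL_SHADER_SOURCE_LENGTH 0x8B88 /* value from the GL spec */

/* In a real program these come from your GL headers or loader. */
static void (*p_glGetAttachedShaders)(GLuint, GLsizei, GLsizei *, GLuint *) = NULL;
static void (*p_glGetShaderiv)(GLuint, GLenum, GLint *) = NULL;
static void (*p_glGetShaderSource)(GLuint, GLsizei, GLsizei *, char *) = NULL;

/* Returns 0 on success, -1 if the GL entry points are unavailable. */
int dump_program_sources(GLuint program)
{
    if (!p_glGetAttachedShaders || !p_glGetShaderiv || !p_glGetShaderSource)
        return -1;

    GLuint shaders[8];
    GLsizei count = 0;
    p_glGetAttachedShaders(program, 8, &count, shaders);

    for (GLsizei i = 0; i < count; ++i) {
        GLint len = 0;
        p_glGetShaderiv(shaders[i], GL_SHADER_SOURCE_LENGTH, &len);
        if (len <= 0)
            continue;
        char *src = malloc((size_t)len);
        if (!src)
            continue;
        p_glGetShaderSource(shaders[i], len, NULL, src);
        printf("--- shader %u ---\n%s\n", shaders[i], src);
        free(src);
    }
    return 0;
}
```

That would show you exactly what declarations the engine prepended (e.g. for CC_Texture0) and whether they differ between platforms.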
[QUOTE=GClements;1280047]So cocos2d-x is pre-processing your shaders to add the declarations for its own variables?
Have you tried retrieving the shader source code generated by cocos2d-x, e.g. using glGetAttachedShaders() and glGetShaderSource()?[/QUOTE]
Yeah, I can get the entire shader source; I just thought it might be easier to post only the part that seemed suspect. Here is the entire shader if it helps:
What happened to me once was that I did something wrong (passed illegal arguments to some OpenGL function) and wasn’t checking for errors consistently, and different drivers/GPUs reacted differently to the illegal argument. Some systems simply ignored the error and did what I was expecting them to do. Other systems were stricter: they generated an error and ignored the command. Others generated an error but still did something unpredictable with the garbage data. Are you sure you are always checking for errors, both after OpenGL function calls AND when compiling and linking the shader programs?
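For reference, a minimal sketch of what consistent error checking can look like. The numeric error codes below are the values from the GL spec; in a real app you’d use the constants from your GL headers and feed this the result of glGetError().

```c
/* Sketch: map GL error codes to readable names for logging.
 * Code values are taken from the GL spec. */
#include <stdio.h>

const char *gl_error_name(unsigned int err)
{
    switch (err) {
    case 0:      return "GL_NO_ERROR";
    case 0x0500: return "GL_INVALID_ENUM";
    case 0x0501: return "GL_INVALID_VALUE";
    case 0x0502: return "GL_INVALID_OPERATION";
    case 0x0503: return "GL_STACK_OVERFLOW";
    case 0x0504: return "GL_STACK_UNDERFLOW";
    case 0x0505: return "GL_OUT_OF_MEMORY";
    case 0x0506: return "GL_INVALID_FRAMEBUFFER_OPERATION";
    default:     return "unknown error";
    }
}

/* Usage (with a real GL context):
 *     GLenum err;
 *     while ((err = glGetError()) != GL_NO_ERROR)
 *         fprintf(stderr, "GL error after draw: %s\n", gl_error_name(err));
 * Note that glGetError() must be called in a loop until it returns
 * GL_NO_ERROR, since multiple error flags can be set at once. */
```

And for the shaders themselves, also check GL_COMPILE_STATUS / GL_LINK_STATUS and print the info logs; a driver that silently tolerates something on one machine may reject it on another.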
I’ve never had the problem myself, but I’ve heard that some drivers have trouble when you write 1 instead of 1.0. It’s hearsay, though; implicit int-to-float conversion should be allowed since #version 120.
You’re not using the GLES path in your shaders, right? Because the #version declarations would be a problem if you were. That said, I doubt this is related to your problem.
Just so we’re on the same page, you mean including the GLES header versus just the regular GL header right? Yes, I think I’m doing that correctly. In fact if I remove the #version 120 line I get a compilation error that gl_PointCoord is not defined.
EDIT: Sorry, now I understand what you were saying. I’m pretty sure I’m not going into the GLES path, but just to make sure, I deleted those lines and nothing changed.