View Full Version : GLSL Issue on ATI Card

I have written a fragment shader that takes a texture as an input does a little math and outputs values into an FBO. Everything works fine on an nVidia G80, but goes wildly wrong on an ATI 2400 XT. Here is the code:

/*************************************************************************/

uniform sampler2DRect knotPoints;
#define kp(index) (texture2DRect(knotPoints, vec2(index)).r)
uniform sampler2DRect verts;

//Uniform Inputs
uniform ivec4 numParams; // { degree, cp, kp, foo }

int FindSpan(float u) {
    //Check the special case
    if (u >= kp(numParams.y)) return numParams.y - 1;

    //Set up the search bounds: { low, mid, high }
    ivec3 lmh = ivec3(numParams.x, 0, numParams.y);
    lmh.y = (lmh.x + lmh.z) / 2;
    //Return the span value
    return lmh.y;
}

void main(void) {
    int span;

    //Get basic vertex info
    vec4 inVert = texture2DRect(verts, floor(gl_FragCoord.xy));
    //Find the span for the vertex
    span = FindSpan(inVert.x);
    //Return the values
    gl_FragColor = vec4(inVert.x, float(span), float(numParams.x), float(numParams.y));
}

/*************************************************************************/

As you can see, it is pretty simple. The issue is that when I run it on the ATI card (on XP), the divide by 2 in FindSpan returns a wildly wrong answer (something like 1e-10). If I remove the divide, it adds the two values correctly.
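To isolate it, the failing operation boils down to a single integer midpoint. A minimal repro shader (a sketch, assuming the same ivec4 uniform) might look like:

```glsl
uniform ivec4 numParams; // { degree, cp, kp, foo }

void main(void) {
    //On the G80 this writes (degree + cp) / 2 as expected;
    //on the 2400 XT the integer divide appears to misbehave
    int mid = (numParams.x + numParams.y) / 2;
    gl_FragColor = vec4(float(mid));
}
```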

Am I doing something wrong or are there known issues with this card?


07-01-2008, 10:47 AM
It is possible that the driver does not handle integer arithmetic correctly in your calculation. Try using ordinary floats combined with the floor function to emulate the integers.
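For example, FindSpan could be rewritten in float-only arithmetic along these lines (a sketch, assuming you also pass the counts as a vec4 of whole-number floats, here called numParamsF, instead of the ivec4):

```glsl
uniform vec4 numParamsF; // { degree, cp, kp, foo } as whole-number floats

float FindSpan(float u) {
    //Check the special case
    if (u >= kp(numParamsF.y)) return numParamsF.y - 1.0;

    //All arithmetic stays in floats; floor() emulates the integer divide
    vec3 lmh = vec3(numParamsF.x, 0.0, numParamsF.y);
    lmh.y = floor((lmh.x + lmh.z) / 2.0);
    //Return the span value
    return lmh.y;
}
```

Early DX9-class ATI hardware had no real integer ALU, so the compiler lowers int math to floats anyway; doing it explicitly avoids relying on that translation being correct.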