
Point sprite glsl help - ATI issue



Jeffg
10-27-2010, 09:56 AM
I'm trying to troubleshoot some differences between nVidia and ATI. nVidia is currently working fine, but my test ATI card (an X1300) is having issues with point sprites. I've updated to the latest drivers; perhaps someone could take a look.

The Windows/ATI card renders the colors wrong and the texture upside down. Again, it works fine on my MacBook (nVidia).

Vertex Shader

uniform float pointSize;
uniform float gui;
varying vec4 vFragColor;

void main(void) {
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    vec4 modv = gl_ModelViewMatrix * gl_Vertex;
    if (gui == 0.0) {
        gl_PointSize = 200.0 * pointSize / -modv.z;
    } else {
        gl_PointSize = pointSize;
    }
    float fog = 8000.0 / -modv.z;
    if (fog < 8.0 && gui == 0.0) {
        vFragColor = smoothstep(0.0, 8.0, fog) * gl_Color;
    } else {
        vFragColor = gl_Color;
    }
    gl_FrontColor = gl_Color;
    gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

Fragment Shader

uniform sampler2D tex;
varying vec4 vFragColor;

void main() {
    vec4 color = texture2D(tex, gl_TexCoord[0].st);
    gl_FragColor = color * vFragColor;
}
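
[Editor's note] The upside-down sprite texture on ATI is commonly a point-sprite texture-coordinate origin difference between drivers; on GL 2.0+ hardware this can be controlled from the API with glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, ...). As a shader-side sketch (an assumption added here, not part of the original code), the fragment shader can simply flip the t coordinate on the affected platform:

```glsl
// Hypothetical workaround: flip t when the driver uses the opposite
// point-sprite origin, so the sprite texture is no longer upside down.
uniform sampler2D tex;
varying vec4 vFragColor;

void main() {
    vec2 st = vec2(gl_TexCoord[0].s, 1.0 - gl_TexCoord[0].t);
    gl_FragColor = texture2D(tex, st) * vFragColor;
}
```

Note this flips the texture for every platform it is compiled on, so it would need to be guarded (e.g. by a uniform or a separate shader build) rather than applied unconditionally.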


Processing Code

try {
    ringImg = TextureIO.newTexture(new File(dataPath("images" + File.separator + "ring2.png")), true);
} catch (IOException e) {
    showMessage(e);
}
...
pshader.startShader();
gl.glUniform1f(pshader.getUniformLocation("gui"), 0.0);
gl.glDisable(GL.GL_POINT_SMOOTH);

gl.glEnable(GL.GL_POINT_SPRITE_ARB);

gl.glEnableClientState(GL.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL.GL_COLOR_ARRAY);

gl.glEnable(GL.GL_VERTEX_PROGRAM_POINT_SIZE_ARB);
gl.glPointParameterfARB( GL.GL_POINT_SIZE_MIN_ARB, 0.0 );

gl.glTexEnvi(GL.GL_POINT_SPRITE_ARB, GL.GL_COORD_REPLACE_ARB, GL.GL_TRUE);

//Rings around icons
gl.glBlendFunc(GL.GL_ONE,GL.GL_ONE_MINUS_SRC_ALPHA );
ringImg.enable();
ringImg.bind();
gl.glUniform1f(pshader.getUniformLocation("pointSize"), 114.0);

gl.glBindBuffer( GL.GL_ARRAY_BUFFER, nodes_vbo[0]);
gl.glVertexPointer(3,GL.GL_FLOAT,0,0);
gl.glBindBuffer( GL.GL_ARRAY_BUFFER, nodes_vbo[1] );
gl.glColorPointer(3,GL.GL_FLOAT,0,0);
gl.glBindBuffer( GL.GL_ELEMENT_ARRAY_BUFFER, nelements_vbo[0] );
gl.glDrawElements( GL.GL_POINTS, vnodelength, GL.GL_UNSIGNED_INT, 0 );
gl.glDisableClientState(GL.GL_COLOR_ARRAY);
ringImg.disable();
gl.glDisable(GL.GL_POINT_SPRITE_ARB);
gl.glUniform1f(pshader.getUniformLocation("gui"), 1.0);
pshader.endShader();
...

Pierre Boudier
11-02-2010, 03:45 AM
Can you run your experiment on a recent ATI card (>= HD 2xxx)?

(the drivers for the X1300 are no longer being updated regularly)

thanks,

Jeffg
11-02-2010, 02:33 PM
I wish I could, but people here have that card. I'm also seeing a competing issue on a VM.

I can't get the color to work correctly.

This works on all systems (minus the color, of course):
gl_FragColor = texture2D(tex,gl_TexCoord[0].st);

This behaves differently across the systems:
gl_FragColor = texture2D(tex,gl_TexCoord[0].st) * gl_Color;

As does this, differently again from the above:
gl_FragColor = texture2D(tex, gl_TexCoord[0].st) * FragColor; // self-defined varying

???? Help... It seems I'm getting different values depending on whether the color goes through fragment interpolation.
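
[Editor's note] Reading gl_Color in a fragment shader pulls the interpolated fixed-function front color, and drivers of this era differed in how they clamped and interpolated that path; a user-defined varying avoids it entirely. A minimal sketch (an illustrative pair, not the original shaders) that routes color only through a user varying:

```glsl
// Vertex shader: forward the per-vertex color through a user varying only,
// bypassing gl_FrontColor / gl_Color and their driver-dependent clamping.
varying vec4 vColor;

void main() {
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    vColor = gl_Color;                       // color from glColorPointer
    gl_TexCoord[0] = gl_MultiTexCoord0;
}

// Fragment shader: use the same varying, never gl_Color.
uniform sampler2D tex;
varying vec4 vColor;

void main() {
    gl_FragColor = texture2D(tex, gl_TexCoord[0].st) * vColor;
}
```

If the results then agree across the systems, the discrepancy was in the fixed-function color path rather than in the texture sampling.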

I've tried:
gl.glTexEnvi( GL.GL_TEXTURE_ENV, GL.GL_TEXTURE_ENV_MODE, GL.GL_BLEND );
gl.glTexEnvi( GL.GL_TEXTURE_ENV, GL.GL_TEXTURE_ENV_MODE, GL.GL_REPLACE );
gl.glTexEnvi( GL.GL_TEXTURE_ENV, GL.GL_TEXTURE_ENV_MODE, GL.GL_COMBINE );

I've also tried gl_PointCoord, which doesn't seem to work at all on any system.
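
[Editor's note] gl_PointCoord replaces the COORD_REPLACE/gl_TexCoord path entirely, but older compilers may reject it without a suitable #version directive (it is not part of the earliest GLSL revisions). A sketch, assuming point sprites are enabled and the driver's GLSL version supports the built-in:

```glsl
#version 120
// Sketch: sample the sprite texture via gl_PointCoord instead of
// relying on GL_COORD_REPLACE_ARB writing gl_TexCoord[0].
uniform sampler2D tex;
varying vec4 vFragColor;

void main() {
    gl_FragColor = texture2D(tex, gl_PointCoord) * vFragColor;
}
```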

What am I doing wrong...?