View Full Version : GL_EXT_geometry_shader4 and ATI/OS X

04-16-2013, 01:00 AM
I'm trying to get a fairly straightforward geometry shader to work in OS X (10.8.3), but I've run into a very strange problem that causes artifacts on screen. The same program works just fine on a Windows computer with an Nvidia graphics card. Here is a video of the problem, where you can see random triangles appearing and disappearing.


I have narrowed the problem down to transferring vertex attributes from the geometry shader to the fragment shader. It doesn't matter whether I use built-in attributes (gl_Vertex, gl_Normal) or generic vertex attributes.

// VERTEX SHADER ---------------------------------------------------------------
#version 120
uniform mat4 modelView;
uniform mat4 projection;
varying vec3 vert_normal_eye;

void main()
{
    vert_normal_eye = (modelView * vec4(gl_Normal, 0.0)).xyz;
    gl_Position = projection * modelView * gl_Vertex;
}

// GEOMETRY SHADER -------------------------------------------------------------
#version 120
#extension GL_EXT_gpu_shader4 : enable
#extension GL_EXT_geometry_shader4 : enable
varying in vec3 vert_normal_eye[3];
varying out vec3 normal_eye;

void main()
{
    for (int i = 0; i < 3; i++) {
        gl_Position = gl_PositionIn[i];
        normal_eye = vert_normal_eye[i];
        EmitVertex();
    }
}

// FRAGMENT SHADER -------------------------------------------------------------
#version 120
const vec3 light = vec3(0.0, 0.0, -1.0);
varying vec3 normal_eye;

void main()
{
    vec3 n = normalize(normal_eye);
    float diffuse = max(max(dot(n, light), dot(-n, light)), 0.35);
    gl_FragColor = vec4(diffuse, diffuse, diffuse, 0.2);
}

If I use gl_FragColor = vec4(1.0, 1.0, 1.0, 0.2) in the fragment shader and remove all lines containing normal_eye, the problem goes away. Note that I don't actually need a geometry shader for this effect; I use one here only to illustrate the problem in the smallest example I can provide.

04-16-2013, 02:43 AM

I don't know how well legacy OpenGL with the geometry shader extensions is supported on Macs (the extensions you mention should be available: https://developer.apple.com/graphicsimaging/opengl/capabilities/GLInfo_1083.html but how well tested they are is another question). Is porting your code to OpenGL 3.2 core an option?
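For reference, a 3.2 core port of a geometry shader like the one above drops the EXT extensions and declares its input/output topology with layout qualifiers instead of glProgramParameteri(). A minimal sketch (the variable names mirror the shaders earlier in the thread, not code from any actual port):

// GLSL 150 equivalent of the EXT geometry shader (3.2 core profile)
#version 150
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

in vec3 vert_normal_eye[];   // from the vertex shader
out vec3 normal_eye;         // to the fragment shader

void main()
{
    for (int i = 0; i < 3; i++) {
        // gl_PositionIn[] becomes the gl_in[] interface block in core profile
        gl_Position = gl_in[i].gl_Position;
        normal_eye = vert_normal_eye[i];
        EmitVertex();
    }
    EndPrimitive();
}

The vertex and fragment shaders would need matching in/out declarations in place of the deprecated varying/attribute keywords.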

04-16-2013, 02:40 PM
Hi menzel,

I would have to think about porting to 3.2 core. I'm using some deprecated things like display lists with immediate mode in some parts. Is there a place I can submit a bug report (not that Apple is likely to care)?


04-16-2013, 08:12 PM

...where you can attach your system profile, so there is something more concrete to reproduce with than "ATI", which could be any of a dozen-plus different GPUs.

04-17-2013, 12:37 PM
Have you tried adding EndPrimitive() to the end of the geometry shader? Also, are you setting the maximum number of output vertices to at least 3 via glProgramParameteri()?
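For anyone following along, the host-side setup being asked about looks roughly like this with the EXT extension; these calls only take effect if made before glLinkProgram() (program/shader creation and attachment omitted for brevity):

/* Configure the geometry stage before linking (GL_EXT_geometry_shader4) */
glProgramParameteriEXT(program, GL_GEOMETRY_INPUT_TYPE_EXT,  GL_TRIANGLES);
glProgramParameteriEXT(program, GL_GEOMETRY_OUTPUT_TYPE_EXT, GL_TRIANGLE_STRIP);
glProgramParameteriEXT(program, GL_GEOMETRY_VERTICES_OUT_EXT, 3);
glLinkProgram(program);

If GL_GEOMETRY_VERTICES_OUT_EXT is left at a value lower than the number of vertices the shader actually emits, the extra emits are silently dropped.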

04-18-2013, 03:17 AM
According to the spec, "it is not necessary to call EndPrimitive() if the geometry shader writes only a single primitive." I did try adding it, of course, but there was no change. I am also setting the proper states using glProgramParameteri(). I put together a very small example that reproduces this bug for an Apple bug report: web.cs.miami.edu/home/jstoecker/files/geometry_shader_bug.zip. It should compile trivially and only uses OpenGL/GLUT calls. I get a flickering triangle in the center of the teapot on my machine (Radeon 6750M / OS X 10.8.3).

08-05-2013, 03:19 PM
Hi jstoecker,
did you manage to make any progress with this? I'm having the same problem, and my shaders are in GLSL 150 with a 3.2 core profile, so I'm not sure it's a version issue.