
Geometry Shaders in Linux?



Alex Sierra
11-03-2011, 01:07 PM
Hello,

I have been trying to run simple geometry shaders that work on Mac OS X 10.6 (Snow Leopard) using GLEW, but on Linux, even though they compile and link successfully, they don't work.

The versions we have on our Linux box are:

server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
OpenGL version string: 4.1.0 NVIDIA 275.28
OpenGL shading language version string: 4.10 NVIDIA via Cg compiler

on an NVIDIA GeForce GTX 460 card.

Have any of you run geometry shaders on Linux?

I would appreciate any help.

-- Asierra.

Dark Photon
11-03-2011, 02:26 PM
I have been trying to run simple geometry shaders ... they don't work.
...
OpenGL version string: 4.1.0 NVIDIA 275.28
...
on an NVIDIA GeForce GTX 460 card.

Have any of you run geometry shaders on Linux?
Yes, it works fine. Those drivers should be plenty recent, but we need more info than "doesn't work". Are you checking for errors (glGetError())? Put a check at the end of each frame and make sure that everything you're doing looks legitimate to the OpenGL driver.

Failing that, we'll need more detail than "doesn't work" to help you.

aqnuep
11-04-2011, 03:28 AM
Also, which version of geometry shaders are you using?

- GL_ARB_geometry_shader4
- GL_EXT_geometry_shader4
- OpenGL 3.2 core geometry shaders

These are different from each other. Maybe you are using one flavor that is not supported by the Linux driver (though I think NVIDIA drivers usually support all of these well).
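For reference, the EXT and core flavors are wired up quite differently. A minimal pass-through sketch of each (nothing else assumed): with GL_EXT_geometry_shader4 the primitive types and vertex count come from the C side via glProgramParameteriEXT and take effect at link time, while in 3.2 core they are layout qualifiers inside the shader itself.

```glsl
// GL_EXT_geometry_shader4 flavor (GLSL 1.20 + extension):
// input/output primitive types and max vertex count are set with
// glProgramParameteriEXT on the C side, before the program is linked.
#version 120
#extension GL_EXT_geometry_shader4 : enable

void main()
{
    for ( int i = 0; i < 3; i++ )
    {
        gl_Position = gl_PositionIn[i];
        EmitVertex();
    }
}
```

```glsl
// OpenGL 3.2 core flavor (GLSL 1.50): the layout qualifiers
// replace the glProgramParameteri calls entirely.
#version 150
layout( triangles ) in;
layout( triangle_strip, max_vertices = 3 ) out;

void main()
{
    for ( int i = 0; i < 3; i++ )
    {
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
}
```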

Alex Sierra
11-04-2011, 11:36 AM
The code displays a triangle with a different color at each vertex. The GS should display a smaller, solid-orange triangle.

Since on Mac OS X we have a GeForce GT 120 from 2009 with OpenGL 2.1, we use GLSL version 1.20.

These are the shaders:

Vertex


#version 120

varying float LightIntensity;

const vec3 LightPos = vec3( 0., 10., 10. );

void main()
{
    vec3 normal = normalize( gl_NormalMatrix * gl_Normal );
    gl_Position = gl_ModelViewMatrix * gl_Vertex;
    LightIntensity = dot( normalize( LightPos - gl_Position.xyz ), normal );
}


Geometry


#version 120
#extension GL_EXT_gpu_shader4: enable
#extension GL_EXT_geometry_shader4: enable

varying float LightIntensity;

//uniform
const float Shrink=0.5;

vec3 V[3];
vec3 CG;

void ProduceVertex( int v )
{
    LightIntensity = 0.8 * LightIntensity;
    gl_Position = gl_ProjectionMatrix * vec4( CG + Shrink * ( V[v] - CG ), 1. );
    EmitVertex();
}

void main()
{
    V[0] = gl_PositionIn[0].xyz;
    V[1] = gl_PositionIn[1].xyz;
    V[2] = gl_PositionIn[2].xyz;

    CG = 0.33333 * ( V[0] + V[1] + V[2] );

    ProduceVertex( 0 );
    ProduceVertex( 1 );
    ProduceVertex( 2 );
}


Fragment


#version 120

varying float LightIntensity;


void main()
{
    gl_FragColor = vec4( LightIntensity * vec3( 1., 0.5, 0. ), 1. );
}


To use it in the C++ program, I am using the GLSLProgram class from Mike Bailey's course, and GLEW. I have also tried including just NVIDIA's glext.h.


Without parameters, when I activate the shader, nothing happens: not even a black screen, just the same multicolored triangle. On the Mac it sometimes works, but most of the time it displays just a black image.

With these parameters, it displays just a black screen.



glProgramParameteriEXT(shaderpg->program_id(), GL_GEOMETRY_INPUT_TYPE_EXT, GL_TRIANGLES);
glProgramParameteriEXT(shaderpg->program_id(), GL_GEOMETRY_OUTPUT_TYPE_EXT, GL_TRIANGLE_STRIP);
int temp;
glGetIntegerv(GL_MAX_GEOMETRY_OUTPUT_VERTICES_EXT, &temp);
printf("Max out vertices %d\n", temp);
glProgramParameteriEXT(shaderpg->program_id(), GL_GEOMETRY_VERTICES_OUT_EXT, temp);


Any ideas?

bcthund
11-04-2011, 08:17 PM
I don't see an EndPrimitive(); in your geometry shader. Could this be your problem?

Also, I see you're using version 120, and according to http://www.opengl.org/sdk/docs/manglsl/xhtml/EmitVertex.xml, EmitVertex(); isn't supported until version 150 as I understand it. Try using at least version 150; I use "#version 330 core" myself with both ATI and NVIDIA cards on Linux and can use geometry shaders on both.
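If you do move to 1.50 on the Linux side, the posted geometry shader might look roughly like this. This is only a sketch: the ProjectionMatrix uniform and the vLightIntensity / gLightIntensity names are made up, since the built-in matrices and old-style varyings are gone in core profile.

```glsl
#version 150
layout( triangles ) in;
layout( triangle_strip, max_vertices = 3 ) out;

uniform mat4 ProjectionMatrix;   // replaces gl_ProjectionMatrix

in  float vLightIntensity[];     // one entry per input vertex
out float gLightIntensity;

const float Shrink = 0.5;

void main()
{
    vec3 V[3];
    V[0] = gl_in[0].gl_Position.xyz;
    V[1] = gl_in[1].gl_Position.xyz;
    V[2] = gl_in[2].gl_Position.xyz;

    vec3 CG = ( V[0] + V[1] + V[2] ) / 3.0;

    for ( int i = 0; i < 3; i++ )
    {
        gLightIntensity = 0.8 * vLightIntensity[i];
        gl_Position = ProjectionMatrix * vec4( CG + Shrink * ( V[i] - CG ), 1.0 );
        EmitVertex();
    }
}
```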

aqnuep
11-05-2011, 01:26 AM
I don't see an EndPrimitive(); in your geometry shader. Could this be your problem?
It's worth trying to add EndPrimitive(), but as far as I remember it is not necessary when you don't emit multiple separate primitives.
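To illustrate (a fragment of a geometry shader body, not a complete shader): a trailing EndPrimitive() is implied at the end of main(), so the call only changes behaviour when the shader emits more than one strip:

```glsl
EmitVertex(); EmitVertex(); EmitVertex();   // first triangle
EndPrimitive();                             // close it, required only because
EmitVertex(); EmitVertex(); EmitVertex();   // a second, separate triangle follows
// the EndPrimitive() for the last primitive is implicit at the end of main()
```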


Also, I see you're using version 120, and according to http://www.opengl.org/sdk/docs/manglsl/xhtml/EmitVertex.xml, EmitVertex(); isn't supported until version 150 as I understand it. Try using at least version 150; I use "#version 330 core" myself with both ATI and NVIDIA cards on Linux and can use geometry shaders on both.
Don't forget that he's on Mac OS X. OS X drivers don't support GLSL 1.50, only 1.20, and EmitVertex is available because he uses the GL_EXT_geometry_shader4 extension, as specified by the #extension directive.

bcthund
11-05-2011, 06:50 AM
Don't forget that he's on Mac OS X. OS X drivers don't support GLSL 1.50, only 1.20, and EmitVertex is available because he uses the GL_EXT_geometry_shader4 extension, as specified by the #extension directive.

Ahh, I totally missed that it was on Mac OS X; I was concentrating on the shader code. I have no experience with Mac programming, so I can't help much.

I wasn't sure about the specifics of whether EndPrimitive(); was actually required; I thought you might be able to omit it in certain cases, but I still think it's worth a try.