View Full Version : GLSL shader problem in Panda3D

04-06-2012, 10:09 AM
Hi there,

I have a question regarding the use of shaders in the Panda3D game engine, using Python. I've posted on the Panda3D forums as well, but I thought I'd try my luck here too.

My system is currently equipped with an ATI graphics card.
As I'm working on an RTS project that will require several hundred models moving around, I've started looking into geometry instancing, which on my graphics card requires GLSL shaders (I've received a lot of useful help in a separate thread).

However, I can't seem to get this type of shader to work with Panda3D.
When I apply even the simplest GLSL shader, anything the shader is used on simply isn't rendered.
The console, however, reports a successful compile of the shader.

I've tried two different ATI Radeon HD cards, a fresh install of Windows 7 64-bit, the latest Catalyst drivers (12.3), and a fresh install of Panda3D 1.8.0, but that didn't solve the problem.

GPU Caps Viewer tells me I have OpenGL 4.2 running, and the GLSL tests from that program do work on my system.

Below are the shaders I've currently applied (I used the solar system sample program and added these shaders to the sun model in step 3):

vertex shader

#version 140

void main() {
    gl_Position = gl_ProjectionMatrix * gl_ModelViewMatrix * gl_Vertex;
}

fragment shader

#version 150
out vec4 MyFragColor;
void main(void) {
    MyFragColor = vec4(0.4, 0.4, 0.8, 1.0);
}

Does anyone have any pointers for me?

04-07-2012, 07:51 AM
You might try:

#version 400
layout(location = 0, index = 0) out vec4 MyFragColor;

void main(void) {
    MyFragColor = vec4(0.4, 0.4, 0.8, 1.0);
}

I'm not sure version 150 supported named outputs.