GLSL and smaller hardware configurations

Hi guys,

I wrote an OpenGL application that draws an image (as a texture) on top of a square, and a GLSL shader to perform some custom lighting operations on that image (mainly diffuse/specular computation), so you can see nice specular highlights depending on how you move the image relative to the defined lights. It works well on my nVidia 6600 (Win32), but I thought I’d try it as well on my laptop’s GPU, which is a Radeon 9200 (OS X).

So I just compiled the code on the laptop and gave it a try, and to my surprise, not only did it compile without any errors, but it also did not crash when I started it. However, the result was not what I expected. None of the complex lighting computations showed up in the laptop demo; it only did basic texture mapping, whereby the image was mapped to the square. There were absolutely no specular highlights present in the demo, as if the fragment shader was ignored altogether. I continued my test and wrote the simplest possible fragment shader, i.e. “gl_FragColor.x = 0.5;” and similarly for y and z. But still nothing: the square was still displaying the image rather than being a uniform gray. It thus appears that the fragment shader is completely ignored, and the fixed functionality is used instead.
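For reference, the minimal test shader I describe above looks roughly like this (GLSL 1.x, writing to the fixed-pipeline gl_FragColor output):

```glsl
// Minimal fragment shader: if this runs at all, the whole square
// should come out a uniform mid-gray instead of showing the texture.
void main()
{
    gl_FragColor = vec4(0.5, 0.5, 0.5, 1.0);
}
```

Even this trivial shader has no visible effect on the laptop.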

I did not really expect my program to work with my laptop’s video card, to be honest, but since it compiled successfully (both the CPU and GPU compiles generated no errors or warnings), I was expecting a different result — at least a warning of some sort to let me know that it would probably fail. By the way, the “GL_ARB_shading_language_100” and “GL_ARB_shader_objects” extensions are not supported by my laptop GPU. Is that normal behaviour? Or do I have some other problem?

Many thanks

Alexis

9200s do not support glslang, so I’m surprised your laptop didn’t crash at all. I guess the laptop driver decided to just ignore your glslang calls entirely.

Originally posted by Korval:
I guess the laptop driver decided to just ignore your glslang calls entirely.
Well, not quite, as glCreateProgramObjectARB returns a “normal”-looking value (1) when called, and similarly for the other functions. Or is that value just a dummy?

Actually, for some weird reason the Radeon 9200 under OS X does seem to have the GL_ARB_shader_objects extension, but not GL_ARB_shading_language_100.
You can verify this by downloading GLEW (the OpenGL Extension Wrangler library) and running its glewinfo utility.

That’s why you get no errors when you call glCreateProgramObjectARB.

I don’t know whether this is a bug in the drivers, as reading the specs I couldn’t see any way to use the ARB_shader_objects extension without the shading language.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.