Simple OpenGL app has no shader support? Mac laptop, GF 9400M 256MB

I have this app running on both my Windows and Linux boxes just fine. I’m trying this simple/stupid shader that sets the color of the “quads” (triangle strips of two triangles), and it appears my Mac quietly falls back to fixed-function or something: everything is black or white. Here’s the shader:

const char* default_vertex_shader =
	"#version 330\n"
	"layout(location = 0) in vec3 vertexPosition;"
	"uniform mat4 mvpMatrix;"
	"in vec3 vp;"
	"void main () {"
	" gl_Position = mvpMatrix * vec4 (vp, 1.0);"
	"}";

const char* default_fragment_shader =
	"#version 330\n"
	"uniform vec4 color;"
	"out vec4 outputColor;"
	"void main () {"
	" outputColor = color;"
//	" outputColor = vec4(color.r, 0.25, 0, 1);"
	"}";

All of the uniform locations are -1 when I check them. Mind you, this works on Linux and Windows just fine… but not on my Mac. That said, I’m in the middle of porting a lot of old OpenGL 2 code to OpenGL 3+, so I’ve not set a core profile. Might this be the cause?

Any insight would be helpful.

Thank you

Rob

It’s more likely the case that you either didn’t post your actual shader code or your code doesn’t work on any platform.

I say that because your shader has a suspicious lack of "\n", outside of the first one. Now technically you don’t need them in most cases, but since your fragment shader uses “//” comments…

Yes. You could query GL_SHADING_LANGUAGE_VERSION or the shader log to figure this out.
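
For example, something like this (a rough, untested sketch; it assumes a current context and that “shader” stands for whatever GLuint you got back from glCreateShader/glCompileShader):

printf("GL %s, GLSL %s\n",
       (const char*)glGetString(GL_VERSION),
       (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION));

GLint compiled = GL_FALSE;
glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
if (compiled != GL_TRUE) {
    char log[4096];
    glGetShaderInfoLog(shader, sizeof(log), NULL, log);
    fprintf(stderr, "shader compile log:\n%s\n", log);
}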

Yes, you need a core profile to use GLSL 330 on OSX.

[QUOTE=Alfonse Reinheart;1264220]It’s more likely the case that you either didn’t post your actual shader code or your code doesn’t work on any platform.

I say that because your shader has a suspicious lack of "\n", outside of the first one. Now technically you don’t need them in most cases, but since your fragment shader uses “//” comments…[/QUOTE]

The // is outside the quotes, so they’re not really in the shader. Note each line is separately quoted.

That is my actual shader code; I’m working on upgrading an old in-game windowing system that used the fixed-function pipeline to produce a simple user interface. In the future I intend to make it fancier, but for now a flat-color windowing system is my target.

Well, I tried using a core profile and whatnot in my application, and now GLFW doesn’t open the window; it returns NULL for the context (on both Mac and Linux). So I deferred to a simpler approach: rather than work with my application, which still intermingles a lot of legacy/deprecated OpenGL (I’m working on upgrading it…), I decided to attack this from a much simpler angle and try the single-triangle tutorial (tutorial #2) from opengl-tutorial.org. I got the same result; it refused to open the window and presented nothing but a NULL pointer. I am wondering if this may be because:

  1. I am building this with g++ rather than OS X’s Clang compiler (why would that make any difference?)
  2. My video card + OS X Mavericks is no longer supported (I doubt it)
  3. I’m not linking in the right library? (I see a libGL.dylib and libGL.1.dylib… nothing else)

Any help would be appreciated.

Thank you
Rob

After reviewing the code in the tutorial, I found that this line (required on the Mac) was missing:

glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // To make MacOS happy; should not be needed

And once added, the triangle tutorial was happy.
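
For anyone hitting the same thing, the full hint sequence for a 3.2 core context with GLFW 3 looks roughly like this (just a sketch; the version numbers, window size and title are placeholders for whatever you actually use):

glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // the hint that was missing

GLFWwindow* window = glfwCreateWindow(1024, 768, "triangle", NULL, NULL);
if (window == NULL) {
    // on the Mac this NULL is all you get when the hints above don't add up
    glfwTerminate();
}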

So this brings up the question… If I link deprecated OpenGL code into my application and I try to use a 3.2+ profile, does the system recognize this and refuse to open a context? It appears to be so… on both Mac and Linux. I’m asking because hopefully there’s somebody with more insight into this than I.

I had been programming OpenGL for years, but my work (microcontrollers and Linux kernels; no graphics) took me away promptly after GLSL appeared, which was before OpenGL 3 ever saw the light of day. Now I’m getting back into it and … whoa. What a change.

Learn about profiles. See also platform documentation about profiles.
TLDR: On OSX, GL3.2 and later contexts are always core and forward-compatible; all deprecated features are removed.

If you look at glfw you can see how it enforces OSX’s requirement for forward-compatible contexts.

If I link deprecated OpenGL code into my application and I try to use a 3.2+ profile, does the system recognize this and refuse to open a context?

Context creation is one thing, linking the API is another, and actually calling it is yet another. Simply having deprecated calls linked into your binary won’t stop a core context from being created; problems only show up if those calls are actually executed against the core context.
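
To make that concrete, a sketch (assuming you compile against headers that still declare the old entry points):

/* Merely compiling and linking this into the app has no effect on whether
 * a 3.2 core context can be created.  The trouble only starts if it is
 * actually executed while a core context is current, since the
 * immediate-mode entry points are removed from core; expect GL errors or
 * nothing drawn, not a refused context. */
void legacy_draw(void)
{
    glBegin(GL_QUADS);
    glVertex2f(0.0f, 0.0f);
    glVertex2f(1.0f, 0.0f);
    glVertex2f(1.0f, 1.0f);
    glVertex2f(0.0f, 1.0f);
    glEnd();
}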

[QUOTE=robstoddard;1264219]All of the uniform locations are -1 when I check them. Mind you, this works on Linux and Windows just fine… but not on my mac. Now, something to be said, I’m in the middle of porting a lot of old OpenGL 2 code to OpenGL 3+, so I’ve not set a core profile. Might this be the cause?
[/QUOTE]
IIRC, Mac OS X offers a choice of a 2.1 compatibility profile or a 3.x core profile. Unlike Windows or Linux, there is no 3.x compatibility profile, so if you use a mix of 3.x features and legacy features, something is going to fail.

As a start, I’d suggest checking that the shaders compiled successfully, the program linked successfully, and that glGetError() isn’t reporting any errors.
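
In addition to the per-shader compile check, the link and error checks might look like this (a sketch; “prog” stands for your program object):

GLint linked = GL_FALSE;
glGetProgramiv(prog, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE) {
    char log[4096];
    glGetProgramInfoLog(prog, sizeof(log), NULL, log);
    fprintf(stderr, "program link log:\n%s\n", log);
}

for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
    fprintf(stderr, "GL error: 0x%04x\n", err);

A failed compile or link is also exactly what makes every glGetUniformLocation() call return -1.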
