View Full Version : OpenGL 3.2 in OSX 10.7



malexander
02-24-2011, 09:08 AM
According to macrumors (http://www.macrumors.com/2011/02/24/apple-releases-first-developer-preview-of-mac-os-x-lion/), OSX 10.7 will have OpenGL 3.2 support, due out sometime this summer. Fingers crossed for a stable, well performing implementation.

Groovounet
02-24-2011, 09:25 AM
WHouhouuuuu! :)

Alfonse Reinheart
02-24-2011, 11:24 PM
It's odd that they would pick 3.2 instead of 3.3, when all 3.2 capable hardware can also support 3.3.

What's going to be interesting is what extensions they support in addition to 3.2. The differences between 3.2 and 3.3 are many; ARB_explicit_attrib_location alone is a pretty big deal for how you write shaders.
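For example, with ARB_explicit_attrib_location (core only in 3.3), attribute slots move into the shader source itself instead of requiring glBindAttribLocation calls before linking:

```glsl
#version 330
// Locations are fixed in the shader; no glBindAttribLocation needed.
layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;
layout(location = 2) in vec2 uv;
```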

Groovounet
02-25-2011, 05:31 AM
Apple writes its own drivers, and my guess is 3.2 is all they'll have time to finalize before the Lion release. I don't think it's a choice.

If they had a choice, it would be OpenGL 4.1.

bugmenot
02-26-2011, 08:21 AM
Not sure where macrumors is getting its information from but actual user reports don't corroborate their story.

Lion OpenGL Support using NVIDIA 9400M 256MB:
https://skitch.com/fredericl/rt7mh/opengl-extensions-viewer

Lion OpenGL Support using NVIDIA 320M:
http://dl.dropbox.com/u/7755/glview-NVIDIA%20GeForce%20320M%20OpenGL%20Engine.xml

Lion OpenGL Support using GeForce 8800 GT:
http://twitter.com/opengl/status/40904330320027648

Lion OpenGL Support using Intel X3100:
http://netkas.org/?p=609#comment-152379

All the above NVIDIA GPUs support OpenGL 3.3 on Windows/Linux.
And yet they are clearly still stuck with GLSL 1.20 on Lion.

So there doesn't seem to be complete OpenGL 3.2 or 3.1 or 3.0 support in Lion.
Based on the screenshot in the first URL, it looks like OpenGL support is identical to that in Snow Leopard because that is what I am seeing on my 9400M Snow Leopard Mac.
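Reports like these are easy to sanity-check at runtime, since the strings glGetString returns (e.g. "2.1 ATI-7.0.52") start with a "major.minor" prefix. A minimal sketch of parsing that prefix — the helper name is mine, not part of any API:

```c
#include <stdio.h>

/* Sketch: extract the "major.minor" prefix from a GL_VERSION or
 * GL_SHADING_LANGUAGE_VERSION string such as "2.1 ATI-7.0.52".
 * Returns 1 on success, 0 if the string doesn't start with a version. */
static int parse_gl_version(const char *s, int *major, int *minor)
{
    return sscanf(s, "%d.%d", major, minor) == 2;
}
```

Feed it the GL_VERSION string from any of the reports above to compare what the driver actually exposes against what the hardware can do elsewhere.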

JoshKlint
02-26-2011, 11:47 AM
I think it's really pathetic. With the stability of Mac drivers they could take over high-end rendering. Although the hardware is capable of doing much more, our engine is going to use a renderer about on a level with Half-Life 2 on OSX. I don't trust OpenGL 2.1 + extensions to write a deferred renderer, especially when I know the Apple engineers are almost certainly not designing their branched version of OpenGL with anything advanced in mind.

If they want Mac to be the creative / artist computer, they should support modern rendering functionality!

Alfonse Reinheart
02-26-2011, 02:59 PM
Not sure where macrumors is getting its information from but actual user reports don't corroborate their story.

It's possible that Apple added new functions to the Apple-specific MacOSX OpenGL interface to create GL 3.x contexts. Something like WGL_ARB_create_context.

What makes this unlikely is that 3.2 introduced WGL/GLX_ARB_create_context_profile, which changed the wording of create_context. Basically, it lets you request a core or compatibility profile explicitly; if you don't use create_context at all, you simply get a compatibility context.

Now, it is "possible" that Apple is only implementing 3.2 core, in which case you would need something like create_context to get such a context. But that seems... unlikely.
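For reference, this is roughly what the create_context path looks like on Windows. The token values below are from the WGL_ARB_create_context(_profile) specs; the actual wglCreateContextAttribsARB call is omitted since it needs a live device context:

```c
/* Token values from the WGL_ARB_create_context(_profile) specifications. */
#define WGL_CONTEXT_MAJOR_VERSION_ARB    0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB    0x2092
#define WGL_CONTEXT_PROFILE_MASK_ARB     0x9126
#define WGL_CONTEXT_CORE_PROFILE_BIT_ARB 0x00000001

/* Attribute list requesting a 3.2 core profile context; this is what
 * you'd pass to wglCreateContextAttribsARB (call not shown). */
static const int core32_attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 2,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    0  /* terminator */
};
```

Whatever Apple exposes would be the CGL/NSOpenGL analogue of this attribute list.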

malexander
02-26-2011, 04:21 PM
Not sure where macrumors is getting its information from but actual user reports don't corroborate their story.

My guess is that Apple's GL 3.2 support isn't mature enough for inclusion in the developer seeds yet. It's due out in the summer, so they still have a bit of time.

Personally, I'd be happy with GLSL > 1.20, texture buffer objects, uniform buffer objects, primitive restart and multisample textures, in addition to the GL3 extensions Apple already has.


With the stability of Mac drivers they could take over high-end rendering.

Are you being ironic? Because I've had to work around over half a dozen OSX driver bugs in the past year (vs 2 apiece from Nvidia and AMD/ATI). Yes, part of that was due to a deferred renderer codepath :)

Alfonse Reinheart
02-26-2011, 05:13 PM
Because I've had to work around over half a dozen OSX driver bugs in the past year (vs 2 apiece from Nvidia and AMD/ATI).

Really? I've never done MacOSX development, but I always assumed that their OpenGL drivers were pretty stable. I'm curious to know what you had to work around.

JoshKlint
02-26-2011, 08:23 PM
Are you being ironic? Because I've had to work around over half a dozen OSX driver bugs in the past year (vs 2 apiece from Nvidia and AMD/ATI). Yes, part of that was due to a deferred renderer codepath :)
Did their drivers alter behavior at all during that entire time? I didn't say they were compliant. I said they were stable.

malexander
02-27-2011, 08:33 AM
I'm curious to know what you had to work around.

I work on a cross-platform CAD app (Windows, Linux, OSX) and have a Mac Pro triple-booted between the three (latest OSX version, Ubuntu 9.10 64b, Win7 64b). Starting with 10.6.3 we began encountering a lot of regressions, which continued until 10.6.5. Since OpenGL performance increased substantially between those releases, I'm guessing a lot of the issues were related to optimization work; things have calmed down a lot since then. Often a regression would affect either Nvidia hardware or AMD hardware, but rarely both.

Most of the issues appeared to be GLSL compiler or FBO related, ranging from crashes that required application features to be disabled, to minor issues that could be worked around by altering the shaders. Anyway, here are the ones I remember:

- Writing to more than one buffer in an FBO on Nvidia hardware produced a few random garbage pixels where polygons should have been, and nothing else; a crash would follow a frame or two later. In another case, the entire OS desktop would go into a "flicker fit", refreshing properly down to a random scanline that changed with each frame and displaying black below it; a reboot was required. At first I thought it was a bad card, but it was reproducible on other systems with Nvidia cards. AMD hardware on OSX was fine, as were Windows and Linux.

- gl_FrontFacing stopped working on AMD cards, and we had to roll our own frontfacing() function in GLSL.

- Accessing gl_LightSource[] in a constant loop didn't work for a while on Nvidia, requiring us to unroll it. An OSX version or two later, we had to consolidate the lighting loop into the same fragment shader as main(), otherwise some uniforms that were definitely used were missing from the active uniform list (the ones near the end of the list alphabetically, but within the max uniform limit).

- In the August graphics update, on AMD hardware I had to do a glGetTexImage on textures drawn to by an FBO render that weren't attached to GL_COLOR_ATTACHMENT0 (the depth attachment, color attachments 1+); otherwise they contained garbage when accessed by a shader later in the redraw. glFinish() didn't do the trick either; only reading pixels from the texture fixed the issue.

- Early last year, uniform arrays (uniform int foo[8]) were suddenly reported in the active uniform list as foo[0] rather than foo.

- In one fragment shader with an 'if (expr) discard;' statement, I had to also assign a dummy value to gl_FragColor before the discard, otherwise the fragment shader displayed black for all pixels.
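A sketch of that last workaround, with a hypothetical shouldDiscard uniform standing in for the real expression:

```glsl
#version 120
uniform bool shouldDiscard;

void main()
{
    if (shouldDiscard) {
        gl_FragColor = vec4(0.0); // dummy write; works around the driver bug
        discard;                  // a conforming compiler needs no write here
    }
    gl_FragColor = vec4(1.0);     // normal shading path (placeholder)
}
```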

We reported these issues to Apple as we tracked them down, and most of these issues were resolved within an update or two, but it was a pretty rocky road for a bit. I began to dread OSX updates :) None of these problems affected the same system booted into Linux or Windows using AMD or Nvidia's drivers.
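The uniform-array naming change above ("foo" vs "foo[0]") can at least be papered over on the application side by normalizing whatever glGetActiveUniform returns. A minimal sketch (the helper name is mine):

```c
#include <string.h>

/* Sketch: strip a trailing "[0]" so uniform lookups behave the same on
 * drivers that report array uniforms either way ("foo" vs "foo[0]"). */
static void strip_array_suffix(char *name)
{
    size_t len = strlen(name);
    if (len >= 3 && strcmp(name + len - 3, "[0]") == 0)
        name[len - 3] = '\0';
}
```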


Did their drivers alter behavior at all during that entire time? I didn't say they were compliant. I said they were stable.

Because of my experience, I couldn't tell if you were being straight up with your comment or not. So I asked :)

I don't believe Apple intentionally altered the GL behaviour, but it definitely altered the behaviour of our application. On the bright side, our app is no longer sluggish on OSX, as it was prior to 10.6.4. So, progress :)

bugmenot
03-01-2011, 07:00 AM
Some good news on the Mac OpenGL 3.2 Front at least for ATI:



NSOpenGLPixelFormatAttribute attributes[] = {
    NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
    NSOpenGLPFADoubleBuffer,                                 // double buffered
    NSOpenGLPFADepthSize, (NSOpenGLPixelFormatAttribute)16,  // 16-bit depth buffer
    (NSOpenGLPixelFormatAttribute)0
};

GL_VERSION = "3.2 ATI-7.0.52"
GL_SHADING_LANGUAGE_VERSION = "1.50"
GL_RENDERER= "ATI Radeon HD 4850 PRO OpenGL Engine"

from x2000gldriver:

ATI Technologies Inc.
2.1 ATI-7.0.52
3.2 ATI-7.0.52


http://netkas.org/?p=642

malexander
03-01-2011, 09:54 AM
I'm hoping there is a NSOpenGLProfileVersion3_2Compatibility as well, or similar attribute pair to request a compatibility profile.

przemo_li
07-21-2011, 11:08 AM
Apple is silent about compatibility mode.

But you can still use 2.1.
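For completeness, Lion's pixel format API does expose the 2.1 path explicitly via NSOpenGLProfileVersionLegacy; a sketch mirroring the earlier snippet:

```objc
// Requesting the legacy (2.1 + extensions) profile on Lion; this is the
// counterpart to NSOpenGLProfileVersion3_2Core shown earlier in the thread.
NSOpenGLPixelFormatAttribute attributes[] = {
    NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersionLegacy,
    NSOpenGLPFADoubleBuffer,
    NSOpenGLPFADepthSize, (NSOpenGLPixelFormatAttribute)16,
    (NSOpenGLPixelFormatAttribute)0
};
```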

malexander
07-22-2011, 01:21 PM
I'm pretty doubtful that we'll ever see a compatibility context from Apple. They prefer to keep developers current, it seems (64b apps were unavailable using the 32b-only Carbon API; if you needed 64b, you had to switch to Cocoa). I was hoping I could avoid a large sweep to update all the trivial rendering codepaths that used removed GL features, since all the other platforms we support have the compatibility extension.

JoshKlint
11-16-2011, 02:37 PM
FYI I got word back from the driver dev team, and apparently I am not sharing textures between contexts correctly. So AFAIK this will work, although I haven't implemented the fix yet.
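For anyone hitting the same wall: in Cocoa, object sharing has to be established at context creation time, and both contexts must use the same pixel format. A sketch, where fmt stands in for an already-created NSOpenGLPixelFormat:

```objc
// Sharing is set up via shareContext: at creation; it cannot be added to
// an existing context later. Both contexts use the same pixel format (fmt).
NSOpenGLContext *mainCtx = [[NSOpenGLContext alloc] initWithFormat:fmt
                                                      shareContext:nil];
NSOpenGLContext *loaderCtx = [[NSOpenGLContext alloc] initWithFormat:fmt
                                                        shareContext:mainCtx];
// Textures created in either context are then visible to the other
// (after the appropriate glFlush in the producing context).
```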