View Full Version : OpenGL 3.2 vs. 3.3?



JoshKlint
03-20-2011, 03:29 PM
What are the main differences between OpenGL 3.2 and 3.3?

Alfonse Reinheart
03-20-2011, 03:49 PM
The OpenGL specification has a whole section on what changed between each (recent) version.

Also, how do you define "main differences?"

In terms of actual functionality (doing something that wasn't possible on 3.2), the most substantive is dual-source blending. Maybe instanced arrays. In terms of how you use OpenGL however, I'd say the biggest change is explicit attribute locations. I would have also said sampler objects, but ATI hasn't bothered to implement it correctly yet, so I haven't been able to use it.
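
To be concrete, here's a rough sketch of the explicit-attribute-location difference (not from any real codebase; the attribute names and 'prog' are made up, and it assumes your GL loader has already pulled in the relevant entry points):

/* GL 3.3 (GLSL 3.30): the locations are baked into the shader source. */
static const char *vs_330 =
    "#version 330 core\n"
    "layout(location = 0) in vec3 position;\n"
    "layout(location = 1) in vec2 texcoord;\n"
    "void main() { gl_Position = vec4(position, 1.0); }\n";

/* GL 3.2 (GLSL 1.50): same mapping, but assigned through the API and
   only picked up at the next link. */
static void bind_attribs_gl32(GLuint prog)
{
    glBindAttribLocation(prog, 0, "position");
    glBindAttribLocation(prog, 1, "texcoord");
    glLinkProgram(prog);
}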

JoshKlint
03-20-2011, 05:37 PM
So all the attribute location stuff in GLSL 3.3 does not work the same way in OpenGL 3.2?

I'm asking because it looks like Apple is only going to support OpenGL 3.2 in OSX Lion, and I am using 3.3 as my main target.

Alfonse Reinheart
03-20-2011, 05:53 PM
So all the attribute location stuff in GLSL 3.3 does not work the same way in OpenGL 3.2?

Yes.


I'm asking because it looks like Apple is only going to support OpenGL 3.2 in OSX Lion, and I am using 3.3 as my main target.

Nobody ever said that Apple was blessed with a great amount of intellect. I suppose there's a chance they'll support ARB_explicit_attrib_location. But honestly, there's no reason to support 3.2 instead of 3.3, since all 3.2-capable hardware can do 3.3.

DarkGKnight
03-21-2011, 10:09 AM
Nobody ever said that Apple was blessed with a great amount of intellect ...

Isn't the main issue that Apple has to implement software paths for hardware that is incapable of full OpenGL 3 support (like older Intel chipsets)? Mesa has yet to include OpenGL 3.0 support, so either it isn't a trivial task or their developers are lazy.

V-man
03-21-2011, 01:05 PM
What would be the point of providing software emulation of GL 3 for non-capable Intel parts? That doesn't make any sense.

No, they shouldn't.

DarkGKnight
03-21-2011, 02:23 PM
Hasn't this always been the case with OpenGL implementations? Whether it is texture formats that aren't natively supported by the hardware, or GLSL features (such as if-statements on Shader Model 2.0 hardware), the only indication that a feature wasn't natively supported was when it ran slower.

I'm referring to the Intel parts that have D3D10 drivers, e.g. Intel HD Graphics, but are limited to OpenGL 2.1 support on Windows. Theoretically, it should be possible for these parts to support OpenGL 3, but Intel has opted to limit them to OpenGL 2.1.

Alfonse Reinheart
03-21-2011, 02:28 PM
Hasn't this always been the case with OpenGL implementations?

In this case, there's no need to. Just say that non-GL 3.0 hardware can't run GL 3.0. It's really that simple. You don't see GeForce 6000's providing GL 3.0 support, because they can't.

Apple is under no requirement to provide GL 3.0 on hardware that simply can't do it.

DarkGKnight
03-21-2011, 02:35 PM
As I said, I was referring to cards that have D3D10 capability, but not GL3 drivers from their manufacturer.

kRogue
03-21-2011, 03:31 PM
Isn't the main issue that Apple has to implement software paths for hardware that is incapable of full OpenGL 3 support (like older Intel chipsets)? Mesa has yet to include OpenGL 3.0 support, so either it isn't a trivial task or their developers are lazy.


Take a look at http://developer.apple.com/graphicsimaging/opengl/capabilities/; you will notice that some of the hardware in that matrix only does GL 1.4, so I don't *think* that software emulation of GL3 features for non-GL3-capable parts is an issue.

What is frightening is that Apple's new MacBooks have a GL 4-capable part, but Apple is still (as of this very moment) only supporting up to GL 2.1. Makes me wonder in a dark way :whistle:

Though, you should be able to put Ubuntu (or another Linux distribution) in parallel with Mac OS X (you'll likely need Boot Camp or such to get the dual boot to work out well); once there, you can get the ATI (or NVIDIA) binary drivers and unlock the hardware for GL programming (though then no Xcode).

DarkGKnight
03-21-2011, 04:05 PM
Take a look at http://developer.apple.com/graphicsimaging/opengl/capabilities/

Do you notice that Apple includes a software renderer in that matrix?

V-man
03-21-2011, 04:36 PM
Seems like the "software renderer" is something that you select during context creation.
Furthermore, the "software renderer" is probably just for testing and comparing the output with a hardware renderer.

If you were to insert some old card such as a GeForce 2 (if it were possible), the Mac would expose the GeForce 2 as a GL 1.2 device and the software renderer as a GL 3.2 device.
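
If I recall correctly, you pick it at pixel format selection time via CGL, something roughly like this (untested sketch; error checking omitted, and the attribute values are just plausible defaults):

#include <OpenGL/OpenGL.h>
#include <OpenGL/CGLRenderers.h>

/* Create a context on Apple's software renderer instead of the GPU. */
static CGLContextObj create_software_context(void)
{
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFARendererID, (CGLPixelFormatAttribute) kCGLRendererGenericFloatID,
        kCGLPFAColorSize,  (CGLPixelFormatAttribute) 24,
        kCGLPFADepthSize,  (CGLPixelFormatAttribute) 16,
        (CGLPixelFormatAttribute) 0
    };
    CGLPixelFormatObj pix = NULL;
    CGLContextObj     ctx = NULL;
    GLint             npix = 0;

    CGLChoosePixelFormat(attribs, &pix, &npix);
    CGLCreateContext(pix, NULL, &ctx);
    CGLDestroyPixelFormat(pix);
    return ctx;  /* make current with CGLSetCurrentContext(ctx) */
}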

DarkGKnight
03-22-2011, 12:08 AM
Seems like the "software renderer" is something that you select during context creation.
Furthermore, the "software renderer" is probably just for testing and comparing the output with a hardware renderer.
Similar to Direct3D's Reference device. However, it brings home the point that Apple can't simply support the latest version of OpenGL without an equivalent software renderer.

kRogue
03-22-2011, 02:06 AM
Similar to Direct3D's Reference device. However, it brings home the point that Apple can't simply support the latest version of OpenGL without an equivalent software renderer.


No; the fact that there are hardware devices that only support GL 1.4 means it is okay. But in all brutal honesty, only a madman would ship an application that uses the software renderer.

Additionally, I highly doubt the idea of using the software implementation to test against a hardware implementation, for the simple fact that a software implementation would just have different bugs/issues than a hardware renderer.

Though, if I remember correctly, under MS Windows there is a set of registry keys to give NVIDIA pre-GL3 cards GL3 features via software emulation... I stress, though, that this is I *think*, not I know.

Ilian Dinev
03-22-2011, 02:57 AM
Though, if I remember correctly, under MS Windows there is a set of registry keys to give NVIDIA pre-GL3 cards GL3 features via software emulation... I stress, though, that this is I *think*, not I know.

http://developer.nvidia.com/object/nvemulate.html

Works as advertised on GF6100 & GF7600, and even 8x00+

DarkGKnight
03-22-2011, 03:10 AM
No; the fact that there are hardware devices that only support GL 1.4 means it is okay. But in all brutal honesty, only a madman would ship an application that uses the software renderer.



Whatever reasons Apple had for including a software renderer for OpenGL 2.1 in Leopard will be the same reasons for them to include a software renderer for OpenGL 3.2 in Lion.

Alfonse Reinheart
03-22-2011, 04:00 AM
However, it brings home the point that Apple can't simply support the latest version of OpenGL without an equivalent software renderer.

Yes. But that doesn't change the facts that:

1: OpenGL 3.3 has been out for a year, and Apple sits on the ARB. They've known the scope of GL 3.3 for longer than we have.

2: The differences between 3.2 and 3.3 aren't that big. Certainly nothing like the differences between, say, 2.1 and 3.2, which is what they're having to do now.

3: There's no rule that says that their software renderer absolutely must support the maximum GL version they allow.

So the "needs a software renderer" thing is not really an excuse.

DarkGKnight
03-22-2011, 04:53 AM
1. Intel sits on the ARB, and their GL version for Sandy Bridge is 3.0. In another thread, there is a complaint that AMD's OpenGL 3.3 support is buggy/incomplete, even though their driver advertises 4.1 support.

2. It is easy to type "the differences between 3.2 and 3.3 aren't that big", but we aren't writing the implementations. I'm sure you would agree that it's better to have a version 3.2 that is fully functional than a version 3.3 that is buggy.

3. I suspect that is the reason why Apple's current OpenGL implementation is still version 2.1, even on newer hardware. Given that OpenGL is a system component on Macs (like DX on Windows), they have to ensure that the implementation works as expected (i.e. write the software implementation themselves), rather than outsource the work to IHVs, who will tailor the implementation to their hardware.

We can only speculate what factors Apple uses to determine which version of OpenGL goes into their Operating System. It would have been great if they included 3.3, and it may appear in a future update of Lion, but I suspect that even if they did include version 3.3 at launch, the current argument could be used to state that they should have included version 4.0 ...

Alfonse Reinheart
03-22-2011, 05:36 AM
Intel sits on the ARB, and their GL version for Sandy Bridge is 3.0. In another thread, there is a complaint that AMD's OpenGL 3.3 support is buggy/incomplete, even though their driver advertises 4.1 support.

The failure of others to properly do their job does not absolve Apple from failing to properly do theirs.


It is easy to type "the differences between 3.2 and 3.3 aren't that big", but we aren't writing the implementations. I'm sure you would agree that it's better to have a version 3.2 that is fully functional than a version 3.3 that is buggy.

1: See above.

2: We are talking about a software implementation, which I personally would never use. So I can't exactly say that I would much care.


Given that OpenGL is a system component on Macs (like DX on Windows), they have to ensure that the implementation works as expected (i.e. write the software implementation themselves), rather than outsource the work to IHVs, who will tailor the implementation to their hardware.

And yet, Microsoft of all people has somehow managed to not only have a D3D 10 implementation (approximately equal to GL 3.3), but also a D3D 11 implementation (~GL 4.1).


but I suspect that even if they did include version 3.3 at launch, the current argument could be used to state that they should have included version 4.0 ...

Except that one of the arguments used was specifically that 3.2 isn't that different from 3.3 (which is true). 4.0 has pretty substantial differences from 3.3. It introduces 2 brand new shader stages, among various other things.

I would also point out that the 2.1 + extensions that Apple supports is already functionally equivalent to 3.2. Sure, UBOs are much better than bindable-uniform, and the core geometry shader spec is much nicer to use than the extension version, but all of the actual functionality is there one way or another. So implementing 3.2 really offers no new functionality to users; just a nicer API.

So it's not exactly stressful on their development department to simply change the APIs for accessing functionality, is it? Sure, the bindable uniform to UBO change will take some coding, but far less than if they were implementing UBOs from scratch. And so on.
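
To give a sense of scale, the whole UBO setup on the application side is just a handful of calls (sketch; 'prog', the block name "PerFrame", and 'mvp' are made up, and a GL 3.1+ context with a loader is assumed):

/* Assumes an existing linked program 'prog' whose GLSL declares:
       uniform PerFrame { mat4 mvp; };  */
static GLuint setup_per_frame_ubo(GLuint prog, const float mvp[16])
{
    GLuint ubo;
    glGenBuffers(1, &ubo);
    glBindBuffer(GL_UNIFORM_BUFFER, ubo);
    glBufferData(GL_UNIFORM_BUFFER, 16 * sizeof(float), mvp, GL_DYNAMIC_DRAW);

    /* Tie the named block to binding point 0, then bind the buffer there. */
    GLuint block = glGetUniformBlockIndex(prog, "PerFrame");
    glUniformBlockBinding(prog, block, 0);
    glBindBufferBase(GL_UNIFORM_BUFFER, 0, ubo);
    return ubo;
}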

I don't know why you're trying to make excuses for Apple's behavior here. We all know why Apple hasn't updated OpenGL recently: iOS. MacOSX isn't a priority for them anymore.

DarkGKnight
03-22-2011, 07:03 AM
And yet, Microsoft of all people has somehow managed to not only have a D3D 10 implementation (approximately equal to GL 3.3), but also a D3D 11 implementation (~GL 4.1).

Microsoft sets the standard for 3D graphics on the desktop/laptop. Once you set the standard (D3D10/11), copycats (OGL 3.X/4.X) will have to wait months, if not years, to catch up in terms of implementation quality. Look at the iPad vs its competition.


We all know why Apple hasn't updated OpenGL recently: iOS. MacOSX isn't a priority for them anymore.
Money is a motivating factor for any venture, and most commercial OpenGL developers are probably targeting the iOS platform.

I was hoping for an announcement of an OpenGL ES 3.X spec at this year's GDC, but it doesn't appear to be on anyone's radar. Maybe we have to wait for Microsoft to bring such D3D10 features to the next version of Windows Phone before Khronos makes a move.

malexander
03-22-2011, 08:45 AM
I would imagine that Apple has announced OpenGL 3.2 support because that's what the current 10.7 development seeds have support for. Apple isn't known for releasing details of future products. I wouldn't hold my breath for GL 3.3 though - besides GL features, they still have some catch-up to do regarding GL performance.

Apple's aware of the fact that they've let up on OSX development in the past few years, as is evidenced by their Back to the Mac (http://www.engadget.com/2010/10/20/live-from-apples-back-to-the-mac-event/) event last October. Hopefully they mean it.

Alfonse Reinheart
03-22-2011, 11:27 AM
Microsoft sets the standard for 3D graphics on the desktop/laptop. Once you set the standard (D3D10/11), copycats (OGL 3.X/4.X) will have to wait months, if not years, to catch up in terms of implementation quality.

And yet, that didn't stop NVIDIA from shipping D3D-10-level extensions pretty much the day the GeForce 8000's hit the market.


I was hoping for an announcement of an OpenGL ES 3.X spec at this year's GDC, but it doesn't appear to be on anyone's radar. Maybe we have to wait for Microsoft to bring such D3D10 features to the next version of Windows Phone before Khronos makes a move.

In the mobile space, there's a general stagnation about hardware features. In desktops, performance generally pushed features; take the Radeon 9700. It sold well not because it ushered in the age of D3D9-class cards, but because it was so much faster than the competition at the time.

In the mobile space, power consumption rules all. Any new features you add take up die space and power, so unless new battery technology happens that suddenly gives you lots more power, you're not going to see significant jumps in features happening that often. The mobile space favors slow evolution of hardware.

Combine this with the simple utility of extensions. The base ES 2.0 platform represents some core functionality, but most ES platforms also expose a number of extensions. This represents the ability to select features as needed, allowing your engine to be responsive to different featuresets on different hardware.

All an ES 3.0 would do is specify a new base featureset. Desktop OpenGL can get away with that because Microsoft is out there defining specific levels of hardware. But in the mobile space, you have PowerVR and... well, that's it. NVIDIA's trying to get involved with Tegra, but they're still basically following PowerVR's featureset.

Or, to put it another way, there's really no need for ES 3.0. The most it might do is standardize existing practice (promoting to core a commonly used set of extensions, like FBO). This would have little effect on users.

DarkGKnight
03-22-2011, 12:08 PM
And yet, that didn't stop NVIDIA from shipping D3D-10-level extensions pretty much the day the GeForce 8000's hit the market.

When you set the standard (i.e. the first manufacturer with D3D10 hardware), you are able to do things like that. OpenGL 3 was based around D3D10, not the other way around. Thus, when NVIDIA exposed D3D10 functionality to OpenGL in 2006, Khronos simply had to say: that extension/functionality is required for OpenGL 3 (late 2007).


Or, to put it another way, there's really no need for ES 3.0. The most it might do is standardize existing practice (promoting to core a commonly used set of extensions, like FBO). This would have little effect on users.
Actually, I'd like to see support for integer textures (EXT_texture_integer) in the mobile space. That is one OGL3/D3D10 feature that would be useful.

EDIT: I just noticed that the PowerVR's SGX545 (http://www.imgtec.com/news/Release/index.asp?NewsID=516) has support for OGL3.2/D3D10.1 on embedded chipsets. Until the iPhone/iPad starts using it, those features won't migrate into the mobile space.

kyle_
03-22-2011, 01:26 PM
Actually, I'd like to see support for integer textures (EXT_texture_integer) in the mobile space. That is one OGL3/D3D10 feature that would be useful.

May I ask what's so appealing about integer textures, and which formats specifically?
I mean, the [u]int32 variants are pretty much all useless since the introduction of floatBitsToInt (which, as far as I know, can be implemented everywhere true ints are supported).
Of course extra sugar doesn't hurt, but it doesn't really enable anything either.

DarkGKnight
03-22-2011, 01:54 PM
May I ask what's so appealing about integer textures, and which formats specifically?

When trying to retrieve the original integer value of a 16-bit texel in a shader, where the value has been normalized to a float (via glTexImage*D), converting it back to an integer leads to rounding errors. The 16-bit integer format GL_R16UI/DXGI_FORMAT_R16_UINT prevents this issue.
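
Something along these lines (sketch; 'values' and the helper name are made up, and a GL3/D3D10-class context with a loader is assumed):

/* 'values' is a hypothetical width*height array of 16-bit IDs. */
static GLuint make_integer_lut(int width, int height, const GLushort *values)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* GL_R16UI + GL_RED_INTEGER: the shader sees the exact 16-bit values,
       no float normalization and no rounding on the way back. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, width, height, 0,
                 GL_RED_INTEGER, GL_UNSIGNED_SHORT, values);
    /* Integer textures cannot be filtered; use nearest sampling. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    return tex;
}
/* GLSL 1.30+ side, inside main():
       uniform usampler2D lut;
       uint id = texelFetch(lut, ivec2(gl_FragCoord.xy), 0).r;  */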

Alfonse Reinheart
03-22-2011, 02:33 PM
When you set the standard (i.e. the first manufacturer with D3D10 hardware), you are able to do things like that. OpenGL 3 was based around D3D10, not the other way around.

And Apple helped set the OpenGL 3.3 standard (among others); they're on the ARB. Therefore by your own logic, they should have been able to have a 3.3 implementation out there before now.


Actually, I'd like to see support for integer textures (EXT_texture_integer) in the mobile space. That is one OGL3/D3D10 feature that would be useful.

My point is that you don't need a GL ES 3.0 specification for that. All you need is an ES version of EXT_texture_integer. Collecting functionality into an ES 3.0 doesn't have any more meaning than just checking for EXT_texture_integer. Mobile hardware simply isn't grouped into nice levels of functionality the way D3D provides for desktop hardware.
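
i.e. nothing more than the usual whole-token check against the extension string, something like this (sketch; I'm using the desktop extension name as a stand-in, since an ES version of it doesn't exist yet):

#include <string.h>
#include <GLES2/gl2.h>

/* Returns 1 if 'name' appears as a whole token in the GL_EXTENSIONS string. */
static int has_extension(const char *name)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);
    while (ext) {
        const char *hit = strstr(ext, name);
        if (!hit)
            return 0;
        if ((hit == ext || hit[-1] == ' ') && (hit[len] == ' ' || hit[len] == '\0'))
            return 1;
        ext = hit + len;
    }
    return 0;
}

/* e.g. if (has_extension("GL_EXT_texture_integer")) { take the integer-texture path } */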


When trying to retrieve the original integer value of a 16-bit texel in a shader, where the value has been normalized to a float (via glTexImage*D), converting it back to an integer leads to rounding errors. The 16-bit integer format GL_R16UI/DXGI_FORMAT_R16_UINT prevents this issue.

I think he's wondering what you need integers for to begin with. That is, what you're doing that you need integer lookup tables.

DarkGKnight
03-22-2011, 03:37 PM
And Apple helped set the OpenGL 3.3 standard (among others);

That is the problem. We've seen that Microsoft's dictator-style approach has made them the current innovator of 3D graphics on the PC. If Apple were in Microsoft's position, we would be having previews of OpenGL 5 in the latest preview of Lion.



I think he's wondering what you need integers for to begin with. That is, what you're doing that you need integer lookup tables.
Emulating old hardware.

The reason I refer to the mythical specification as OpenGL ES 3.X is because it would be a subset of OpenGL 3.X functionality, similar to how ES 2.0 is a subset of GL 2.0 functionality.

Subsuming an extension into a core specification is the only sure way that it will be supported by multiple manufacturers. It is easier to list OpenGL ES 3.X as requirement, rather than EXT_texture_integer.

Zenja
04-07-2011, 03:36 AM
The ironic bit about OSX OpenGL > 3.x support is that the iPhone developer kit (and simulator) supports OpenGL ES 2.0, which is 90% of the OpenGL 3.2 core profile. Snow Leopard already renders that in the iPhone simulator. So Apple has a running 3.x OpenGL stack on OS X Snow Leopard. But only for iOS software.

kRogue
04-07-2011, 07:01 AM
The ironic bit about OSX OpenGL > 3.x support is that the iPhone developer kit (and simulator) supports OpenGL ES 2.0, which is 90% of the OpenGL 3.2 core profile. Snow Leopard already renders that in the iPhone simulator. So Apple has a running 3.x OpenGL stack on OS X Snow Leopard. But only for iOS software.


You could not be more wrong. OpenGL ES 2.0 is basically a stripped-down version of OpenGL 2.1 with limited FBO support. The following GL3 features are not in OpenGL ES 2.0: integer texture support, native integers in shaders (including bit operations), texture buffer objects, uniform buffer objects, multiple render targets, floating-point textures, transform feedback, instancing... actually, what I am doing here is basically listing everything that is in GL3 that is not in GL2.

There is only one feature in GLES2.0 that is not in GL2.1: render to texture, and even then, the unextended GLES2 spec only allows one to render to mipmap level 0.

So just to repeat:


OpenGL ES 2.0 = Limited FBO + GL2.1 - fixed-function pipeline - complete non-power-of-2 texture support - most read-back from GL


[In unextended GLES2, non-power-of-2 textures are supported with no mipmapping and limited texture wrap modes.]

[GLES2 includes no public API points to read directly from a texture or a buffer object; in fact, mapping a buffer object is an extension in GLES2, and even then it is write-only. The only realistic way to read image data from a texture in GLES2 is via FBOs + glReadPixels, but that will only work for RGB and RGBA textures.]

What you can do with GLES2 (if you are into that kind of thing) is write a GLES2 application and it will mostly work in a GL3 core profile... but that is not really true either, as GLES2 allows one to source index and attribute data from client-side memory (i.e. sourcing from buffer objects is not required).
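
For reference, the level-0-only render-to-texture plus the glReadPixels readback looks roughly like this in unextended GLES2 (sketch; error checking mostly omitted, sizes are arbitrary):

#include <stdlib.h>
#include <GLES2/gl2.h>

static void render_to_texture_and_read_back(void)
{
    GLuint tex, fbo;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); /* no mipmaps */

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    /* The level argument must be 0 in unextended GLES2. */
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
        /* ... draw into the texture here ... */
        GLubyte *pixels = (GLubyte *) malloc(256 * 256 * 4);
        glReadPixels(0, 0, 256, 256, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        /* ... use pixels ... */
        free(pixels);
    }
}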

V-man
04-07-2011, 07:35 AM
GL ES 2.0 is GL ES 1.1 + GLSL.

GL ES 1.1 is basically a stripped-down version of GL 1.5.

Even the GLSL part is stripped down. The built-in crap is gone. Simplifies the driver a lot.

Xmas
04-08-2011, 11:16 AM
GL ES 2.0 is GL ES 1.1 + GLSL.
That's a pretty bad description, as ES 2.0 is not backwards compatible with ES 1.1. The fixed-function pipeline is gone, among other things, while more than GLSL was added.

kRogue
04-08-2011, 01:09 PM
Like I said,



OpenGL ES 2.0 = OpenGL 2.1 + LimitedFBO - FixedFunctionPipeline - MipmapSupportForNonPower2Textures - MostReadBackSupport