View Full Version : Intel and multisampling



Aleksandar
09-21-2012, 08:52 AM
Well, I knew that Intel had problems with graphics cards and OpenGL, but this is ridiculous.
Neither the G41 Express nor the HM55 supports multi-sampling!!!

May I remind you that GL_ARB_multisample is part of the GL 1.3 specification? :(
Applications look awful on both adapters. Is there any workaround?

I also need confirmation that the Intel i5-540M does not have a graphics core integrated into the processor.
Currently it uses the HM55 chipset graphics.

Another warning: the latest 3rd-generation Intel Core processors with Intel HD Graphics 2500 support only GL 3.3!
To be more precise, the latest drivers (8.15.10.2761) do not support:
- GL_ARB_texture_mirrored_repeat (GL1.4)
- GL_EXT_gpu_shader4 (GL3.0)
- GL_NV_depth_buffer_float (GL3.0)
- GL_NV_half_float (GL3.0)
- GL_EXT_texture_compression_rgtc (GL3.0)
- GL_EXT_framebuffer_sRGB (GL3.0)
- GL_ARB_texture_buffer_object (GL3.1)
- GL_ARB_shading_language_include (GL3.3)
- GL_ARB_texture_swizzle (GL3.3)

Well, almost GL 3.3. :(

Alfonse Reinheart
09-21-2012, 12:13 PM
Can you clarify how it does not "supports multi-sampling!!!"?


GL_ARB_shading_language_include (GL3.3)

That is not part of 3.3, or any other core OpenGL version.

malexander
09-21-2012, 01:00 PM
The GL spec gives a minimum value for sample buffers of 0, so I think all that is required of the implementation is that it support the multisample interface (GL_MULTISAMPLE, GL_SAMPLES, etc). Does it throw bad enum errors?

Every time I get my hands on a new iGPU from Intel I fire up our app and see what's amiss. With the latest driver for the HD4000 (i7 3570), I got a crash using glGetActiveUniformsiv() (switched to the older glGetActiveUniformName()), then ran into an issue where it wouldn't attach a single-channel texture using an FBO. After promoting it to RGBA, things were better than previous attempts (with an i5 661 2 years ago), but there were still obvious rendering errors. The situation seems to be improving, albeit slowly.

arekkusu
09-21-2012, 01:19 PM
The GL spec gives a minimum value for sample buffers of 0.

SAMPLE_BUFFERS and SAMPLES are framebuffer-dependent state, so can certainly be zero.
However, 3.0 and later require MAX_SAMPLES to be at least 4.

Any renderer claiming GL_VERSION = 3.3 needs to support multisampling, along with the core functionality that was promoted from EXT_gpu_shader4, etc etc (even if those extension strings aren't exported.)
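That rule can be written down as a quick conformance check. This is only a sketch with helper names of my own; in a real program the inputs would come from `glGetIntegerv(GL_MAJOR_VERSION, ...)`, `glGetIntegerv(GL_MINOR_VERSION, ...)` and `glGetIntegerv(GL_MAX_SAMPLES, ...)`:

```c
#include <assert.h>

/* Sketch: does the reported context version obligate the driver to
 * support multisampled rendering?  GL 3.0 raised the minimum value of
 * MAX_SAMPLES to 4; before that, SAMPLE_BUFFERS/SAMPLES were allowed
 * to be zero for every framebuffer. */
int multisample_is_mandatory(int major, int minor)
{
    (void)minor;            /* the cutoff is 3.0, so minor doesn't matter */
    return major >= 3;
}

/* Combine with the actual queried limit to spot a non-conformant driver. */
int driver_conforms(int major, int minor, int max_samples)
{
    if (multisample_is_mandatory(major, minor))
        return max_samples >= 4;   /* 3.0+ requires MAX_SAMPLES >= 4 */
    return 1;                      /* pre-3.0: zero samples is legal */
}
```

So a driver reporting GL 2.1 with no multisample formats is (just barely) conformant, while the same behavior behind a 3.3 version string would not be.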

Aleksandar
09-22-2012, 06:47 AM
Can you clarify how it does not "supports multi-sampling!!!"?
1. GL_ARB_multisample is not in the extension list.
2. There is no pixel format that supports multisampling (I tried every WGL_SAMPLES_ARB value from 16 down to 0).
3. Nor does Qt manage to draw anti-aliased lines using OpenGL.
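As an aside on point 1: when scanning the extension list, a bare strstr can report false positives, because one extension name can be a prefix of another. A minimal tokenized lookup, assuming a space-separated string as returned by glGetString(GL_EXTENSIONS); the function name is my own:

```c
#include <assert.h>
#include <string.h>

/* Search a space-separated extension string for an exact token, so a
 * query like "GL_ARB_multisample" is not falsely matched inside a
 * longer, similarly-prefixed name. */
int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        /* Match only if bounded by start/space on the left
         * and space/end-of-string on the right. */
        int left_ok  = (p == ext_list) || (p[-1] == ' ');
        int right_ok = (p[len] == ' ') || (p[len] == '\0');
        if (left_ok && right_ok)
            return 1;
        p += len;
    }
    return 0;
}
```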


That is not part of 3.3, or any other core OpenGL version.
Technically you are right, but this extension was published alongside GL 3.3 and is exposed by GL 3.3 drivers on NV cards.

malexander, did you manage to draw a "smooth" (anti-aliased) scene on a pre-HD4000 Intel card?

Another interesting observation: the Intel GMA 3150 with the 8.14.10.2230 driver supports only OpenGL 1.4! (I would call it GL 1.5, since only GL_ARB_occlusion_query is missing, but the driver reports 1.4.) Notebookcheck (http://www.notebookcheck.net/Intel-Graphics-Media-Accelerator-3150.23264.0.html) claims there are 2 pixel shader units and no vertex shader units, while Wikipedia claims there are two unified shaders. The truth is that the GMA 3150 does not support GL shaders at all.

Debug output (ARB_debug_output) is probably supported only on the HD4000 series; on the HD2500 it is not. Although the driver package is the same for both the HD4000 and the HD2500, the HD2500 lacks even many features from pre-GL 3.3 versions.

mhagain
09-22-2012, 07:24 AM
I've found with older Intels that they actually do support the GL_ARB_vertex_program and GL_ARB_fragment_program extensions, so the GMA 3150 most likely does too.

Interesting note re: occlusion queries. Technically this part can support them and report GL1.5 since the spec allows for QUERY_COUNTER_BITS to be 0 - this was actually explicitly added to the spec to enable Intel to report 1.5 support - see http://www.opengl.org/archives/about/arb/meeting_notes/notes/meeting_note_2003-06-10.html#oglnext
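In other words, a 1.5 context may expose the occlusion-query entry points while every query result is meaningless. A defensive sketch (the helper name is mine; in practice the input would come from glGetQueryiv(GL_SAMPLES_PASSED, GL_QUERY_COUNTER_BITS, &bits)):

```c
#include <assert.h>

/* The GL 1.5 spec allows QUERY_COUNTER_BITS to be 0: glBeginQuery and
 * glGetQueryObjectiv still "work", but every result is 0 and carries
 * no information.  Treat occlusion queries as absent in that case and
 * fall back to rendering without them. */
int occlusion_queries_usable(int query_counter_bits)
{
    return query_counter_bits > 0;
}
```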

malexander
09-22-2012, 07:48 PM
malexander, did you succeed to draw “smooth” scene on pre-HD4000 Intel’s card?

I don't recall it being stable enough at the time to check. Now that CPU is running Ubuntu + Mesa, which does support GL_ARB_multisample. I'd have to do a bit more investigating to see how many samples (if any) it supports. I unfortunately don't have that system dual-booted with Windows, so it'd be very difficult to check Intel's driver.

The HD4000 with the latest drivers (May '12) supports GL_ARB_multisample and multisample textures with 8 depth/color samples, FWIW. It's been a while since I used OS framebuffer multisampling, so I'm not certain the GL_SAMPLES value is the same.

Aleksandar
09-24-2012, 01:14 PM
I've found with older Intels that they actually do support the GL_ARB_vertex_program and GL_ARB_fragment_program extensions, so the GMA 3150 most likely does too.

I thought it was up to the OpenGL drivers, but D3D also cannot set up multisampling on the GMA 3150. Only D3DMULTISAMPLE_NONE!
GPU-Z reports 2 unified shaders with 9.0c/SM3.0, but the D3D HAL reports software vertex processing. So those are not unified shader units at all.
Conclusion: the GMA 3150 is moldy.

I'll repeat the tests on the rest of Intel's GPUs. So far, GL works more or less correctly only on the HD2500 and newer.


Interesting note re: occlusion queries. Technically this part can support them and report GL1.5 since the spec allows for QUERY_COUNTER_BITS to be 0 - this was actually explicitly added to the spec to enable Intel to report 1.5 support - see http://www.opengl.org/archives/about/arb/meeting_notes/notes/meeting_note_2003-06-10.html#oglnext
What’s the point of that?

mhagain
09-24-2012, 04:43 PM
I thought it was up to the OpenGL drivers, but D3D also cannot set up multisampling on the GMA 3150. Only D3DMULTISAMPLE_NONE!

GPU-Z reports 2 unified shaders with 9.0c/SM3.0, but the D3D HAL reports software vertex processing. So those are not unified shader units at all.
Conclusion: the GMA 3150 is moldy.
That more or less matches my prior experience with Intel. One could argue that they are "unified" as far as what runs on the hardware is concerned, but that would be stretching credibility a little. On an HD4000 I get 8 samples with 1 quality level, GL 3.3 and D3D11; but generally only the HD series is anywhere near decent - the GMA parts are incredibly basic.


What’s the point of that?

Precisely.

Aleksandar
09-28-2012, 04:45 PM
Overview of Intel's GL support

I have to apologize in advance for the length of this post, but I wanted to summarize what I've discovered over the past few days.

===========================================
Intel GMA 3150 (Pineview)
Release: Q1'10 - Atom N4xx/N5xx - 3rd generation Intel GPU
Shader model: 3.0(sw)/2.0 (2PU)
-------------------------------------------
OpenGL version: 1.4.0 - Build 8.14.10.2230
Renderer: Intel Pineview Platform
Date: 24.10.2010. (latest)
-------------------------------------------
OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 66% - 2/3 ]
OpenGL 2.0: [ 11% - 1/9 ]
OpenGL 2.1: [ 0% - 0/2 ]
OpenGL 3.0: [ 0% - 0/21 ]
OpenGL 3.1: [ 0% - 0/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 23% - 33/143 ]

===========================================
Intel(R) G41 Express (GMA X4500)
Release: Q3'08 - 4th generation Intel GPU
Shader model: 4.0 (10PU)
-------------------------------------------
OpenGL version: 2.1.0 - Build 8.15.10.1986
Renderer: Intel(R) G41 Express Chipset
-------------------------------------------
OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 23% - 5/21 ]
OpenGL 3.1: [ 28% - 2/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 36% - 51/143 ]
-------------------------------------------
OpenGL version: 2.1.0 - Build 8.15.10.2555
Renderer: Intel(R) G41 Express Chipset
Date: 19.10.2011. (latest)
-------------------------------------------
OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 28% - 2/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 46% - 66/143 ]

===========================================
Intel GMA HD - i5 540M
Release: Q1'10 - Ironlake - 5th generation Intel GPU
Shader model: 4.0 (6PU)
-------------------------------------------
OpenGL version: 2.1.0 - Build 8.15.10.2021
Renderer: Intel(R) Graphics Media Accelerator HD
Date: 30.12.2009.
-------------------------------------------
OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 23% - 5/21 ]
OpenGL 3.1: [ 28% - 2/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 36% - 51/143 ]
-------------------------------------------
OpenGL version: 2.1.0 - Build 8.15.10.2622
Renderer: Intel(R) HD Graphics
Date: 21.01.2012. (previously released)
-------------------------------------------
OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 85% - 6/7 ]
OpenGL 3.2: [ 55% - 5/9 ]
OpenGL 3.3: [ 10% - 1/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 53% - 76/143 ]
-------------------------------------------
OpenGL version: 2.1.0 - Build 8.15.10.2827
Renderer: Intel(R) HD Graphics
Date: 10.08.2012. (latest)
-------------------------------------------
OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 85% - 6/7 ]
OpenGL 3.2: [ 55% - 5/9 ]
OpenGL 3.3: [ 10% - 1/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 53% - 76/143 ]

===========================================
Intel HD 2500
Release: Q1'12 - Ivy Bridge - 7th generation Intel GPU
Shader model: 5.0 (6PU)
-------------------------------------------
OpenGL version: 3.3.0 - Build 8.15.10.2761
Renderer: Intel(R) HD Graphics
Date: 24.05.2012. (latest)
-------------------------------------------
OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 100% - 9/9 ] Multisampling!!! (finally)
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 85% - 6/7 ]
OpenGL 3.2: [ 100% - 9/9 ]
OpenGL 3.3: [ 80% - 8/10 ]
OpenGL 4.0: [ 23% - 3/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 64% - 91/143 ]

Conclusions:
- The GMA 3150 is worse than a GeForce 2! Only Intel could afford to release a 3rd-generation GPU after introducing its 5th-generation parts.
- Intel still supports its 4th-generation GPUs. Three years after the first release, new drivers bring 8 new extensions.
- Although Wikipedia claims the HD 2500/4000 have supported GL 4.0 since driver 2729, that is not what I'm seeing. I need confirmation for the HD 4000.

Can anybody post statistics for HD 4000 (Ivy Bridge) and/or HD 2000/3000 (Sandy Bridge)?

Alfonse Reinheart
09-28-2012, 07:09 PM
Did you look at what extensions were missing from the 3.0/3.1/3.3 cases for the HD2500?

mhagain
09-28-2012, 07:20 PM
For a HD 4000, driver version 8.15.10.2618, using GL Extensions Viewer 4.0.8 I get:

v1.1 (100 % - 7/7)
v1.2 (100 % - 8/8)
v1.3 (100 % - 9/9)
v1.4 (100 % - 15/15)
v1.5 (100 % - 3/3)
v2.0 (100 % - 10/10)
v2.1 (100 % - 3/3)
v3.0 (100 % - 23/23)
v3.1 (100 % - 8/8)
v3.2 (100 % - 10/10)
v3.3 (100 % - 10/10)
v4.0 (21 % - 3/14)
v4.1 (0 % - 0/7)
v4.2 (0 % - 0/12)
v4.3 (0 % - 0/18)

Multisampling is present and correct, with up to 8 max samples.

The 3 extensions from 4.0, if anyone is interested, were GL_ARB_draw_buffers_blend, GL_ARB_texture_buffer_object_rgb32 and GL_ARB_texture_query_lod.

Aleksandar
09-29-2012, 02:37 AM
Thank you, mhagain! Could you report the version of your driver?

It seems that neither HD supports GL 4.0, although Geeks3D (http://www.geeks3d.com/20120506/intel-hd-graphics-driver-v2729-with-opengl-4-support-and-new-opengl-extensions/) reported quite the opposite.
The HD 2500 and 4000 should have the same architecture, and I couldn't find any trace of tessellation shaders.
The results on the HD 2500 and HD 4000 are identical. Is it possible that only 2696 had GL 4.0 support, and that it was removed in subsequent revisions?


Did you look at what extensions were missing from the 3.0/3.1/3.3 cases for the HD2500?
Of course! But I have to revise my extension viewer's handling of GL 3.0. It's old code that relies on the first implementation in NV drivers. That's why I changed some numbers in the previous post. Thank you for the question! It made me re-check my understanding of GL versions. I'll revise it completely.

Regarding GL 3.1: GL_ARB_texture_buffer_object is not supported in the 8.15.10.2761 drivers; and from GL 3.3, GL_ARB_shading_language_include and GL_ARB_texture_swizzle are missing.

Should GL_EXT_gpu_shader4 be reported in GL 3.0?

randall
09-29-2012, 03:03 AM
It seems that you are not using the latest drivers. Ivy Bridge has complete support for OpenGL 4.0.

http://www.geeks3d.com/20120716/intel-hd-graphics-driver-v2792-for-win7-and-win8/

mhagain
09-29-2012, 06:11 AM
Thank you, mhagain! Could you report the version of your driver?

Post edited.

Aleksandar
09-29-2012, 07:39 AM
It seems that you are not using the latest drivers.
It seems that I'm not using Windows 8. :)

Now I get it! Take a look at the driver versions:
First GL 4.0 support came with 9.17.10.2729
Another test was done with 9.17.10.2792

Those are not Win7 drivers. That's the point. Mystery solved.
OpenGL 4.0 is supported on Win8, but not on Win7!

Although I'm a little disappointed by this discovery, it suggests the future is bright for OpenGL on Win8. :)
Despite all the concerns, Windows is, and probably will remain, the best development platform for OpenGL.

P.S. Thanks, mhagain! You could update your driver to see whether there are any new extensions, since yours is not the latest. And even if there is nothing new, some bugs have probably been fixed. ;)

mhagain
09-29-2012, 05:32 PM
P.S. Thanks, mhagain! You could update your driver to see whether there are any new extensions, since yours is not the latest. And even if there is nothing new, some bugs have probably been fixed. ;)

Updated to the latest (8.15.10.2761) and still the same functionality and extensions, but a wee bit faster. :)

randall
09-29-2012, 08:31 PM
Aleksandar

Those drivers can also be installed on Windows 7. OpenGL 4.0 is fully supported on Win7 and Win8.

Aleksandar
09-30-2012, 02:03 AM
I will try it tomorrow. I remember having problems when I tried to install some of those drivers, but I'll try again. Last week I saw an Intel graphics driver fail to install at first, then succeed after some Windows updates. I'll report the results here. Thanks!

malexander
09-30-2012, 06:28 PM
GL_ARB_texture_swizzle and GL_ARB_texture_buffer_object are supported in the 8.15.10.2696 drivers, though they aren't listed in the extension string. I became a little suspicious after the latest driver for my AMD FirePro card also didn't list GL_ARB_texture_swizzle yet claimed GL 4.2 support and supported it properly, so I decided to recheck the HD4000. A shader with a samplerBuffer compiled without incident, and the glTexBuffer entry point was found. Swizzling also works properly ( glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle) ).

However, the noperspective keyword throws a syntax error inside an interface block, and one of my #defines using the token-paste operator (##) appears to be broken. Even with these bugs, the situation is certainly better than with any Intel GL driver I can recall.

Alfonse Reinheart
09-30-2012, 08:18 PM
GL_ARB_texture_swizzle and GL_ARB_texture_buffer_object are supported in the 8.15.10.2696 drivers, though they aren't listed in the extension string. I became a little suspicious after the latest driver for my AMD FirePro card also didn't list GL_ARB_texture_swizzle yet claimed GL 4.2 support and supported it properly, so I decided to recheck the HD4000. A shader with a samplerBuffer compiled without incident, and the glTexBuffer entry point was found. Swizzling also works properly ( glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle) ).

Don't forget: OpenGL versions are not defined by extensions. It is perfectly legitimate for an implementation to report "3.3" and not list ARB_texture_swizzle or any other 3.3 core extension. It advertises support for 3.3, therefore it supports all of 3.3, and texture swizzling is part of that.

So I would say that using extensions to "verify" core support for an advertised version is not a reliable mechanism.

malexander
09-30-2012, 08:58 PM
Sure, but there's no good reason not to list these core features as extensions, so I'm guessing that their omission is just an oversight. Personally I'd much rather check against GL version X.Y than check for the support of dozens of extensions, but that doesn't mean that one shouldn't be able to.

Alfonse Reinheart
09-30-2012, 09:49 PM
that doesn't mean that one shouldn't be able to

That doesn't mean one should be able to, either. I think Apple has it right here: their 3.2 implementation exposes only extensions that matter, i.e. things that aren't already in core.

The point is that you should check the version, and only check the extension if it's below the version where it was incorporated. Otherwise, actually having OpenGL versions is irrelevant; you'd just check for a bunch of extensions you use.
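That policy can be sketched as a single check. This assumes a GL_VERSION string that begins with "major.minor", like the driver strings quoted in this thread; feature_available and the naive substring fallback are my own simplifications, not any real API:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Parse e.g. "3.3.0 - Build 8.15.10.2761" -> major=3, minor=3. */
int parse_gl_version(const char *version, int *major, int *minor)
{
    return sscanf(version, "%d.%d", major, minor) == 2;
}

/* A feature that became core in version (cmajor, cminor) is available
 * if the context version is at least that; only on older contexts do
 * we fall back to the extension list (naive substring match here for
 * brevity; a real check should tokenize the list). */
int feature_available(const char *version, const char *ext_list,
                      int cmajor, int cminor, const char *ext_name)
{
    int major = 0, minor = 0;
    if (parse_gl_version(version, &major, &minor) &&
        (major > cmajor || (major == cmajor && minor >= cminor)))
        return 1;   /* core: the extension string is irrelevant */
    return ext_list != NULL && strstr(ext_list, ext_name) != NULL;
}
```

Under this rule, a driver reporting 3.3 that omits GL_ARB_texture_swizzle from its extension string still counts as supporting swizzling, since swizzling is core in 3.3.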

Aleksandar
10-01-2012, 10:55 AM
Thank you, randall!!!
9.17.10.2792 drivers can be installed on Win7 and they fully support GL 4.0. Great!


The point is that you should check the version, and only check the extension if it's below the version where it was incorporated. Otherwise, actually having OpenGL versions is irrelevant; you'd just check for a bunch of extensions you use.
I strongly disagree. All functionality should be reported through extension strings, and currently that is the default behavior of all drivers on Windows.
Take a look at Intel's case. Multisampling is part of the GL 1.3 specification, and Intel does not support it even while advertising GL 2.1. But there is no trace of the GL_ARB_multisample extension string, so that is regular behavior. Imagine what would happen if Intel said it supports only GL 1.2 while at the same time there is full support for the shaders.

thokra
10-01-2012, 10:59 AM
I strongly disagree. All functionality should be reported through extension strings, and currently that is the default behavior of all drivers on Windows. Take a look at Intel's case. Multisampling is part of the GL 1.3 specification, and Intel does not support it even while advertising GL 2.1. But there is no trace of the GL_ARB_multisample extension string, so that is regular behavior. Imagine what would happen if Intel said it supports only GL 1.2 while at the same time there is full support for the shaders.

Hmm, I'm with Alfonse here, since core-conformant drivers have to expose what's specified for the particular version they implement. If the extension is not reported, then Intel is to blame and should fix their driver.

Alfonse Reinheart
10-01-2012, 12:10 PM
Take a look at Intel's case. Multisampling is part of the GL 1.3 specification, and Intel does not support it even while advertising GL 2.1. But there is no trace of the GL_ARB_multisample extension string, so that is regular behavior.

It's "regular behavior" because it's conformant with 2.1. It supports what the 2.1 core requires: it has the machinery for multisampling. But it doesn't actually provide any multisample pixel formats. You can enable it all you like; but without the multisample pixel formats, that won't actually do anything.

Even ARB_framebuffer_object does not require an implementation to support multisampling; just the machinery for it. It allows GL_MAX_SAMPLES to be 0.

This only becomes a lie in 3.0, where GL_MAX_SAMPLES is required to be at least 4. At which point, if Intel's driver advertises 3.0+ and doesn't provide multisampling behavior, they are lying. Just as ATi was lying in all of their pre-HD graphics cards when they advertised 2.0 and 2.1 when they didn't provide full NPOT support. But rather than forcing the IHVs to actually do what the spec says, people take your advice and work around it by looking at extension lists to figure out what elements of the spec the implementation is blatantly ignoring.

Things like that are probably why we can't get a conformance test for OpenGL. Because it would actually test the specification, and some drivers refuse to actually implement what they say they do.

Lying is only "regular behavior" when you accept it as regular behavior, when you start believing that lies are the truth.


Imagine what would happen if Intel said it supports only GL 1.2 while at the same time there is full support for the shaders.

How could there possibly be "full support for the shaders?" ARB_shader_objects is very different from core GL 2.0 shaders. And since it advertises only GL 1.2, you should not expect the GLSL functions to be there. Even if you advertise ARB_shader_objects, there's absolutely no reason to expect `glUseProgram` (for example) to be available; the ARB_shader_objects version is called `glUseProgramObjectARB`.

I don't see how that constitutes "full support" for anything.

malexander
10-01-2012, 01:27 PM
I think Apple has it right here; their 3.2 implementation only exposes extensions that matter. IE: things that aren't already in core.

I'm okay with that approach, as well as with the 'list them all' approach. But pick one; don't expose some core features in the extension string and leave others out. That's just plain inconsistent.

randall
10-02-2012, 01:43 AM
GL_ARB_texture_buffer_object as an extension provides more functionality (more formats, to be precise) than the version of this feature that is mandatory in core OpenGL. That is why some implementations do not expose this extension string.

malexander
10-02-2012, 06:39 AM
Ah, I never noticed that before, thanks! The GL spec appendix on Version 3.1 only briefly mentions that TBOs are included.

Makes sense that the old ALPHA, LUMINANCE and LUMINANCE_ALPHA formats mentioned in the extension aren't available for TBOs in core (only R, RG, and RGBA formats are supported).

Alfonse Reinheart
10-02-2012, 10:37 AM
Ah, I never noticed that before, thanks! The GL spec appendix on Version 3.1 only briefly mentions that TBOs are included.

It's not a core extension (hence the "ARB" slapped onto the end of everything), so its presence shouldn't imply that the core TBO feature is available. Also, if you want to use those extra formats, you can't use `glTexBuffer` with them; you must use `glTexBufferARB`.