Intel and multisampling

Well, I knew Intel had problems with graphics cards and OpenGL, but this is ridiculous.
Neither the G41 Express nor the HM55 supports multi-sampling!!!

May I remind you that GL_ARB_multisample is part of the GL 1.3 specification? :frowning:
Applications look awful on both adapters. Is there any help?

I also need confirmation that the Intel i5-540M does not have an integrated graphics core within the processor.
Currently it uses the HM55 chipset graphics.

Another warning: the latest 3rd-generation Intel Core processors with Intel HD Graphics 2500 support only GL 3.3!
To be more precise, the latest drivers (8.15.10.2761) do not support:

  • GL_ARB_texture_mirrored_repeat (GL1.4)
  • GL_EXT_gpu_shader4 (GL3.0)
  • GL_NV_depth_buffer_float (GL3.0)
  • GL_NV_half_float (GL3.0)
  • GL_EXT_texture_compression_rgtc (GL3.0)
  • GL_EXT_framebuffer_sRGB (GL3.0)
  • GL_ARB_texture_buffer_object (GL3.1)
  • GL_ARB_shading_language_include (GL3.3)
  • GL_ARB_texture_swizzle (GL3.3)

Well, almost GL 3.3. :frowning:

Can you clarify how it does not "support multi-sampling!!!"?

GL_ARB_shading_language_include (GL3.3)

That is not part of 3.3, or any other core OpenGL version.

The GL spec gives a minimum value for sample buffers of 0, so I think all that is required of the implementation is that it support the multisample interface (GL_MULTISAMPLE, GL_SAMPLES, etc). Does it throw bad enum errors?
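For what it's worth, a quick way to check is to query that state and look at glGetError() afterwards. A rough sketch, assuming a current context and a loader such as GLEW:

```cpp
// Minimal probe, assuming a current OpenGL context and a loader such as GLEW
// that defines the GL 1.3 multisample enums.
#include <GL/glew.h>
#include <cstdio>

void probeMultisampleState()
{
    while (glGetError() != GL_NO_ERROR) {}   // clear any stale errors

    GLint sampleBuffers = -1, samples = -1;
    glGetIntegerv(GL_SAMPLE_BUFFERS, &sampleBuffers);
    glGetIntegerv(GL_SAMPLES, &samples);

    GLenum err = glGetError();
    if (err == GL_INVALID_ENUM)
        std::printf("Driver rejects the multisample enums entirely.\n");
    else
        std::printf("SAMPLE_BUFFERS = %d, SAMPLES = %d (0 is legal for a non-MSAA framebuffer)\n",
                    sampleBuffers, samples);
}
```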

Every time I get my hands on a new iGPU from Intel I fire up our app and see what's amiss. With the latest driver for the HD4000 (i7 3570), I got a crash using glGetActiveUniformsiv() (switched to the older glGetActiveUniformName()), then ran into an issue where it wouldn't attach a single-channel texture using an FBO. After promoting it to RGBA, things were better than previous attempts (with an i5 661 2 years ago), but there were still obvious rendering errors. The situation seems to be improving, albeit slowly.
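For anyone hitting the same crash, a sketch of that kind of per-uniform fallback (assuming GLEW and a linked program object; the function name here is just illustrative):

```cpp
// Sketch of the fallback described above: instead of querying name data in bulk
// with glGetActiveUniformsiv() (which crashed on that driver), fetch each
// uniform name individually.
#include <GL/glew.h>
#include <string>
#include <vector>

std::vector<std::string> activeUniformNames(GLuint program)
{
    GLint count = 0;
    glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &count);

    std::vector<std::string> names;
    for (GLint i = 0; i < count; ++i) {
        char buf[256];                 // generous fixed-size buffer for the name
        GLsizei len = 0;
        glGetActiveUniformName(program, static_cast<GLuint>(i),
                               sizeof(buf), &len, buf);
        names.emplace_back(buf, len);
    }
    return names;
}
```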

SAMPLE_BUFFERS and SAMPLES are framebuffer-dependent state, so can certainly be zero.
However, 3.0 and later require MAX_SAMPLES to be at least 4.
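So a quick sanity check on a 3.0+ context is just the following (a sketch, assuming a loader such as GLEW):

```cpp
// Assuming a GL 3.0+ context: MAX_SAMPLES is context state (not per-framebuffer)
// and the spec requires it to be at least 4.
#include <GL/glew.h>
#include <cstdio>

void checkMaxSamples()
{
    GLint maxSamples = 0;
    glGetIntegerv(GL_MAX_SAMPLES, &maxSamples);
    std::printf("GL_MAX_SAMPLES = %d (a 3.0+ implementation must report >= 4)\n", maxSamples);
}
```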

Any renderer claiming GL_VERSION = 3.3 needs to support multisampling, along with the core functionality that was promoted from EXT_gpu_shader4, etc. (even if those extension strings aren't exported).

  1. GL_ARB_multisample is not in the extension list.
  2. There is no pixel format that supports multisampling (I have tried every value of WGL_SAMPLES_ARB from 16 down to 0; see the sketch after this list).
  3. Not even Qt managed to draw anti-aliased lines using OpenGL.
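Here is roughly what the pixel-format search in point 2 looks like (a sketch only, assuming a dummy context is already current so wglGetProcAddress() works, and that <GL/wglext.h> supplies the WGL_ARB_pixel_format / WGL_ARB_multisample tokens):

```cpp
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

// Try sample counts from 16 down to 0 and return the first matching pixel format.
int findMultisampledPixelFormat(HDC hdc)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
    if (!wglChoosePixelFormatARB)
        return 0;                                   // WGL_ARB_pixel_format missing

    for (int samples = 16; samples >= 0; --samples) {
        const int attribs[] = {
            WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
            WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
            WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
            WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
            WGL_COLOR_BITS_ARB,     24,
            WGL_DEPTH_BITS_ARB,     24,
            WGL_SAMPLE_BUFFERS_ARB, samples > 0 ? 1 : 0,
            WGL_SAMPLES_ARB,        samples,
            0
        };
        int format = 0;
        UINT count = 0;
        if (wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, &format, &count) && count > 0)
            return format;                          // highest sample count available
    }
    return 0;
}
```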

Technically you are right, but this extension was published alongside GL 3.3 and is supported by GL 3.3 drivers on NV cards.

malexander, did you succeed in drawing a "smooth" scene on a pre-HD4000 Intel card?

Another interesting observation: the Intel GMA 3150 with the 8.14.10.2230 driver supports only OpenGL 1.4! (I would say it is really GL 1.5, since only GL_ARB_occlusion_query is missing, but the driver reports 1.4.) Notebookcheck claims there are 2 pixel shader units and no vertex shader units, while Wikipedia claims there are two unified shaders. The truth is that the GMA 3150 does not support GL shaders at all.

Debug output (GL_ARB_debug_output) is probably supported only on the HD4000 series; on the HD2500 it is not. Although the driver package is the same for both the HD4000 and the HD2500, the HD2500 does not even support many features from pre-GL3.3 versions.

I've found with older Intels that they actually do support the GL_ARB_vertex_program and GL_ARB_fragment_program extensions, so the GMA 3150 most likely does too.
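A quick substring check of the extension string is enough to confirm that on such a legacy context (a sketch, assuming a current context):

```cpp
// On pre-3.0 contexts the extension string is still a single space-separated list.
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

void checkAsmShaderExtensions()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    bool vp = ext && std::strstr(ext, "GL_ARB_vertex_program");
    bool fp = ext && std::strstr(ext, "GL_ARB_fragment_program");   // quick substring probe
    std::printf("ARB_vertex_program: %s, ARB_fragment_program: %s\n",
                vp ? "yes" : "no", fp ? "yes" : "no");
}
```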

Interesting note re: occlusion queries. Technically this part can support them and report GL 1.5, since the spec allows QUERY_COUNTER_BITS to be 0; this was actually added to the spec explicitly to let Intel report 1.5 support (see the OpenGL ARB Meeting Notes, June 10-11, 2003).
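You can see which case you are in by querying the counter bits (a sketch, assuming GL 1.5 entry points are loaded, e.g. via GLEW):

```cpp
// A GL 1.5 implementation may legally report 0 counter bits for occlusion queries.
#include <GL/glew.h>
#include <cstdio>

void checkOcclusionQueryBits()
{
    GLint bits = 0;
    glGetQueryiv(GL_SAMPLES_PASSED, GL_QUERY_COUNTER_BITS, &bits);
    std::printf("Occlusion query counter bits: %d%s\n",
                bits, bits == 0 ? " (queries present but effectively useless)" : "");
}
```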

malexander, did you succeed in drawing a "smooth" scene on a pre-HD4000 Intel card?

I don't recall it being stable enough at the time to check. That CPU is now running Ubuntu + Mesa, which does support GL_ARB_multisample. I'd have to do a bit more investigating to see how many samples (if any) it supports. Unfortunately, I don't have that system dual-booted with Windows, so it would be very difficult to check Intel's driver.

The HD4000 with the latest drivers (May '12) supports GL_ARB_multisample and multisample textures with 8 depth/color samples, FWIW. It's been a while since I used OS (window-system) framebuffer multisampling, so I'm not certain the GL_SAMPLES value is the same.
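For reference, this is roughly how such an 8-sample setup looks (a sketch, assuming a GL 3.2+ context and GLEW; the function name and dimensions are placeholders):

```cpp
// Build an 8-sample color + depth multisample FBO like the one tested above.
#include <GL/glew.h>

GLuint makeMultisampleFbo(GLsizei width, GLsizei height, GLsizei samples /* e.g. 8 */)
{
    GLuint color, depth, fbo;

    glGenTextures(1, &color);
    glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, color);
    glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples, GL_RGBA8,
                            width, height, GL_TRUE);

    glGenTextures(1, &depth);
    glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, depth);
    glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples, GL_DEPTH_COMPONENT24,
                            width, height, GL_TRUE);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D_MULTISAMPLE, color, 0);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D_MULTISAMPLE, depth, 0);

    // GL_FRAMEBUFFER_INCOMPLETE_MULTISAMPLE here usually means the sample
    // count or formats are not actually supported by the driver.
    return glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE ? fbo : 0;
}
```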

I thought it was up to the OpenGL drivers, but D3D also cannot set up multisampling on the GMA 3150. Only D3DMULTISAMPLE_NONE!
GPU-Z reports 2 unified shaders with D3D 9.0c/SM 3.0, but the D3D HAL reports software vertex processing. So those are not unified shader units at all.
Conclusion: the GMA 3150 is moldy.

I'll repeat the tests for the rest of Intel's GPUs. So far, GL works more or less correctly only on the HD2500 and newer.

What's the point of that?

[QUOTE=Aleksandar;1242808]I thought it was up to the OpenGL drivers, but D3D also cannot set up multisampling on the GMA 3150. Only D3DMULTISAMPLE_NONE!

GPU-Z reports 2 unified shaders with D3D 9.0c/SM 3.0, but the D3D HAL reports software vertex processing. So those are not unified shader units at all.
Conclusion: the GMA 3150 is moldy.[/quote]
That more or less matches my prior experience with Intel. One could argue that they are "unified" as far as what's running on the hardware is concerned, but that would be stretching credibility a little. On an HD4000 I get 8 samples with 1 quality level, GL 3.3 and D3D11, but generally only the HD parts are anywhere near decent; any of the GMA stuff is incredibly basic.

What's the point of that?

Precisely.

Overview of Intel's GL support

I have to apologize in advance for the length of this post, but I had to summarize what I discovered over the last few days.

===========================================
Intel GMA 3150 (Pineview)
Release: Q1 '10 - Atom N4xx/N5xx - 3rd generation Intel GPU
Shader model: 3.0(sw)/2.0 (2PU)

OpenGL version: 1.4.0 - Build 8.14.10.2230
Renderer: Intel Pineview Platform
Date: 24.10.2010. (latest)

OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 66% - 2/3 ]
OpenGL 2.0: [ 11% - 1/9 ]
OpenGL 2.1: [ 0% - 0/2 ]
OpenGL 3.0: [ 0% - 0/21 ]
OpenGL 3.1: [ 0% - 0/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 23% - 33/143 ]

===========================================
Intel® G41 Express (GMA X4500)
Release: Q3 '08 - 4th generation Intel GPU
Shader model: 4.0 (10PU)

OpenGL version: 2.1.0 - Build 8.15.10.1986
Renderer: Intel® G41 Express Chipset

OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 23% - 5/21 ]
OpenGL 3.1: [ 28% - 2/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 36% - 51/143 ]

OpenGL version: 2.1.0 - Build 8.15.10.2555
Renderer: Intel® G41 Express Chipset
Date: 19.10.2011. (latest)

OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 28% - 2/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 46% - 66/143 ]

===========================================
Intel GMA HD - i5 540M
Release: Q1 '10 - Ironlake - 5th generation Intel GPU
Shader model: 4.0 (6PU)

OpenGL version: 2.1.0 - Build 8.15.10.2021
Renderer: Intel® Graphics Media Accelerator HD
Date: 30.12.2009.

OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 23% - 5/21 ]
OpenGL 3.1: [ 28% - 2/7 ]
OpenGL 3.2: [ 0% - 0/9 ]
OpenGL 3.3: [ 0% - 0/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 36% - 51/143 ]

OpenGL version: 2.1.0 - Build 8.15.10.2622
Renderer: Intel® HD Graphics
Date: 21.01.2012. (previous release)

OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 85% - 6/7 ]
OpenGL 3.2: [ 55% - 5/9 ]
OpenGL 3.3: [ 10% - 1/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 53% - 76/143 ]

OpenGL version: 2.1.0 - Build 8.15.10.2827
Renderer: Intel® HD Graphics
Date: 10.08.2012. (latest)

OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 88% - 8/9 ] NO Multisampling!
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 85% - 6/7 ]
OpenGL 3.2: [ 55% - 5/9 ]
OpenGL 3.3: [ 10% - 1/10 ]
OpenGL 4.0: [ 0% - 0/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 53% - 76/143 ]

===========================================
Intel HD 2500
Release: Q1 '12 - Ivy Bridge - 7th generation Intel GPU
Shader model: 5.0 (6PU)

OpenGL version: 3.3.0 - Build 8.15.10.2761
Renderer: Intel® HD Graphics
Date: 24.05.2012. (latest)

OpenGL 1.2: [ 100% - 8/8 ]
OpenGL 1.3: [ 100% - 9/9 ] Multisampling!!! (finally)
OpenGL 1.4: [ 93% - 14/15 ]
OpenGL 1.5: [ 100% - 3/3 ]
OpenGL 2.0: [ 100% - 9/9 ]
OpenGL 2.1: [ 100% - 2/2 ]
OpenGL 3.0: [ 95% - 20/21 ]
OpenGL 3.1: [ 85% - 6/7 ]
OpenGL 3.2: [ 100% - 9/9 ]
OpenGL 3.3: [ 80% - 8/10 ]
OpenGL 4.0: [ 23% - 3/13 ]
OpenGL 4.1: [ 0% - 0/11 ]
OpenGL 4.2: [ 0% - 0/11 ]
OpenGL 4.3: [ 0% - 0/26 ]
Spec score: [ 64% - 91/143 ]

Conclusions:

  • The GMA 3150 is worse than a GeForce 2! It is quite something for Intel to release a 3rd-generation GPU after introducing its 5th-generation GPUs.
  • Intel still supports its 4th-generation GPUs: three years after the first release, new drivers bring 8 new extensions.
  • Although Wikipedia claims the HD 2500/4000 have supported GL 4.0 since driver 2729, that is not what we get here. I need confirmation for the HD 4000.

Can anybody post statistics for HD 4000 (Ivy Bridge) and/or HD 2000/3000 (Sandy Bridge)?

Did you look at what extensions were missing from the 3.0/3.1/3.3 cases for the HD2500?

For an HD 4000 with driver version 8.15.10.2618, using GL Extensions Viewer 4.0.8, I get:

v1.1 (100 % - 7/7)
v1.2 (100 % - 8/8)
v1.3 (100 % - 9/9)
v1.4 (100 % - 15/15)
v1.5 (100 % - 3/3)
v2.0 (100 % - 10/10)
v2.1 (100 % - 3/3)
v3.0 (100 % - 23/23)
v3.1 (100 % - 8/8)
v3.2 (100 % - 10/10)
v3.3 (100 % - 10/10)
v4.0 (21 % - 3/14)
v4.1 (0 % - 0/7)
v4.2 (0 % - 0/12)
v4.3 (0 % - 0/18)

Multisampling is present and correct, with up to 8 max samples.

The 3 extensions from 4.0, if anyone is interested, were GL_ARB_draw_buffers_blend, GL_ARB_texture_buffer_object_rgb32 and GL_ARB_texture_query_lod.

Thank you, mhagain! Could you report the version of your driver?

It seems neither HD supports GL 4.0, although Geeks3D reported something quite different.
The HD 2500 and 4000 should have the same architecture, and I couldn't find any trace of tessellation shaders.
The results on the HD 2500 and HD 4000 are identical. Is it possible that only 2696 has GL 4.0 support, and it was removed in the following revisions?

Of course! But I have to revise my extension viewer with respect to GL 3.0. It is old code that relies on the first implementation in NV drivers. That's why I changed some numbers in the previous post. Thank you for the question! It made me check "my understanding" of GL versions. I'll revise it completely.

As for GL 3.1, GL_ARB_texture_buffer_object is not supported in the 8.15.10.2761 drivers, nor are GL_ARB_shading_language_include and GL_ARB_texture_swizzle from GL 3.3.

Should GL_EXT_gpu_shader4 be reported in GL 3.0?

It seems that you are not using the latest drivers. Ivy Bridge has complete support for OpenGL 4.0.

Post edited.

It seems that I'm not using Windows 8. :slight_smile:

I've got the point! Take a look at the driver versions:
First GL 4.0 support came in 9.17.10.2729.
The other test was done with 9.17.10.2792.

Those are not Win7 drivers. That's the point, and the mystery is solved.
OpenGL 4.0 is supported on Win8, but not on Win7!

Although I'm a little disappointed by this discovery, it shows the future is bright for OpenGL on Win8. :slight_smile:
Despite all concerns, Windows is and probably will remain the best development platform for OpenGL.

P.S. Thanks, mhagain! You could update your driver to see whether there are any new extensions, since yours is not the latest. Even if there is nothing new, some bugs are probably fixed. :wink:

Updated to the latest (8.15.10.2761) and still the same functionality and extensions, but a wee bit faster. :slight_smile:

[b]Aleksandar[/b]

Those drivers can also be installed on Windows 7. OpenGL 4.0 is fully supported on Win7 and Win8.

I will try it tomorrow. I remember having problems when I tried to install some of those drivers, but I'll try again. Last week I witnessed an Intel graphics driver that would not install at first but succeeded after some Windows updates. I'll report the results here. Thanks!

GL_ARB_texture_swizzle and GL_ARB_texture_buffer_object are supported in the 8.15.10.2696 drivers, even though they aren't listed in the extension string. I became a little suspicious after the latest driver for my AMD FirePro card also didn't list GL_ARB_texture_swizzle yet claimed GL 4.2 support and supported it properly, so I decided to recheck the HD4000. A shader with a samplerBuffer compiled without incident, and the glTexBuffer API entry point was found. Swizzling also works properly (setting GL_TEXTURE_SWIZZLE_RGBA on a GL_TEXTURE_2D via glTexParameteriv).
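For anyone who wants to reproduce the check, this is roughly the kind of probing involved: try the feature and inspect glGetError() / the entry point instead of trusting the extension string (a sketch, assuming GLEW on Windows):

```cpp
#include <windows.h>
#include <GL/glew.h>
#include <cstdio>

void probeUnlistedFeatures()
{
    while (glGetError() != GL_NO_ERROR) {}            // clear stale errors

    // Texture swizzle: set an RGBA swizzle and see whether the enum is accepted.
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    const GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };
    glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);
    std::printf("GL_TEXTURE_SWIZZLE_RGBA %s\n",
                glGetError() == GL_NO_ERROR ? "accepted" : "rejected");
    glDeleteTextures(1, &tex);

    // Texture buffer objects: at least verify the entry point resolves.
    std::printf("glTexBuffer entry point %s\n",
                wglGetProcAddress("glTexBuffer") ? "found" : "missing");
}
```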

However, the noperspective keyword throws a syntax error when used in an interface block, and one of my #defines appears to be broken by the token-paste operator (##). Even with these bugs, the situation is certainly better than with any Intel GL driver I can recall.
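For reference, one way to write the interface-block construct in question (valid GLSL 3.30 as far as the spec goes; the names are placeholders, and the source is held here in a C++ raw string):

```cpp
// Minimal GLSL 3.30 fragment shader using a noperspective-qualified
// interface-block member, the construct that reportedly triggers the
// syntax error on this driver.
const char* kFragmentSrc = R"(
#version 330 core
in VertexData {
    noperspective vec3 edgeDistance;   // interpolation qualifier on a block member
} fs_in;
out vec4 fragColor;
void main()
{
    fragColor = vec4(fs_in.edgeDistance, 1.0);
}
)";
```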