Thread: Intel and multisampling

  1. #1
    Senior Member OpenGL Pro Aleksandar
    Join Date: Jul 2009 · Posts: 1,143

    Intel and multisampling

    Well, I knew that Intel had problems with graphics cards and OpenGL, but this is ridiculous.
    Neither the G41 Express nor the HM55 supports multi-sampling!!!

    May I remind you that GL_ARB_multisample is part of the GL 1.3 specification?
    Applications look awful on both adapters. Is there any help?

    I also need confirmation that the Intel i5-540M does not have an integrated GPU within the processor.
    Currently it uses the HM55 chipset graphics.

    Another warning: the latest 3rd-generation Intel Core processors with Intel HD Graphics 2500 support only GL 3.3!
    To be more precise, the latest drivers (8.15.10.2761) do not support:
    - GL_ARB_texture_mirrored_repeat (GL1.4)
    - GL_EXT_gpu_shader4 (GL3.0)
    - GL_NV_depth_buffer_float (GL3.0)
    - GL_NV_half_float (GL3.0)
    - GL_EXT_texture_compression_rgtc (GL3.0)
    - GL_EXT_framebuffer_sRGB (GL3.0)
    - GL_ARB_texture_buffer_object (GL3.1)
    - GL_ARB_shading_language_include (GL3.3)
    - GL_ARB_texture_swizzle (GL3.3)

    Well, almost GL 3.3.

  2. #2
    Senior Member OpenGL Guru
    Join Date: May 2009 · Posts: 4,948
    Can you clarify how it does not "supports multi-sampling!!!"?

    GL_ARB_shading_language_include (GL3.3)
    That is not part of 3.3, or any other core OpenGL version.

  3. #3
    Member Regular Contributor malexander
    Join Date: Aug 2009 · Location: Ontario · Posts: 316
    The GL spec gives a minimum value for sample buffers of 0, so I think all that is required of the implementation is that it support the multisample interface (GL_MULTISAMPLE, GL_SAMPLES, etc). Does it throw bad enum errors?
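
    A quick way to see what the driver actually reports is to query that interface directly - a minimal sketch, assuming a current GL context and <stdio.h>:

        /* Query multisample state; GL_MAX_SAMPLES needs GL 3.0+. */
        GLint sampleBuffers = 0, samples = 0, maxSamples = 0;
        glGetIntegerv(GL_SAMPLE_BUFFERS, &sampleBuffers); /* 0 or 1 */
        glGetIntegerv(GL_SAMPLES, &samples);              /* samples in the current framebuffer */
        glGetIntegerv(GL_MAX_SAMPLES, &maxSamples);       /* must be >= 4 on GL 3.0+ */
        printf("SAMPLE_BUFFERS=%d SAMPLES=%d MAX_SAMPLES=%d err=0x%x\n",
               sampleBuffers, samples, maxSamples, glGetError());

    If the implementation really doesn't know these enums, that trailing glGetError() will report GL_INVALID_ENUM.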

    Every time I get my hands on a new iGPU from Intel, I fire up our app and see what's amiss. With the latest driver for the HD4000 (i7 3570), I got a crash using glGetActiveUniformsiv() (I switched to the older glGetActiveUniformName()), then ran into an issue where it wouldn't attach a single-channel texture to an FBO. After promoting it to RGBA, things were better than in previous attempts (with an i5 661 two years ago), but there were still obvious rendering errors. The situation seems to be improving, albeit slowly.
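
    The single-channel workaround amounts to something like the sketch below (the exact formats our app uses aren't shown; GL_R8 is illustrative, and w/h are assumed defined):

        /* Try a single-channel color attachment; fall back to RGBA8 if the
           driver rejects it with an incomplete FBO, as seen on the HD4000. */
        GLuint tex, fbo;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, w, h, 0, GL_RED, GL_UNSIGNED_BYTE, NULL);
        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0);
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            /* Promote to four channels and retry. */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        }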

  4. #4
    Advanced Member Frequent Contributor arekkusu
    Join Date: Nov 2003 · Posts: 781
    Quote Originally Posted by malexander
    The GL spec gives a minimum value for sample buffers of 0.
    SAMPLE_BUFFERS and SAMPLES are framebuffer-dependent state, so they can certainly be zero.
    However, GL 3.0 and later require MAX_SAMPLES to be at least 4.

    Any renderer claiming GL_VERSION = 3.3 needs to support multisampling, along with the core functionality that was promoted from EXT_gpu_shader4 and so on (even if those extension strings aren't exported).
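
    Concretely, a conforming 3.0+ driver has to accept a multisampled renderbuffer like the following sketch (width/height assumed defined, <assert.h> included):

        /* GL 3.0+ path: a 4x multisampled FBO, no special pixel format needed. */
        GLuint rb, fbo;
        glGenRenderbuffers(1, &rb);
        glBindRenderbuffer(GL_RENDERBUFFER, rb);
        glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA8, width, height);
        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, rb);
        /* 4 samples must work since MAX_SAMPLES >= 4; resolve later with glBlitFramebuffer. */
        assert(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE);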

  5. #5
    Senior Member OpenGL Pro Aleksandar
    Join Date: Jul 2009 · Posts: 1,143
    Quote Originally Posted by Alfonse Reinheart
    Can you clarify how it does not "supports multi-sampling!!!"?
    1. GL_ARB_multisample is not in the extension list.
    2. There is no pixel format that supports multisampling (I have tried all values of WGL_SAMPLES_ARB from 16 down to 0; see the probe sketch below).
    3. Qt also fails to draw anti-aliased lines using OpenGL.
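
    The probe amounts to roughly this - a sketch, assuming a valid HDC, the constants from wglext.h, and that wglChoosePixelFormatARB has already been loaded from WGL_ARB_pixel_format:

        /* Walk the sample counts down and ask for a matching pixel format. */
        int found = 0;
        for (int samples = 16; samples >= 2 && !found; samples /= 2) {
            const int attribs[] = {
                WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
                WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
                WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
                WGL_COLOR_BITS_ARB,     32,
                WGL_DEPTH_BITS_ARB,     24,
                WGL_SAMPLE_BUFFERS_ARB, 1,
                WGL_SAMPLES_ARB,        samples,
                0
            };
            int format;
            UINT count = 0;
            if (wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count) && count > 0)
                found = samples; /* never reached on these Intel adapters */
        }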

    Quote Originally Posted by Alfonse Reinheart
    That is not part of 3.3, or any other core OpenGL version.
    Technically you are right, but this extension was published alongside GL 3.3 and is supported by GL 3.3 drivers on NV cards.

    malexander, did you manage to draw a “smooth” scene on a pre-HD4000 Intel card?

    Another interesting observation: the Intel GMA 3150 with the 8.14.10.2230 driver supports only OpenGL 1.4! (I would say it is GL 1.5 after all, since only GL_ARB_occlusion_query is missing, but the driver reports 1.4.) Notebookcheck claims there are 2 pixel shader units and no vertex shader units, while Wikipedia claims there are two unified shaders. The truth is that the GMA 3150 does not support GL shaders at all.

    Debug_output is probably supported only on the HD4000 series; on the HD2500 it is not. Although the driver pack is the same for both the HD4000 and the HD2500, the HD2500 does not support even many features from pre-GL3.3 versions.

  6. #6
    Senior Member OpenGL Pro
    Join Date: Jan 2007 · Posts: 1,198
    I've found that older Intel parts actually do support the GL_ARB_vertex_program and GL_ARB_fragment_program extensions, so the GMA 3150 most likely does too.

    Interesting note re: occlusion queries. Technically this part can support them and report GL 1.5, since the spec allows QUERY_COUNTER_BITS to be 0 - this was actually explicitly added to the spec to enable Intel to report 1.5 support - see http://www.opengl.org/archives/about...0.html#oglnext
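
    In other words, an app has to check the counter width before trusting query results; a minimal sketch on a current GL 1.5+ context:

        /* A driver may legally expose occlusion queries with zero counter
           bits, in which case the results carry no information. */
        GLint counterBits = 0;
        glGetQueryiv(GL_SAMPLES_PASSED, GL_QUERY_COUNTER_BITS, &counterBits);
        if (counterBits == 0) {
            /* Queries "work" but always return 0; skip occlusion culling. */
        }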

  7. #7
    Senior Member OpenGL Pro Aleksandar
    Join Date: Jul 2009 · Posts: 1,143
    Quote Originally Posted by mhagain
    I've found that older Intel parts actually do support the GL_ARB_vertex_program and GL_ARB_fragment_program extensions, so the GMA 3150 most likely does too.
    I thought it was up to the OpenGL drivers, but D3D also cannot set up multisampling on the GMA 3150. Only D3DMULTISAMPLE_NONE!
    GPU-Z reports 2 unified shaders with 9.0c/SM3.0, but the D3D HAL reports software vertex processing. So those are not unified shader units at all.
    Conclusion: the GMA 3150 is moldy.
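
    The D3D9 side of that check is a simple capability probe - a sketch using the C COM macros from d3d9.h, with pD3D assumed created via Direct3DCreate9:

        /* Ask the HAL which multisample types the adapter supports. */
        DWORD quality = 0;
        int best = 0;
        for (int s = 16; s >= 2 && !best; s /= 2) {
            HRESULT hr = IDirect3D9_CheckDeviceMultiSampleType(pD3D,
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
                TRUE /* windowed */, (D3DMULTISAMPLE_TYPE)s, &quality);
            if (SUCCEEDED(hr))
                best = s; /* on the GMA 3150, everything but D3DMULTISAMPLE_NONE fails */
        }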

    I'll repeat the tests on the rest of Intel's GPUs. So far, GL works more or less correctly only on the HD2500 and newer.

    Quote Originally Posted by mhagain
    Interesting note re: occlusion queries. Technically this part can support them and report GL 1.5, since the spec allows QUERY_COUNTER_BITS to be 0 - this was actually explicitly added to the spec to enable Intel to report 1.5 support - see http://www.opengl.org/archives/about...0.html#oglnext
    What’s the point of that?

  8. #8
    Senior Member OpenGL Pro
    Join Date: Jan 2007 · Posts: 1,198
    Quote Originally Posted by Aleksandar
    I thought it was up to the OpenGL drivers, but D3D also cannot set up multisampling on the GMA 3150. Only D3DMULTISAMPLE_NONE!

    GPU-Z reports 2 unified shaders with 9.0c/SM3.0, but the D3D HAL reports software vertex processing. So those are not unified shader units at all.
    Conclusion: the GMA 3150 is moldy.
    That more or less matches my prior experience with Intel parts. One could argue that they are "unified" insofar as what's running on the hardware is concerned, but that would be stretching credibility a little. On an HD4000 I get 8 samples with 1 quality level, GL 3.3 and D3D11; but generally only the HD series is even halfway decent - any of the GMA stuff is incredibly basic.
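
    For comparison, the D3D11 form of that 8-sample/1-quality-level result looks roughly like this (device assumed created with D3D11CreateDevice; C COM macros from d3d11.h):

        /* One quality level at 8x means only quality index 0 is valid. */
        UINT levels = 0;
        ID3D11Device_CheckMultisampleQualityLevels(device,
            DXGI_FORMAT_R8G8B8A8_UNORM, 8, &levels);
        /* levels == 1 here on the HD4000. */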

    What’s the point of that?
    Precisely.

  9. #9
    Member Regular Contributor malexander
    Join Date: Aug 2009 · Location: Ontario · Posts: 316
    malexander, did you manage to draw a “smooth” scene on a pre-HD4000 Intel card?
    I don't recall it being stable enough at the time to check. That CPU is now running Ubuntu + Mesa, which does support GL_ARB_multisample. I'd have to do a bit more investigating to see how many samples (if any) it supports. Unfortunately, I don't have that system dual-booted with Windows, so it would be very difficult to check Intel's driver.

    The HD4000 with the latest drivers (May '12) supports GL_ARB_multisample and multisample textures with 8 depth/color samples, FWIW. It's been a while since I used window-system framebuffer multisampling, so I'm not certain the GL_SAMPLES value is the same.
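
    The multisample-texture path that works here is the GL 3.2+ one; a sketch, with w/h assumed defined:

        /* An 8-sample color texture, matching the HD4000 result above. */
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, tex);
        glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, 8, GL_RGBA8, w, h, GL_TRUE);
        /* GL_SAMPLES of the default (window-system) framebuffer is separate
           state chosen at pixel-format time, so it may differ. */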
