Current GL 3.0

What’s the current status of OpenGL 3.0 drivers, and how are users receiving it, both in terms of performance and as an alternative to D3D 10/11?

When is OpenGL 3.1 spec expected?

Any promising future?

To my knowledge GL 3.0 drivers are available from both NVidia and ATI. I haven’t used/tested the ATI drivers yet, but it seems as if they have support for PC/Mac/Linux.

Who knows when Intel will be on board?

lol

Intel doesn’t even support vertex shaders … they only reached GLSL fragment shader support a few months ago.

Intel? I sooooooo don’t care!!!

Intel on OpenGL 3.0

  • Intel is excited about OpenGL 3.0
  • We look forward to a strong OpenGL future
  • Look for Intel support of OpenGL 3.0 on future platforms

This is according to the vendor announcements at the SIGGRAPH BOF.

Yes, “excited” … well, nothing is exciting when I think of Intel graphics chips.

Until you find yourself stuck supporting an OpenGL app on Intel hardware, that is. You can imagine how much fun that is not.

And the sad thing is that Intel has about a third of the GPU market.

I’m tired of people spreading misinformation about Intel’s graphics chips. They may not be the fastest, but that’s no reason to claim they don’t support features they clearly do support. If anything, it seems to me that the hardware could support even more than what’s exposed in the drivers (geometry shaders, RGBA16 buffer objects, etc.).

Here’s part of the glxinfo output on the system on which I’m writing this. As you can see, GL_ARB_vertex_shader is in the extension list even though I have an old Intel 965 graphics chip.

OpenGL vendor string: Tungsten Graphics, Inc
OpenGL renderer string: Mesa DRI Intel® 965GM GEM 20090114
OpenGL version string: 2.1 Mesa 7.3
OpenGL shading language version string: 1.10
OpenGL extensions:
GL_ARB_depth_texture, GL_ARB_draw_buffers, GL_ARB_fragment_program,
GL_ARB_fragment_program_shadow, GL_ARB_fragment_shader,
GL_ARB_multisample, GL_ARB_multitexture, GL_ARB_occlusion_query,
GL_ARB_pixel_buffer_object, GL_ARB_point_parameters, GL_ARB_point_sprite,
GL_ARB_shader_objects, GL_ARB_shading_language_100, GL_ARB_shadow,
GL_ARB_texture_border_clamp, GL_ARB_texture_compression,
GL_ARB_texture_cube_map, GL_ARB_texture_env_add,
GL_ARB_texture_env_combine, GL_ARB_texture_env_crossbar,
GL_ARB_texture_env_dot3, GL_ARB_texture_mirrored_repeat,
GL_ARB_texture_non_power_of_two, GL_ARB_texture_rectangle,
GL_ARB_transpose_matrix, GL_ARB_vertex_buffer_object,
GL_ARB_vertex_program, GL_ARB_vertex_shader, GL_ARB_window_pos,
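
If you’d rather test for this in code than eyeball glxinfo, here is a minimal sketch of the classic GL 2.x-style check (it assumes a current GL context; has_extension is an illustrative helper, not a standard entry point). The whole-token comparison matters because a plain strstr would also match a name embedded inside a longer extension string:

#include <string.h>
#include <GL/gl.h>

/* Return 1 if `name` appears as a whole, space-delimited token in the
   extension string, 0 otherwise. Requires a current GL context. */
int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    const char *p = all;
    size_t len = strlen(name);

    while (p && (p = strstr(p, name)) != NULL) {
        if ((p == all || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}

On the machine above, has_extension("GL_ARB_vertex_shader") returns 1.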

This is direct information from Keith Packard at the FOSDEM 2009 conference, one week ago.

The GL_ARB_vertex_buffer_object extension string has been present for years, but the last time I tried to use it (a year and a half ago) I got a null pointer for glGenBuffers …

I unfortunately had to support this chip for the project I was working on at the time; after a lot of issues I chose to use an OpenGL 1.1 code path for any Intel chips.
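
For what it’s worth, the defensive version of that lesson looks something like this sketch: trust neither the extension string nor the entry point alone. glXGetProcAddressARB is the real GLX loader; use_vbo_path and use_gl11_path are hypothetical stand-ins for an application’s render-path selection:

#include <string.h>
#include <GL/gl.h>
#include <GL/glx.h>

extern void use_vbo_path(void);   /* hypothetical VBO renderer */
extern void use_gl11_path(void);  /* hypothetical immediate-mode fallback */

typedef void (*GenBuffersFn)(GLsizei n, GLuint *buffers);

void pick_render_path(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    GenBuffersFn genBuffersARB = (GenBuffersFn)
        glXGetProcAddressARB((const GLubyte *)"glGenBuffersARB");

    /* Require BOTH the advertised extension and a non-null entry point;
       a whole-token check (as sketched earlier) is more robust still. */
    if (ext && strstr(ext, "GL_ARB_vertex_buffer_object") && genBuffersARB)
        use_vbo_path();
    else
        use_gl11_path();
}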

So yes, I claim it: they are close to the Stone Age in terms of drivers. The hardware is fine (except for some terrible design issues; have a look at the FOSDEM X.org presentations for more information) and it even supports geometry shaders under Direct3D. It’s a Direct3D 10 chip!

The issue is the software! Keith Packard has a really good understanding of this, but at Intel the policy seems to be “if we can do it in software, it’s not a big deal”. Other graphics chip companies choose to do it in hardware.

I more or less trust ATI with their OpenGL drivers. They might be slow to implement features, but it’s going to happen. With Intel … well, it’s so not going to happen, and the conference talk I attended was a demonstration of exactly that.

By the way, they just completed their support for framebuffer objects!

Intel => I don’t care! => OpenGL 1.1

The best way to support Intel is through D3D. Personally, I believe that you should use the tool that works best.
As for Pkk’s post, on Linux things are different. The Intel drivers are corrected and updated by the open source community… in other words, things get done.

My experience has been that Intel drivers fail to compile bog-standard GLSL programs (a typical vertex shader & per-pixel lighting) that run perfectly fine on seven-year-old ATI and NVIDIA hardware. GL 2.1 support - yeah right.

Then you get artifacts when using 8-bit alpha textures on their lovely 865 and 915 IGPs (workaround: use an RGBA texture, or use another card).
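
For the record, that RGBA workaround amounts to something like the following sketch. It assumes the common GL_MODULATE texture environment, where a white RGB makes the RGBA texture behave exactly like the alpha-only one would have; w, h and alpha are assumed inputs:

#include <stdlib.h>
#include <GL/gl.h>

/* Expand a w*h 8-bit alpha image to RGBA8 so the 865/915 never sees a
   GL_ALPHA8 texture. Under GL_MODULATE, white RGB leaves the fragment
   color untouched, matching the alpha-only texture's behavior. */
void upload_alpha_as_rgba(int w, int h, const unsigned char *alpha)
{
    unsigned char *rgba = malloc((size_t)w * h * 4);
    int i;
    if (!rgba)
        return;
    for (i = 0; i < w * h; ++i) {
        rgba[i * 4 + 0] = 255;
        rgba[i * 4 + 1] = 255;
        rgba[i * 4 + 2] = 255;
        rgba[i * 4 + 3] = alpha[i];
    }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    free(rgba);
}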

Intel’s D3D drivers are passable. Their OpenGL support is atrocious. Just take a look at the workaround list in Google Earth, it’s quite enlightening! :slight_smile:

Groovounet’s got this just about right: Intel => GL1.1 codepath => pray it works.

@VMan: sometimes you have to support an existing or a cross-platform codebase, where writing a D3D renderer is not an option…

Intel drivers are top notch at crashing and showing blue screens. I don’t see anything worth using coming even close to 3.0 on Intel; they should fix their 1.5 issues first.

Based on my experience with Intel’s video hardware, D3D is the inevitable resort after seeing pink backgrounds, black textures, etc.; the alternative is to fall back to minimal OpenGL 1.1 functionality with the worst rendering quality. And if things work fine in D3D, that means these are driver issues: a D3D driver is much easier to implement and carries less of a burden.

I would say it’s unfortunate, though fortunate for our project, that we have to port our OpenGL code to D3D in order to cover consumer-level hardware, especially machines that come with embedded video boards.

I hate to say it, but GL is starting to feel like programming punch cards, having been designed around the X Window stuff: century-old technology.

It’s not the specification, it’s the implementation.

Very true. D3D is what gets supported first, as I’ve observed many times on Intel and ATI. Programming in D3D isn’t much of an issue on Linux either, since Wine supports a lot these days (by translating to OpenGL, though). I’m confused myself about which to support more; alternatively both APIs, but one always ends up dominating the code design. AFAIK D3D is used far more, especially in the games market, making it the first choice for vendors to support and to maintain quality drivers for.

OpenGL and only OpenGL! :smiley:

Quiet on these boards these days… Rob Barris noticeable by his absence.
Shame, it was fun while it lasted - great memories of the emergence of programmable shading, which came to GL first through extensions. Special times.
Now it’s just the Mac/Linux 3D API. We’re even moving to D3D now that stereo’s supported in NVIDIA’s driver. Never thought I’d see the day. Maybe one day we’ll do a Linux version of some of our software, and I can relive my youth.
The ARB is beyond contempt.

Is the stereo NVIDIA D3D support a generic D3D thing, or via NVAPI and/or driver backdoors?

Hi knackered,

The OpenGL 3.0 specification was released in August 2008. At this point in time (Feb 2009) there are two feature-complete OpenGL 3.0 driver sets available, one from AMD and one from NVIDIA, on Windows and Linux if I am not mistaken.

Concurrently, work on the 3.1 spec began after the 3.0 spec was completed last summer, and has been progressing ever since. The participation level in those working group meetings has been steady and consistent (myself included).

If you have specific features you need improved or added to GL, please be sure to bring them up in the “talk about your applications” thread since we keep a close eye there for ideas that can and will feed into subsequent revisions.

In short, our views of reality may not match 100%.

Yeah, it’s “feature complete” (it does support that new context-creation extension …), but the feature set is so meaningless and the API itself so horribly cluttered that no one cares anymore. Right now I gain nothing from adding GL3 support. That’s really the main problem here.

Jan.

Just out of curiosity, when you say the API is cluttered, are you referring to features or functions that are part of the 3.0-deprecated set, or something else specifically?

(Since one of the key motivators behind the definition of the deprecated feature set was to thin out the API’s complexity over time, it’s important for me/us to grasp the discontent here.)

Plainly put: as revisions of the API appear that remove clutter (by removing deprecated functions from the API), you’ll see more new functionality and less clutter.

When you remove all the deprecated stuff, the API gets a lot cleaner. The problem is that

  1. nothing has actually been removed yet, so a GL 3.0 driver is still a mess

  2. major vendors have already announced that they won’t remove anything in the future either

From history we know that OpenGL is a “soft” or “weak” spec. If some vendor decides they don’t fully agree with something, they will simply do it in a way it’s not actually meant to be done.

I say GL3 is cluttered because, for me, it is still 2.1. I can create a GL3 context, but what then? Nothing changes, except for some “guidelines”, and I don’t need an extra context for that.

What we really need is vendors shipping drivers where I can set the “forward compatible” bit and it will actually remove all the deprecated stuff. As a consequence, the driver should hopefully become more reliable, too.
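
For reference, setting that bit looks roughly like this minimal Windows sketch using WGL_ARB_create_context (the analogous Linux extension is GLX_ARB_create_context). It assumes hdc already has a pixel format and that a legacy context is current so wglGetProcAddress works; error handling is elided. Whether the driver then actually strips the deprecated entry points is exactly the open question:

#include <windows.h>
#include <GL/gl.h>

#define WGL_CONTEXT_MAJOR_VERSION_ARB          0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB          0x2092
#define WGL_CONTEXT_FLAGS_ARB                  0x2094
#define WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB 0x0002

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)
    (HDC hDC, HGLRC hShareContext, const int *attribList);

HGLRC create_gl3_forward_context(HDC hdc)
{
    /* Request a 3.0 context with the forward-compatible flag set. */
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0 /* terminator */
    };
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
        wglGetProcAddress("wglCreateContextAttribsARB");

    if (!wglCreateContextAttribsARB)
        return NULL; /* no GL3 context creation: stay on the legacy context */
    return wglCreateContextAttribsARB(hdc, NULL, attribs);
}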

Feature-wise, even 2.1 is not up to D3D10. Yes, if you take all the NV extensions it mostly is, but that doesn’t really count. What’s missing in core GL is stuff like custom resolve for multisampling, mixed FBO formats, conditional render, direct state access (!! for everything!), geometry shaders and some other things. And for GL3’s sake, don’t drag GL 2.1 along with it.

If I really wanted to work with the non-deprecated subset of GL3, I would need the direct-state-access extension to work for everything, so that might be something to take special care of.
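
To make the direct-state-access point concrete, here is a small sketch contrasting classic bind-to-edit with EXT_direct_state_access (glTextureParameteriEXT comes from that extension; defining GL_GLEXT_PROTOTYPES exposes its prototype in glext.h, otherwise load it like any other extension entry point):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

void set_min_filter(GLuint tex, GLuint previously_bound)
{
    /* Classic GL: editing `tex` disturbs the current binding, which
       then has to be restored by hand. */
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glBindTexture(GL_TEXTURE_2D, previously_bound);

    /* EXT_direct_state_access: name the object directly, no bind dance. */
    glTextureParameteriEXT(tex, GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                           GL_LINEAR);
}

The first half is the state-machine dance GL3 still requires everywhere; the second is what the “for everything!” above is asking for.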

Jan.