"NVIDIA releases OpenGL 3.2 beta drivers"

I’m posting this because I’m pretty sure that some nVidia people will read it, and this is actually a suggestion for future releases!

I’m soooooo fed up with some of nVidia’s bad communication practices, such as this news. I actually think I’m even more fed up with all the websites forwarding this news, and it’s quite a shame to read it even on OpenGL.org.

Drivers 190.56 are NOT OpenGL 3.2 drivers. They are OpenGL 3.1 drivers with some OpenGL 3.2 features, just like ATI’s drivers are … We got the same kind of news with the OpenGL 3.0 and OpenGL 3.1 releases.

I love a lot of nVidia’s stuff, cards, docs, projects; I just suggest that nVidia stop this fallacious communication. Sooner or later, I believe, it is going to hurt the company’s reputation.

Actually, all that is left is a context-creation-flag-related issue and moving geometry shader support from the ARB extension into core.
This is according to http://developer.nvidia.com/object/opengl_3_driver.html

Doesn’t seem that far from OpenGL 3.2, does it? Of course, reporting GL_VERSION as OpenGL 3.1 would be more appropriate.

I would also point out that it is a beta driver. That means it may not do everything it promises to do correctly.

GLSL 1.50 is not completely exposed in this driver. Trying to use the new “interface” varyings in particular gives an error that I need to enable the extension GL_NV_gpu_shader5, which is not exposed.
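For reference, the construct in question is one of the new interface blocks for varyings. A minimal sketch (hypothetical shader, not my exact code) of what the driver rejects with that error, written as the usual C string passed to glShaderSource:


      /* GLSL 1.50 vertex shader using an "out" interface block
         (hypothetical minimal example, fed to glShaderSource as usual): */
      static const GLchar *vs150 =
          "#version 150\n"
          "in vec4 position;\n"
          "out VertexData {\n"          /* interface block, new in GLSL 1.50 */
          "    vec3 color;\n"
          "} vs_out;\n"
          "void main() {\n"
          "    vs_out.color = vec3(1.0);\n"
          "    gl_Position = position;\n"
          "}\n";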

(I understand these are beta drivers, I’m just trying to make the issue known)

I was about to try the new ARB_sync.
To my surprise this extension is not supported :frowning:
The driver is far from the OpenGL 3.2 spec.

(win32/xp/GF8800)

Hmm, beta

Hmm, alpha?

Most of these features were approved on the 3rd of July and the 24th of July … could we actually expect a full implementation that soon?

No, and that’s fine; development takes time. Having features as soon as possible is great, and that’s all we can expect.

I hope pointing this out will rebalance the debate about ATI’s drivers supposedly being 6 months behind nVidia’s … Well, that’s not the case, even if I agree nVidia is further ahead on this topic.

ARB_sync is part of core OpenGL 3.2. Therefore, you do not need to look for the extension. The entry points are there.

Regards,
Barthold
(With my NVIDIA hat on)

I am sorry, shame on me. I was misled by OpenGL Extension Viewer, which reports OpenGL version 3.1.0 and no ARB_sync extension in the list, so I gave up too early.

I wrote a small test app. GL_VERSION actually returns 3.2.0 and the whole ARB_sync API is there. My test shows that ARB_sync works fine, so I can start coding.
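Roughly, the test does something like this (a sketch from memory, assuming a 3.2 context is already current and the 3.2 entry points are loaded):


      /* Report the context version, then exercise ARB_sync: */
      printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION)); /* 3.2.0 here */

      glClear(GL_COLOR_BUFFER_BIT);                        /* some GL work to wait on */
      GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
      GLenum res = glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                                    1000000000);           /* 1 second timeout, in ns */
      if (res == GL_ALREADY_SIGNALED || res == GL_CONDITION_SATISFIED)
          printf("fence signaled\n");
      glDeleteSync(fence);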

Thanks!

/marek

The current 3.2 beta drivers have an issue where a 3.2 context (without FORWARD_COMPATIBLE_BIT) does not honor compatibility behavior, whereas 3.1 and 3.0 contexts do. Hopefully an easy fix-up.

Symptoms are things like glGetString( GL_EXTENSIONS ) returning NULL and GL errors being thrown. Flipping back to a 3.1 context remedies that.

(NVidia 190.18.03 drivers)

Are you creating a Core profile or Compatibility profile? glGetString(GL_EXTENSIONS) is deprecated, and hence not available in the Core profile.
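In a Core profile the list has to be queried per index instead, via glGetStringi (core since 3.0). A minimal sketch, assuming the entry point is loaded:


      GLint num_ext = 0;
      glGetIntegerv(GL_NUM_EXTENSIONS, &num_ext);
      for (GLint i = 0; i < num_ext; ++i)
          printf("%s\n", (const char *)glGetStringi(GL_EXTENSIONS, i));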

Thanks!
Barthold
(with my NVIDIA hat on)

Talking about drivers, here’s an interesting link:

http://www.mcadforums.com/forums/files/autodesk_inventor_opengl_to_directx_evolution.pdf

In a nutshell: per that paper, OpenGL might as well be used as a software renderer only.

Sorry, I think I’m at fault here for not understanding something.

I confess I didn’t realize the distinction between compatibility and forward-compatible. I assumed that by not specifying forward-compatible I got backward compatibility (as was the case <= GL 3.1).

Now I think I see the distinction:

  • PROFILE_MASK = COMPATIBILITY -> Backward compatibility included
  • PROFILE_MASK = CORE -> Backward compatibility killed
  • FLAGS = 0 -> Deprecated but still supported APIs included
  • FLAGS = FORWARD_COMPAT -> Deprecated but still supported APIs killed

So for the most lax profile (include all the old stuff), you want PROFILE_MASK = COMPATIBILITY and no FLAGS = FORWARD_COMPAT. And for the most strict profile, you want PROFILE_MASK = CORE and FLAGS = FORWARD_COMPAT.
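If I read the create-context extensions right, the strict combination is just an attribute list like this (a sketch using the GLX token names; the WGL equivalents exist too):


      static int Strict_attribs[] =
      {
        GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
        GLX_CONTEXT_MINOR_VERSION_ARB, 2,
        GLX_CONTEXT_PROFILE_MASK_ARB,  GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
        GLX_CONTEXT_FLAGS_ARB,         GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        None
      };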

(updated) I was using the default options of PROFILE_MASK = CORE and no FLAGS = FORWARD_COMPAT, which kills backward compatibility.

Very insightful link, even if I wonder why it is posted here.

“We use OpenGL SW (GDI Generic) in all our QA because at least we get the same consistent result”.

Back at my previous job we were also using the OpenGL software renderer, to be sure that at least it would run … well, it ran so slowly that it could not actually be used …

I would say that, from their point of view, all these OpenGL spec releases must look like a nightmare.

Autodesk is building long-term software on really old code (probably older than 15 years in places), which must be quite horrible. That was exactly the case at my previous job, so adopting new features, even shaders, was long and painful.

What such software needs is stability. They are not going to update their “rendering engine (mess)” for each release.

However, I believe the main fault is on their side, like it was at my previous company: “OpenGL is just an API”, “Our software’s goal is not OpenGL / Direct3D rendering, it’s raytracing and modeling tools”. So basically, if I had a look at the code I would expect to see no OpenGL engine at all, just OpenGL calls scattered across 20,000,000 lines of code.

Well, I exaggerate a bit, but that’s the basic idea. Code that is very complicated to maintain leads to nothing good. I use 3DS Max 2010 from time to time and I quite like it, but the OpenGL renderer is not a stable option, the Direct3D renderer is “ok” but doesn’t look good, the few shader effects look like hacks in the code, and the overall rendering is freakishly slow.

Finally, I think OpenGL is a valid option for this kind of software when compatibility matters, using the fixed pipeline. I agree that Direct3D 10 would be a great platform for advanced rendering because its constraints are really strict.

I totally agree. But it seems that the new GL specs did not help much with driver quality and stability. One thing that drew my attention in that paper is that GL lacks QA. It’s the job of the API users to make sure things work, and they are left with a big question mark over whether a bug is on their side or the driver’s.

I’m not criticizing the API itself here, nor the IHVs who implement it. Some party should take charge of a solid GL SDK and leave the IHVs with a minimal driver implementation… like in D3D :smiley:

I’d rather see Khronos investing time and money in the development of a conformance test suite for OpenGL.

I am always nervous when installing new drivers; usually something gets broken with new drivers. (using NV)
We have a list of HW/driver combinations that work well. When a user calls the support line, the first question we ask is which HW/driver they have.

Conformance test! Excellent!

Excellent idea indeed. It’s just that the idea has been around for ages, but the ARB does not have the resources to implement such a thing. That is why we don’t have a conformance test now, and I am pretty sure we won’t get one in the future.

It’s sad, but the fact is that D3D IS a better option if you can afford to support only Windows (Vista).

Jan.

Ok, thanks to your tip, I now see I was creating a “core” context (since that is the default for glXCreateContextAttribsARB with a 3.2 context).

For now, I want a compatibility context, which I gather I would normally create (per-spec) via:


      static int Context_attribs[] =
      {
        GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
        GLX_CONTEXT_MINOR_VERSION_ARB, 2,
        GLX_CONTEXT_PROFILE_MASK_ARB, GLX_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,
        None
      };
      Context = glXCreateContextAttribsARB( getXDisplay(), FBConfig, 0, 
                                            True, Context_attribs );

but per the NVidia driver release page, I need to use the old glXCreateContext() call for now, since glXCreateContextAttribsARB doesn’t yet support PROFILE_MASK (attempting to use it generates a BadValue X error from GLX).
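So for now the fallback looks something like this (a sketch, reusing getXDisplay() and FBConfig from the snippet above):


      /* Legacy context creation; glXGetVisualFromFBConfig bridges from the
         FBConfig I already have to the XVisualInfo the old call wants. */
      XVisualInfo *vi = glXGetVisualFromFBConfig( getXDisplay(), FBConfig );
      Context = glXCreateContext( getXDisplay(), vi, 0, True );
      XFree( vi );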

Thanks for the tip.

Specification writing doesn’t fix bugs, only the IHV-developer feedback loop and IHV-invested engineering time can do that. Forum posts about bugs don’t count.