The end of 12 years as an OGL developer

I just realized that a huge portion of my graphics code has been marked as “deprecated” by the OGL 3 spec.

For me, this is the time to move to D3D. It’s sad to say, because I began developing graphics apps on Silicon Graphics machines, 12 years ago. I loved those machines… everything was superbly engineered, both hardware and software. But that hardware died years ago, and the software remained… until today, when it definitively dies too.

It was a nice ride. Time for Microsoft now.

Bye bye.

FWIW, I also started yesterday to mark the hopefully-to-be-deprecated sections of my GL include headers.

On the contrary, I’m glad to be able to cut through the enormous amount of (for me) never-used overhead and complexity; and I look forward to a widely supported OpenGL 3+ with the deprecated features officially removed!

Since Direct3D also treats a lot of these features as deprecated, wouldn’t it be better to simply adapt to the new way of doing things in OpenGL?
Oh, well, for me it has never been a problem, since I have always checked the NVIDIA and ATI sites for documentation on the ‘modern way’ of doing things.

Yes, that really sounds like a good idea. Move from an API where some features are deprecated to an API where the same features are already removed.

Good luck with that :wink:

I don’t see how this is a “suggestion” or why Direct3D (which has already removed all the features you’re talking about) should be better. But do it your way. It’s not like pre-3.0 GL is going anywhere; the support will be there for years and years, it will just be frozen. That way, your post doesn’t make any sense: if you have old code and just have to maintain it or make minor improvements, use pre-3.0 GL. If you want to move to modern GPU programming, using shaders and so on, you HAVE to rewrite most of your rendering code, be it pre-3.0, 3.0 or Direct3D. Basically, you are just trolling.

12 years in GL and your very first post on the main OpenGL forum? Or perhaps a 12 year old boy with nothing better to do with his time?

Hmm… maybe I didn’t understand it right. I thought the 3.0 spec said that those features that are deprecated in 3.0 can stop working in future OGL releases. But if pre-3.0 code will run in 4.0, I shut up.

Nope, you didn’t win the prize: I’m a usenet oldie.

Drivers can (and do) provide both GL2 and GL3 contexts.
So you can continue to use GL2 for quite some time.
Read this, especially points 3 and 6: http://developer.nvidia.com/object/opengl_3_driver.html#faq
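To make the “both GL2 and GL3 contexts” point concrete, here is a minimal sketch in C of how a Windows app opts into one or the other. The function and token names come from the real WGL_ARB_create_context extension; the sketch assumes an HDC with a pixel format already set and a temporary legacy context made current, so that wglGetProcAddress can resolve the entry point. If the driver doesn’t offer the extension, you just stay on the classic GL2 context.

#include <windows.h>
#include <GL/gl.h>

/* Normally pulled in from wglext.h; defined here so the sketch is self-contained. */
#ifndef WGL_CONTEXT_MAJOR_VERSION_ARB
#define WGL_CONTEXT_MAJOR_VERSION_ARB 0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB 0x2092
#endif
typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);

HGLRC create_context(HDC hdc, int want_gl3)
{
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB;

    if (!want_gl3)
        return wglCreateContext(hdc);              /* classic GL2-style context */

    wglCreateContextAttribsARB = (PFNWGLCREATECONTEXTATTRIBSARBPROC)
        wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return wglCreateContext(hdc);              /* driver only offers legacy contexts */

    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        0                                          /* attribute list terminator */
    };
    return wglCreateContextAttribsARB(hdc, 0, attribs);   /* explicit GL3 context */
}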

Sounds like an advertisement for Microsoft here. I’m trying to stay cross-API and have OpenGL and Direct3D rendering interfaces at an early stage, but JUST moving to Direct3D isn’t as easy as you might think; it’s hard as hell to port from one to the other. So either good luck to you, or roll with OpenGL 3 (since it’s related to your existing OpenGL code anyway), or give up.

Bye bye

Hmm… maybe I didn’t understand it right. I thought the 3.0 spec said that those features that are deprecated in 3.0 can stop working in future OGL releases. But if pre-3.0 code will run in 4.0, I shut up.


Again: the GL2.x support isn’t going to disappear! Future drivers will still ship with GL2.x. What is changing is the programming model for future releases, so the deprecation only matters if you want to develop against GL 3.0 and later versions. The old GL will still be here, frozen.

BTW, Direct3D gets a new, not backwards-compatible, version every few years.

If the ARB hasn’t published such a backwards-compatibility policy, they should, because, for example, I wouldn’t have started this thread if I had read it. I’m aware that NVIDIA doesn’t plan to drop features, but unfortunately it’s not guaranteed that all my users have an NVIDIA card, so this policy has to be enforced by the ARB, not by NVIDIA.

It’s not the job of the ARB to tell vendors what profiles they have to support in their products. But GL2 is just a subset of GL3, because GL3 didn’t remove anything. So vendors would be absolutely stupid if they only support GL3, but not GL2. And this is exactly what nVidia is saying: as long as their hardware supports the GL3 profile, it will also support GL2. They’re just stating the obvious.

In the (far?) future, there might be hardware that will not support GL3.0. That hardware generation will of course not support GL2 either. But your users will have the choice of whether they want to buy it then.

CatDog

It would be sort of… what is the word I’m looking for… stupid, if your “old” program doesn’t run.
So you think that ATI is not going to offer GL 1.1?
The code is already written and ready to be compiled over and over again for the next 20 years.

If the ARB didn’t publish such backwards compatibility policy, they should publish it, because, for example, I wouldn’t have started this thread if I had read it.

The ARB has published their policy on removing features, and it’s very clear.

Anything that is to be removed will be marked deprecated in a version before it is removed. After it is removed from core, that functionality will be placed in a core extension, which users can query as normal. This is done on a version-to-version basis. That is, a feature may be core in 3.1, deprecated in 3.2, and an extension in 3.3.
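To make the “query as normal” part concrete, here is a minimal sketch in C of checking for such a core extension from a 3.x context, using glGetStringi (the GL3 replacement for parsing the old space-separated extension string). It assumes the GL 3.0 tokens and entry points are available, e.g. via glext.h and an extension loader on platforms that need one; the extension name shown at the end is only an illustration.

#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_NUM_EXTENSIONS, glGetStringi (may need a loader on Windows) */

/* Returns 1 if the current context advertises the named extension. */
static int has_extension(const char *name)
{
    GLint count = 0, i;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (i = 0; i < count; ++i) {
        const GLubyte *ext = glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp((const char *)ext, name) == 0)
            return 1;
    }
    return 0;
}

/* e.g. fall back to the modern path if the deprecated one is gone:
   if (!has_extension("GL_ARB_compatibility")) { ... use the 3.x code path ... } */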

this policy has to be enforced by the ARB, not by NVIDIA.

The ARB can’t even enforce conformance to their own spec! What would make you think that they can enforce a guideline?

If the point he’s making is that an individual vendor has unfair influence over which features are (or aren’t) officially deprecated, then I could probably see it if I squint real hard - but by the same token, what’s sauce for the goose…

I have a good example of a guideline they didn’t manage to enforce…

http://www.opengl.org/registry/doc/GLSLExtensionRules.txt

lol, I’ve never seen anyone doing this!