PDA

View Full Version : Shouldn't OpenGL set new features faster?



coco
01-23-2001, 04:38 PM
Shouldn't OpenGL grow new features faster? I mean, the only good point I find about DirectX 8 is the new programmable pixel and vertex shaders. I feel they are going to rock.
Yes, I know that hardware vendors can expose such features (and more) through extensions, but shouldn't OpenGL be introducing new features to push hardware vendors a bit, rather than playing catch-up?
I also know that the imaging subset of OpenGL 1.2 is great and innovative, but I think developers want other innovations such as skinning, per-pixel shading (not just NVIDIA's extensions, but a requirement), better hardware curved surfaces, volumetric fog, etc., etc. ...

mcraighead
01-23-2001, 07:04 PM
The first problem is that you are using OpenGL as the subject of a sentence. OpenGL is not a sentient being, and therefore it by itself can't add any features. Instead, you should be looking at the people who actually make the OpenGL and DX standards: the ARB and Microsoft.

Let's just say that Microsoft, for all its stodginess, is a lot more nimble than the ARB.

The ARB is also at a disadvantage for political reasons. MS can talk to all the IHVs and pick features for DX without worrying about IP issues. But at ARB meetings, everyone is paranoid about IP issues. What do you have to gain by talking about your great new feature, when that just tips off your competitors?

Me? I'm quite happy with the extensions process the way it works now.

- Matt

coco
01-23-2001, 08:16 PM
I understand.
I admit that it's remarkable that OpenGL, with its extension mechanism, is still a top-notch API. In fact, once you've got the extension's function pointers, enums, etc. (which is pretty simple), the extended OpenGL works seamlessly.
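The first step of that mechanism is checking whether the driver advertises the extension at all. A minimal sketch in C (the `has_extension` helper is illustrative, not from this thread): it tests the space-separated string returned by `glGetString(GL_EXTENSIONS)` for an exact name, since a plain substring search can match the wrong extension.

```c
#include <string.h>

/* Check a space-separated extension string (as returned by
 * glGetString(GL_EXTENSIONS)) for an exact extension name.
 * A bare strstr() is not enough: "GL_EXT_texture" would also
 * match inside "GL_EXT_texture3D". */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        /* Match must start at the beginning or right after a space... */
        int starts = (p == ext_list) || (p[-1] == ' ');
        /* ...and end at a space or at the end of the string. */
        int ends = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

Once the check succeeds, you fetch the entry points named in the extension spec (via `wglGetProcAddress` on Windows, `glXGetProcAddress` on X11) and use the extension's enums as if they were core.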
What worries me is that OpenGL runs on a lot of platforms, and each one has its own extensions, so developers have to code almost as many special cases as there are different cards they target.
A quick example would be the per-pixel shading extensions: ATI's implementation is different from NVIDIA's, and the rest don't even have it, so you end up coding the ATI version, the NVIDIA version, and the "rest of the world" version, probably uglier or more complicated (multipass, etc.).
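That fan-out is typically handled by picking a render path once at startup. A hypothetical sketch: the extension names are real per-pixel extensions of that era, but the enum and `choose_path` function are illustrative, and a real check should match whole extension names rather than the bare `strstr` used here for brevity.

```c
#include <string.h>

typedef enum {
    PATH_NV_REGISTER_COMBINERS, /* NVIDIA per-pixel path        */
    PATH_ATI_FRAGMENT_SHADER,   /* ATI per-pixel path           */
    PATH_MULTIPASS_FALLBACK     /* "rest of the world": multipass */
} render_path;

/* Pick a per-pixel shading path from the driver's extension list.
 * In real code the list comes from glGetString(GL_EXTENSIONS). */
static render_path choose_path(const char *ext_list)
{
    if (strstr(ext_list, "GL_NV_register_combiners"))
        return PATH_NV_REGISTER_COMBINERS;
    if (strstr(ext_list, "GL_ATI_fragment_shader"))
        return PATH_ATI_FRAGMENT_SHADER;
    return PATH_MULTIPASS_FALLBACK;
}
```

The rest of the renderer then branches on the chosen path, which is exactly the per-vendor special-casing the post complains about.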
Is there a way to define a standard set of extensions required for next-gen cards, like what OpenGL 1.2 did? That way, for a card to be OpenGL 1.3 (heh) compliant, it would be required to implement that feature set (say pixel shaders, vertex shaders, etc.).

werasfdasdas
01-30-2001, 03:34 PM
You forget, MicroShafted are on the ARB. Nuff said... (no, I am not going to go into one of my conspiracy theories.. no... NOOOO...)