nVidia-focused development in OpenGL

I have an nVidia card. I want (and am supposed) to write code compatible with any (or most) consumer cards, i.e. what the average gamer owns. As far as I can see, besides nVidia these include at least ATi cards, and other companies seem to be rising from their slumber to give nVidia competition (e.g. the SiS Xabre, Matrox Parhelia). I ain’t John Carmack, and I don’t have a cart full of different ATi, nVidia and whoever-else cards to test my code on. In fact, in OpenGL I don’t just have to TEST: I have to write the same effect in different ways, using different extensions, for ATi, nVidia, etc. cards, and then test. Well, how am I supposed to do that on nVidia alone? No problem, I can buy a Radeon or something else, but it is obviously silly to keep swapping them in one machine, so must I have a whole park of machines just for writing code, never mind testing?

Conclusion (if you disprove it, I will be grateful):
OpenGL is a very loosely standardized API, and though I like OpenGL’s interface much more than D3D’s, it seems that I must switch to D3D. In D3D everything is standardized: if my code works on some card, it will work on any card with the same hardware capabilities, no matter who the vendor is. In OpenGL, anyone who designs a new card can think up a whole bunch of extensions (GL_VASYA_PUPKINS_COOL_FEATURE); do I need them? In real programs? nVidia dominated the market for some years, yes, but now it is not so clear-cut, and I can’t assume that most of my end users run nVidia hardware.

Question #1:
Where are the standard extensions for shaders? It’s been a long time since D3D8 appeared, and it has a standard interface…

Question #2:
What is the sense in using OpenGL for real game coding, not just tech demos?

This problem of standardization has been much discussed in most of the latest looooooong threads…

There is no doubt that GL is behind D3D in terms of standardization, but the fact that each vendor can produce its own extensions is, for me, one of the advantages of GL (although it can be a pain with pixel/vertex shaders): you usually get to use the latest features of the latest cards in GL before you can use them in D3D.

One advantage of OpenGL is still that it is multi-platform.

The standard extension for pixel shaders may be included in GL 1.5 (GL 1.4 should have something standard for vertex programs). GL 2.0 is a proposal that offers what you are looking for (although it is far from ready, whatever other people may say).

I don’t quite get your 2nd question since you mentioned John Carmack in the very same post (doesn’t he develop real games with OpenGL ???).

All in all, I am not sure whether switching to D3D will solve all of your problems but if you think so, then why not !

Regards.

Eric

nope, eric, carmack only codes tech demos. others then use them for games…

Lol Perman - that was kinda-funny. hehe.

2Eric:

Well, I must have meant real games using shaders.

Switching to D3D not only won’t solve my problems, it will add further headaches. IMHO D3D is over-object-oriented; I mean, an object-driven interface is of course convenient, but everything has its limits.

Despite all the present shader limitations, I have found them (particularly vertex shaders) to be a very efficient way of generating animation. Well, it works OK using NV_vertex_program1_1, but it’s NV! So nothing except NV hardware will run it correctly, while Direct3D shaders run fine (even their software emulation is fast enough) on any card.

As for Mr. Carmack’s real games, I read through his notes on the Doom engine. He says things like: “Well, this card screwed up my console renderer, hence it is not good (GF4MX)… The Radeon can use more textures simultaneously than the GeForce, so I rewrote [some effect] to use the Radeon’s multitexturing, and it is good.”

So it seems he writes his Doom not for OpenGL, but for different cards’ extensions. He mentions some fallbacks for cards not capable of doing something in hardware, like using multi-pass instead of multi-texture on older cards, but that doesn’t change the point.

As far as I can see, he focuses not on generic hardware capabilities, but on particular vendors’ implementations of those capabilities.

btw, isn’t Doom a tech demo right now?

[This message has been edited by mech (edited 07-08-2002).]

Originally posted by Eric:
One advantage of OpenGL is still that it is multi-platform.

Yes, this is true. There is an opengl implementation on the Gamecube, I’m led to believe. But you can guarantee what extensions are available to you on that platform, unlike the PC or Mac.

>>> Conclusion (if you disprove it, I will be grateful):
OpenGL is a very loosely standardized API, and though I like OpenGL’s interface much more than D3D’s, it seems that I must switch to D3D. In D3D everything is standardized: if my code works on some card, it will work on any card with the same hardware capabilities, no matter who the vendor is. In OpenGL, anyone who designs a new card can think up a whole bunch of extensions (GL_VASYA_PUPKINS_COOL_FEATURE); do I need them? In real programs? nVidia dominated the market for some years, yes, but now it is not so clear-cut, and I can’t assume that most of my end users run nVidia hardware.
<<<<

I don’t think OpenGL is loosely standardized. Standard OpenGL is well founded, while D3D has evolved piece by piece. Extensions give vendors the power to offer something special.

Anyway, what’s the point of talking about this. It sounds like you know your tools, and you know what your goals are.

This post belongs in the suggestions forum. Just go and say, “OpenGL sucks because it doesn’t have standardized pixel and vertex shaders”.

V-man

OpenGL does not suck. The guys from the ARB are just a bit slow.

Gamecube has OpenGL? That’s a new one on me. I think whoever told you that is confusing it with the main API, which is based on GL. It’s practically the same but with the gl prefix changed. It also has a lot stricter restrictions, due to the hardware.

But it isn’t really OpenGL. It just looks like it.

Apparently, Nintendo chose OpenGL instead of its own proprietary API to ensure third-party games would be released quicker. Remember how the N64 didn’t have many games and wasn’t a great seller?

Some job offers I see ask for OpenGL + Gamecube, so it must be true that they are using GL.

PS2 has its own thing still, I think. The XBox is on the wrong side of the street

V-man

D3D is standardized? roflmao

Haha, when we developed our last DirectX interface here, for version 7, we had a whole eight test PCs with all different types of hardware, because D3D was incompatible as hell!

I personally don’t think that Microsoft is responsible for it; rather, the vendors make drivers which do not follow the specifications. But the fact that our D3D expert here at the firm screamed loudly when he tried to run the project on a Radeon 8500 for the first time, after concentrating for weeks only on the GeForce 3, tells me that even nowadays there is still an incompatibility issue in Direct3D if you use the newest hardware features.

To question 2:
What is the sense in using D3D instead? Whether you use Direct3D or OpenGL, you will always have to buy an ATI card as well if your engine is to get great FPS in the end and be cutting edge. The cards are far too different for you to say “if I use Direct3D it works fine on every machine”. Well, if you use no extended features except hardware T&L, that may be right, but only in that case. Does nVidia offer TruForm? No, it doesn’t. And does ATI offer all nVidia features? No, it doesn’t either. Does the GeForce 4 offer 8 texture units? Nope. But the Radeon 8500 does. So you will never get around having both cards, and writing speed-critical things for a couple of possible cases: on one card you have to render the object multipass, on the other you can do it in one pass. And so on and so on.

We will support nVidia and ATI, and with this nearly 100% of the market is covered anyway. Whoever buys exotic cards has to expect that it will not run at full speed on them. Above all, we will now drop the 3dfx crap completely. They made the most non-standard drivers of all time, and as the expense of supporting these cards is in absolutely no relation anymore to the possible profit from the people who still use them, we simply decided: You have a Voodoo? Buy yourself a GeForce.

And if you want an ATI for free: build a great-looking demo using the newest nVidia technology, especially using all the features ATI does not offer. Then read a bunch of ATI feature documentation. Go to any game conference like E3 or ECTS with a notebook rented from a friend. Go to the people representing ATI there, show them your demo and say “Well, I would like to support all of ATI’s new features like … as well, but as I don’t have the money to buy your new cards…” and so on. We did the same, and just two weeks later we got a package with 5 Radeon 8500s.
The big vendors will “help” you be able to develop for their newest cards if you go about it professionally. So you don’t need to worry about this.

I’m also moving more and more to Direct3D. But surely not because of ATI or nVidia; simply because Direct3D is becoming more and more established, and to be able to later port our products to consoles like the Xbox.

BlackJack

like moths to the flame, i luv it
2/ according to the game charts this year, if u wanna get a number 1 3d game in the US it seems opengl is the way to go.
Q/ has there been a single d3d one?

Originally posted by Robbo:

Lol Perman - that was kinda-funny. hehe.

davepermen… its one word with 3 e, and only one a, and the a is in dave, not in permen…

Some job offers I see ask for opengl + gamecube, so it must be true that they are using gl.

Dude, I’m working on Gamecube now, and it is NOT OpenGL, although it looks similar.

Most consoles have always had their own proprietary API to match the hardware. The only exception to this was the Xbox with DX.

Nutty

Originally posted by davepermen:
davepermen… its one word with 3 e, and only one a, and the a is in dave, not in permen…

Don’t confuse the issue `veper’.

Originally posted by Nutty:
[b] Dude, I’m working on Gamecube now, and it is NOT OpenGL, although it looks similar.

Most consoles have always had their own proprietary API to match the hardware. The only exception to this was the Xbox with DX.

Nutty[/b]

Yes, well I too assumed it used GL because of the job adverts I’ve seen, in mags like Edge and Develop.
So it’s similar - that’s good enough for me.
Any jobs going at your joint, Nutty?

2zed:

1st in the charts is The Sims.
Most of the others are Quake-powered games… So it’s not about GL.