What is happening?



Zak McKrakem
03-03-2005, 06:26 AM
David Blythe was the principal engineer with the advanced graphics software group at SGI, a person who has created courses about OpenGL, has been a representative on the ARB, and recently was one of the fathers of OpenGL ES (and the OpenGL ES Specification Editor). I read last year that David Blythe is now part of the Direct3D developer team. In fact, he has given some presentations about Direct3D (http://www.google.com/search?q=%22David+Blythe%22+Microsoft)

But now I have read that Kurt Akeley, co-founder of SGI, one of the fathers of OpenGL, the person who signed the OpenGL 2.0 specification (with Mark Segal), who has contributed to extensions like frame_buffer_object and vertex_buffer_object, and the person who had a Lotus with license plates that read “OpenGL” (http://www.opengl.org/about/arb/notes/minutes_12_94.txt), is now also at Microsoft.

What is happening? Is everybody turning to the dark side?

Who will be next? Mark Kilgard? Jon Leech?

How does this affect the future of OpenGL?

Aeluned
03-03-2005, 07:22 AM
money. the root of all evil.

zed
03-03-2005, 08:28 AM
How does this affect the future of OpenGL?
one word: PS3.
personally i'd be more worried about the future of d3d

dorbie
03-03-2005, 09:59 AM
How does this affect the future of OpenGL?
Probably scare some people like you, and that's about it.

Akeley designed OpenGL 1.0 with Segal; the origins of OpenGL 2.0 are more complex, but he contributed along with *many* others. AFAIK he's working for Microsoft in China.

The last thing Blythe played a central role in before Microsoft was OpenGL ES. His move to Microsoft may be related to embedded efforts there, but he could do a lot of things.

This is old news, the sky is not falling. Asking who's next is kinda silly. Blinn and Kajiya have worked at Microsoft as researchers for years, scores of other people don't (yet :eek: ). :rolleyes:

Mark Kilgard
03-03-2005, 10:38 PM
Originally posted by Zak McKrakem:
Mark Kilgard?
OpenGL has a specular, I mean bright, future.

NVIDIA has made a rock-solid commitment to OpenGL and I've been amazingly fortunate to participate in that commitment. NVIDIA's OpenGL driver is the most functional, best performing, and most stable implementation of OpenGL available. Given the kind of sustained commitment NVIDIA has given OpenGL, I'm quite happy and proud to call NVIDIA my employer. What's been accomplished is really a testament to the passion of hundreds of top-notch software engineers, hardware designers, and 3D architects here at NVIDIA. That passion permeates all aspects of NVIDIA's product development.

Wow, think about what the RIVA 128 was seven years ago. No 32-bit color, no 32-bit depth/stencil, no 32-bit RGBA8 textures, everything was 16-bit, no sub-pixel positioning, only a small subset of OpenGL's blend modes supported; the most basic texturing was fancy back then. But the RIVA 128 was a great chip for its time, with a full OpenGL Installable Client Driver (ICD) for Windows.

Now think about GeForce 6800 today. Vertex programs can access non-power-of-two floating-point textures! Fragment programs can branch on data-dependent values computed within your shader in full 32-bit floating-point! If one 6800 isn't fast enough for you, put two in your SLI system. I've watched it all but still can't help but be impressed.

In those seven years, OpenGL transformed itself from a hardware-amenable graphics state machine (with a fair amount of quirks--think color material, feedback, evaluators) into a first-class platform for programmable graphics. Yet, there's still 100% complete API compatibility going all the way back to OpenGL 1.0, nearly fifteen years. And cross-platform support too! Think about it: While native window system APIs are frustratingly different across different systems, OpenGL rendering code can recompile and run natively and fully hardware accelerated across Windows, Mac, and Linux systems. Fully porting a sophisticated graphics user interface between Mac, Windows, and Linux systems can take several man-years, but OpenGL rendering code just recompiles (possible lesson: render your GUI in OpenGL!). State-of-the-art 3D programmable floating-point shading is more portable than trying to create a scroll bar! Think about this: when it comes to API calls, glBegin and glVertex3f are as ubiquitous today as malloc and strcpy.

And NVIDIA provides access to the full GeForce 6 Series 3D feature set through OpenGL (even functionality not exposed in the other 3D API, such as hardware-accelerated accumulation buffers, border texels, depth clamp, depth bounds test, multisample coverage control, and stencil clear tags).

Also, rather than force developers into a single high-level language, NVIDIA has given you the option to pick among the OpenGL-centric OpenGL Shading Language standard, various assembly representations that expose the FULL underlying programmable hardware functionality, or Cg, which allows OpenGL-based content creation applications to produce shader-based 3D content in a high-level shading language that's not tied to OpenGL.

If there's one thing about OpenGL that I've been frustrated by, it's the short-sighted decision to hide programmable shading behind a single high-level hardware shading language that is overly tied into OpenGL, to the point that an optimizing compiler is wedged into the driver. Yes, there are ARB-standardized assembly extensions, but NVIDIA is the only vendor exposing the latest GPU functionality in both high-level and assembly forms.

Face it: Shader programs are part-and-parcel of modern 3D content today. To render contemporary 3D content, you need geometry, textures, and... shaders. You wouldn't base your 3D application around an image file format or 3D model format that could render ONLY with OpenGL. Anyone for the OpenGL Image Format or the OpenGL Model Format? Instead, you pick API-neutral formats (TGA, JPG, whatever) for content. But shaders written in the OpenGL Shading Language aren't neutral; they shackle themselves to OpenGL.

Hey, what's wrong with shackling content to OpenGL if people (in this forum at least) love OpenGL? It's not just being shackled to OpenGL. It's being shackled to a particular weight-class of OpenGL found in PCs today, when that weight-class is very likely to be unsuited to exciting future 3D consumer devices.

I'm all for programmer-productive authoring of shaders in high-level shading languages (hey, I even co-authored a book about just that), but do we need to jam a compiler into the driver? I think it was a bad move (even if I did wind up reluctantly implementing it in NVIDIA's OpenGL driver). Adding OpenGL Shading Language support bloated NVIDIA's OpenGL driver by something shy of a megabyte. (Don't be too surprised; that's about the size of any good optimizing compiler implementation these days, plus you gotta throw in the standard library.)

Think about what happens if we add some new language feature to the OpenGL Shading Language. Pick your favorite C++ or Java feature. Say a feature to make shader writing more object-oriented. For example, Cg has a wonderful "interface" construct similar to what Java provides to make shader design more modular and abstract.
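A rough sketch of what that interface construct looks like, written as the usual C string of shader source (the "Light" and "SpotLight" names here are invented for illustration, not from any shipping shader):

static const char *lightCg =
    "interface Light {\n"
    "  float3 illuminate(float3 P, out float3 L);\n"
    "};\n"
    "struct SpotLight : Light {\n"
    "  float3 position, color;\n"
    "  float3 illuminate(float3 P, out float3 L) {\n"
    "    L = normalize(position - P);\n"
    "    return color;  // attenuation and cone test omitted\n"
    "  }\n"
    "};\n";

A shader body written against Light then works with any implementation you bind at run time, which is the modularity win being described.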

So you happily and productively embrace this new language feature. But wait, there's a catch. Anyone wanting to use your GLSL shaders written with the new language feature must download the right new driver and reboot their machine (or maybe even rebuild their kernel for Linux users).

That's a pretty big end-user burden just so you, the programmer, could use a fancy new shading language feature. And if vendor XYZ is late to release a driver with your favorite new language feature supported, your shader just doesn't work in the meantime.

Direct3D out-software engineered the ARB when it came to engineering a programmable shading language. Direct3D builds its shading language implementation into a redistributable library that you can package with your application. The library targets a (tokenized) assembly interface. So a new language feature (or compiler bug fix) can be utilized without necessitating end-user driver upgrades (and reboots) by just using the latest compiler library.

Cg makes this same wise engineering choice. There have been four Cg releases so far, and new language features get added without much fuss. Plus, the language itself is API-neutral, so your Cg shader can be used to render with the other API with few or no problems.
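A minimal sketch of that workflow, assuming the Cg 1.x runtime (the "bumpy.cg" file and its "main" entry point are hypothetical): the compiler lives in the redistributable library, and the driver only ever sees generated assembly.

#include <Cg/cg.h>
#include <Cg/cgGL.h>

void loadCgFragmentShader(void)
{
    CGcontext ctx = cgCreateContext();
    /* best fragment profile the GL driver exposes, e.g. arbfp1 or fp40 */
    CGprofile prof = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    /* compilation happens here, in the library shipped with the app */
    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "bumpy.cg",
                                             prof, "main", NULL);
    cgGLLoadProgram(prog);      /* generated assembly goes to the driver */
    cgGLEnableProfile(prof);
    cgGLBindProgram(prog);
}

Upgrading the compiler means shipping a new library with your application, not waiting on a driver release.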

Still, if you don't like either GLSL or Cg, feel free to target our assembly interfaces, which expose NVIDIA's FULL programmable functionality.
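Targeting the assembly interface directly is not much harder; a quick illustrative sketch (the program text is made up for this example):

#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>  /* tokens; on Windows the ARB entry points come
                          from wglGetProcAddress */

static const char *fpText =
    "!!ARBfp1.0\n"
    "# modulate a texture sample by the interpolated color\n"
    "TEMP texel;\n"
    "TEX texel, fragment.texcoord[0], texture[0], 2D;\n"
    "MUL result.color, texel, fragment.color;\n"
    "END\n";

void loadAsmFragmentShader(void)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fpText), fpText);
    glEnable(GL_FRAGMENT_PROGRAM_ARB);
}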

I love what Michael McCool and his students at the University of Waterloo have done with their Sh library. Their meta-shading paradigm for shader construction is the kind of novel approach I want to encourage. Having a fully-functional assembly interface facilitates this.

Have I irked anyone? I hope not. If you love the OpenGL Shading Language, hey, NVIDIA supports it quite well, including vertex textures and data-dependent branching. That's stuff no one else hardware accelerates today.

Still, if you want other options for programmable shading, at the assembly level or with an API-neutral shading language that lets you easily move your shader assets between OpenGL and the other API, NVIDIA has you covered too. You pick what suits your needs.

Honestly, it's a great time to be in the midst of 3D graphics hardware technology.

I'm confident OpenGL will stay current with the state of the art in 3D graphics performance and functionality.

I've got my complaints, however. For a few big OpenGL design decisions, I've been unhappy with the outcome. Bluntly, it's been disadvantageous for OpenGL. But you take the good with the bad and win the battles you can. Would I offer advice to an OpenGL programmer wanting to know what syntax to use for writing shaders? At one level, I'd say pick what best meets your needs. You can be confident NVIDIA is going to support whatever choice you make, even if you decide to use the other API. But if you pressed me, I'd say author shaders for OpenGL with something API-neutral, so you can reuse your shaders no matter what rendering interface you use. So I'd recommend Cg. Nobody should be surprised by that.

And yes, OpenGL has a specular future.

I hope this helps.

- Mark

SeskaPeel
03-04-2005, 04:15 AM
Mark, it's nice to see that some of you keep in touch with the "lower" community, but it's hard to read such good thoughts blended with such political ones. Is the aim of this post really to say that OpenGL's gonna shine?

NVIDIA's dogma 1 : "don't compete with your customers"
eg. don't build boards, better sell the chips

NVIDIA's dogma 2 : "don't compete with Microsoft or else they will destroy you"
eg. priority is to implement the latest DirectX, because 1/ the market asks for this and 2/ it will please MS
eg. never do more than what MS specified; it would be a waste of time & money

Obviously the software engineering team is clearly separated from the marketing one. But which of these two dictates the company's strategy ?

Even a mail that took two hours to write won't make me think that the "good" people at nVidia can really be strong enough to compete with the "bad" ones. The two dogmas came from one of those "bad" guys' mouths.

SeskaPeel.

Cab
03-04-2005, 08:25 AM
Mark,

I agree with you about 'your feeling' about OpenGL.

I used Cg but I replaced it with GLSL, mainly because the resulting code didn't work properly on ATI cards, and because of the lack of support: my 'assigned' developer relations contact at that time ended two months of 'please wait' emails with a 'we are too busy at the moment, I will try to answer you in the near future' (a future that never came). It was a problem of invariance not working on GFFX cards.

Seeing all the problems with GLSL on non-NVIDIA boards, I agree with you that maybe an intermediate pseudo-assembly could be good. But some problems arise: who will implement the intermediate compiler? The ARB? How much time will then be necessary to get a first version (one that will not make any of the ARB members happy, including NVIDIA)?
In the case of D3D it is clear that it will be MS. But in OpenGL…

How much time will it take for every new version of the assembly language extensions (arb_fragment_program2/arb_vertex_program2)? And will they be really useful, or will they become a least common denominator between ATI, NVIDIA and 3DLabs?
For example, will they include instructions like NRM? Or should we imagine that the 'unified compiler' will identify the group of instructions and convert them to a NRM instruction?
For instance, there are similar problems (or even more) using the assembly extensions on ATI cards as when using GLSL. You have to take care about instruction order, about texture indirections, …
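To make the NRM case concrete, here is a sketch of the usual three-instruction expansion in ARB_fragment_program terms (an illustrative fragment, not from any spec); a 'unified compiler' would have to recognize this pattern to fuse it back into one native instruction:

/* assume TEMP n already holds an unnormalized vector */
static const char *normalizeAsm =
    "TEMP len;\n"
    "DP3 len.w, n, n;       # len.w = n . n\n"
    "RSQ len.w, len.w;      # len.w = 1/sqrt(n . n)\n"
    "MUL n.xyz, n, len.w;   # n = n / |n|\n";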

It is good to hear from you. I would like to see a conference/tutorial from you at GDC, like some years ago. I think you are a good communicator.

daveperman
03-04-2005, 11:30 AM
Originally posted by Zak McKrakem:
Who will be next? Mark Kilgard? Jon Leech?
or me? please?

Zak McKrakem
03-07-2005, 07:08 PM
Originally posted by Mark Kilgard:

NVIDIA has made a rock-solid commitment to OpenGL and I've been amazingly fortunate to participate in that commitment. NVIDIA's OpenGL driver is the most functional, best performing, and most stable implementation of OpenGL available. Given the kind of sustained commitment NVIDIA has given OpenGL, I'm quite happy and proud to call NVIDIA my employer. What's been accomplished is really a testament to the passion of hundreds of top-notch software engineers, hardware designers, and 3D architects here at NVIDIA. That passion permeates all aspects of NVIDIA's product development.
...
Well, it is 'true' for driver development (nothing has ever taken NVIDIA as long as the GLSL implementation; even ATI had it first. And what about 2.0? NVIDIA has not yet officially released an OpenGL 2.0 driver, neither in the consumer space nor in the developer space). Can you imagine this situation if DX 10.0 were released today...? Tomorrow NVIDIA would announce that it has a DX10 driver.

And what about FX Composer being D3D-only? Even ATI has RenderMonkey with GLSL support, working pretty well.
And what about NVPerfHUD? It was 'announced' in version 1.0 that it would support OpenGL. Now it is at version 3.0 and there is no sign of OpenGL.

I agree that NVIDIA has the best OpenGL support in its drivers. And it is indeed an advantage for your company. What do you think about every developer in these forums (except Humus, who is currently working for ATI ;)) recommending the GF6800 family? What do you think of every released OpenGL game recommending an NVIDIA card over your competitors?
Every good work has its reward. But I think that this has been your personal work as the NVIDIA OpenGL driver 'boss'.
As said here, I think that not only your consumer marketing people are committed to D3D, but your developer marketing people too. This is my opinion.

Korval
03-07-2005, 07:54 PM
NVIDIA's dogma 2 : "don't compete with Microsoft or else they will destroy you"
eg. priority is to implement the latest DirectX, because 1/ the market asks for this and 2/ it will please MS
eg. never do more than what MS specified; it would be a waste of time & money
I hate to point this out, but the latter is patently untrue. nVidia exposed register combiners on the GeForce 1. Not through D3D, which at the time didn't even consider such a thing to be possible, but through OpenGL. Even after D3D 8, register combiners were more capable than what was exposed through PS 1.0 and 1.1. The GeForce FX, for all its shortcomings, blew past the limits of PS 2.0 and VS 2.0 in terms of the number of available constants and uniforms. Microsoft had to commission a new release of D3D 9 just to expose features that, to this day, only nVidia supports in the GeForce 6 line.

So nVidia is hardly Microsoft's pawn. Indeed, if anything, it's ATi who implements D3D specifications directly into their hardware, not nVidia. After all, with no new version of D3D, did ATi even consider adding new features to the R420 line?

SeskaPeel
03-11-2005, 07:19 AM
@Zak:

"What do you think about every developer, except Humus that is currently working for ATI in these forums recommended the GF6800 family?" Their dev rel team is doing excellent work. Excellent because technically efficient, free, nice people to chat with, and fast answering.
This is no surprise that a lot of the community appreciate this.

What do you think in every OpenGL released game recommending a NVIDIA card over your competitors? There is two points : what opengl game are you talking about ? Doom 3 or q3 and derivatives ... marketing ?
Second point : the way it's meant to be played ... marketing ?

@Korval:
The rationale behind those "new features" is simple. To build a fully compliant DirectX 9 chip, they need a pool of solid features. Once they have it, other additions come almost for free because they were already part of the pool.

Another example: at this point, there's no difficulty in opening up a programmable blending stage. Why has it not been done yet?
Again, D3D is designed for PC architectures, and nVidia is clearly trying to take over the whole chipset market, graphical or not, PC or not. Having such constraints (the D3D specs) is a pain for them; they'd be better off with a proprietary API, which would be even easier to port to the console market. But this conflicts with dogma #2, and Microsoft would consider them enemies. I totally trust the "scientist" who told me they'd never do it in the near future, meaning it will certainly never happen.

Everything that is not open-sourced is Microsoft's pawn... Can you picture a rock-solid, commercial-proof 3D engine being open-sourced nowadays?

Anyway, http://games.slashdot.org/article.pl?sid=05/03/10/214212

SeskaPeel.

SirKnight
03-11-2005, 08:43 AM
And what about FX Composer being D3D-only? Even ATI has RenderMonkey with GLSL support, working pretty well.
FX Composer 3.0 is currently in the works; it is a complete rewrite from C++/MFC to C# and will feature both Direct3D and OpenGL, among many other cool new features.

glitch
03-11-2005, 08:55 AM
Originally posted by SirKnight:
FX Composer 3.0 is currently in the works; it is a complete rewrite from C++/MFC to C# and will feature both Direct3D and OpenGL, among many other cool new features.
I think you mean FX Composer 2.0 ...

SirKnight
03-11-2005, 02:10 PM
Oops, yes I do. Hit the wrong key and didn't realize it. :eek:

-SirKnight

marco_dup1
03-11-2005, 05:24 PM
Yes, Linux support is good but what about the SDK, all the tools etc.? This is really lacking.

Korval
03-11-2005, 05:37 PM
Their dev rel team is doing excellent work: technically efficient, free, nice people to chat with, and fast to answer.
It is no surprise that a lot of the community appreciates this.
I'm sure that's one reason. But what of the others? Like actually caring about the quality of their drivers (ATi releases a beta driver every month, while nVidia takes their time to actually test and fix bugs). Or maybe it's because they tend to expose more features through GL than ATi does.


The rationale behind those "new features" is simple. To build a fully compliant DirectX 9 chip, they need a pool of solid features. Once they have it, other additions come almost for free because they were already part of the pool.
Nonsense. Even the FX was far more than DX9 compliant. It blew past the DX9 requirements, instruction- and uniform-count-wise, while ATi implemented the bare minimum.

There is not one feature in either the R300 or the R420 that is not the bare minimum that DX requires. You can walk the list of features for the card and for DX9, and see that it does exactly and only what DX9 requires. Meanwhile, every nVidia card has offered more through OpenGL than through D3D.

No nVidia or ATi card has gone against DX. This is a fact, and it is expected, since DirectX is the predominant 3D gaming API. However, no nVidia card has ever done just the bare minimum either. They have always provided more hardware than DX requires, from the TNT2 through the 6800. ATi has not.

Your Dogma #2 is followed far more by ATi than nVidia. And I defy you to provide one, one counter-example where an nVidia card provided only the bare minimum DX functionality. Past the TNT2 model.

Oh, let's not forget which card it is that forced two APIs to define that texture-indirection nonsense in their fragment program specifications.


at this point, there's no difficulty in opening up a programmable blending stage. Why has it not been done yet?
Acceptance of this statement involves the answer to this question: by "programmable blending stage" do you mean a third kind of program that runs after the fragment program, or being able to read the framebuffer in the fragment program? If it is the latter, I refuse to accept that there is "no difficulty" when scores of engineers are telling me otherwise.

If you mean the former, what are you suggesting? That Microsoft decides when a programmable region opens up? Last time I checked, my first fragment programs were NV_RC, a good year or two before we first got D3D shaders. Microsoft didn't ask nVidia to make RCs; nVidia did it on their own.

More importantly, God knows I don't see ATi stepping up to provide programmable blend stages either. So, while you have an argument about Microsoft's decisions about DirectX's API controlling what gets built into graphics cards to a degree, it is most definitely not limited to nVidia.

Actually, this supposition is the only thing that I can find that provides any logic to your prior arguments. It seems that perhaps you are suggesting that nVidia, as a dominant force in the graphics industry, should challenge Microsoft more often, and is abrogating its responsibilities to the consumer by not ignoring the Direct3D API? That they should have created a proprietary API that others would now implement, making D3D and OpenGL worthless (or that they should just focus on GL without respect to D3D)? That such a thing would be beneficial to the consumer at large, and that nVidia has chosen not to do so because it would anger Microsoft?

Well, you're right to some degree. nVidia chose not to commit seppuku by ignoring the advances of D3D, just as ATi made the same choice. 3DFx did, and look where they are now. With ATi on their heels, nVidia is in no position to annoy legions of D3D developers and create a third competing API.

More importantly, who cares? If programmable blending is an important feature, they'll implement it sooner or later. Either D3D will expose it or they will through the DX10 extension mechanism. So you don't get programmable blending at the first moment you could have had it. Boo hoo. We'll get it soon enough.

On a personal note, programmable blending isn't that interesting to me. Useful? Certainly. But I'd prefer having a programmable primitive processor that can walk memory and feed attributes to the vertex shader. And I think that's where we'll go before getting programmable blending.


Everything that is not open-sourced is Microsoft's pawn... Can you picture a rock-solid, commercial-proof 3D engine being open-sourced nowadays?
Huh? You mean like Torque?

V-man
03-12-2005, 04:29 AM
Again, D3D is designed for PC architectures, and nVidia is clearly trying to take over the whole chipset market, graphical or not, PC or not.
What's your point?
ATI has created chipsets for all sorts of platforms, just like NV.


Having such constraints (the D3D specs) is a pain for them; they'd be better off with a proprietary API, which would be even easier to port to the console market. But this conflicts with dogma #2, and Microsoft would consider them enemies. I totally trust the "scientist" who told me they'd never do it in the near future, meaning it will certainly never happen.
I see Nvidia as an innovator in the graphics industry since its beginnings. However, someone said that the concept of RCs originates from SGI and that Nvidia licensed it. Even if that is the case, they recognized it, implemented it, and released nice demos for GL.

I don't agree with your dogma #2 at all. Some NV features are not exposed in D3D.

What was this conversation you had with the scientist?

plasmonster
03-12-2005, 10:01 AM
Hi Mark,

I'd like to thank you for all of your generous contributions to the graphics community. And thanks to everyone at NVIDIA for their steadfast support of OpenGL and continuing contributions to this forum.


Fully porting a sophisticated graphics user interface between Mac, Windows, and Linux systems can take several man-years, but OpenGL rendering code just recompiles (possible lesson: render your GUI in OpenGL!). State-of-the-art 3D programmable floating-point shading is more portable than trying to create a scroll bar!
I couldn't agree more with this sentiment. Inspired by Blender and Eric Lengyel's C4 engine, I've begun work on an in-game editor and have not looked back. What a profound relief it is to be done with Windows, insofar as the editor goes, anyway. And to know that not only will my game be portable, but the editor as well? Well, it's really quite a thrill. I too believe this is one of OpenGL's greatest strengths, and it could well play a role in OpenGL's eventual rise and domination in the PC games market :)

zed
03-12-2005, 06:39 PM
How does this affect the future of OpenGL?
addendum:

nice to finally read that someone from sony has confirmed that the ps3 will use a derivative of opengl

"Cell graphics will rely on a variation of the standard OpenGL library already widely used for PC games. Sony and software consortium the Khronos Group are developing Open GL/ES, a dialect of OpenGL optimized for interactive content"

http://news.com.com/PlayStation+3+to+be+...html?tag=cd.top (http://news.com.com/PlayStation+3+to+be+easy+on+developers%2C+Sony+vows/2100-1043_3-5606515.html?tag=cd.top)

also looks like cg has gotten the nod.
the king is dead, long live the king

michagl
03-15-2005, 10:21 AM
Korval: On a personal note, programmable blending isn't that interesting to me. Useful? Certainly. But I'd prefer having a programmable primitive processor that can walk memory and feed attributes to the vertex shader. And I think that's where we'll go before getting programmable blending.
that's funny, i actually posted a wish for this in one thread, then thought i would look like an ass, so i edited it out. anyhow, this is the next major hardware development i would like to see.

as for blending, i would like to see built-in per-pixel blend sorting. any chance of that? i could describe how i would imagine implementing that in hardware, but it would just muck up the thread.

finally, this thread raised some questions about Cg versus GLSL for me. the last time i was writing my own shaders i was using Cg, but i figure the API has changed by now... so i was planning to change to GLSL the next time i write a shader. for what it's worth, GLSL was not available when i was using Cg. i've read the GLSL specs and am comfortable enough with them... and i figure Cg also supports GLSL grammar. but politically and technically, can anyone make any recommendations?

lately i'm leaning towards sticking with Cg.

RigidBody
03-15-2005, 10:37 AM
hey aeluned,

money is only the root of all evil if you do not have enough of it.

here in germany we now have a lucky guy who won 21 million euros in the lottery last weekend. can you imagine what it's like to have more money than you can spend? i can only speak for myself, but i think i'd be in a never-ending state of being totally relaxed... which doesn't seem evil to me.

RigidBody
03-15-2005, 10:49 AM
but, to be serious:

hey hey, my my, opengl will never die!

if you think of games, well, i admit direct %)$#§
is superior. because there is not only direct3d, but also directsound, directwhatever.

but opengl came from the industry, and there it will remain. i, for example, work close to BMW in munich, and in research and development they run linux. they use windows emulators to make their shiny powerpoint stuff, and maybe fill in their time sheets, but nothing more.

RigidBody
03-15-2005, 11:06 AM
and after i've read mark kilgard's post:

first, i like the way you are enthusiastic about your employer :D

but, as i said before: in the industry, windows is only used for making impressive transparencies. real work is done with linux (or unix, which is disappearing because sgi workstations are way too expensive compared with pcs).

linux rules not only because of opengl; it has lots of tools like free editors, compilers etc. if you're an engineer you love command-line tools like <grep>; the pipe alone helps you a lot.

i could not imagine my every-day work being done in windows. and i agree, nvidia is providing fast, stable drivers. at my company (and at BMW, too) we are running linux with intel xeon and nvidia quadro xgl cards. keep up the good work!

CrazyButcher
03-15-2005, 11:51 PM
"real work" is hard to be defined so,
when you consider the games industry I would say most is done on windows/d3d.
though thx to ogl es and the new playstations this might shift a bit.

ogl will continue to have its lead in any "non game 3d apps" I guess.

and about nvidia, they certainly give a bit more love to ogl than others (when it comes to gaming stuff) but it doesnt help when it only runs on their hardware... I hope CgFX will be fixed to run on ATI, and I wished Cg runtime would be extendable so that some people could write "inofficial" profiles for say ATI FS or whatever... it is also weird that nvidia makes fx composer for d3d but no IDE for their own shading language Cg (although its almost same to hlsl), but I hope that ps3 will change this and give Cg a bit of a push.
for game stuff ogl clearly lost to d3d because of the lack of ARB extensions for really important features, but I think ogl ES will fight back cause it will be cleaner and less bloated, and actually be what people hoped 2.0 to be. I am still quite a noob on this stuff but thats my 2 cent

brinck
03-16-2005, 12:23 AM
Originally posted by RigidBody:

but, as i said before: in the industry, windows is only used for making impressive transparencies. real work is done with linux (or unix, which is disappearing because sgi workstations are way too expensive compared with pcs).
Which industry would this be?

/A.B.

RigidBody
03-16-2005, 12:39 AM
first of all: real work is what _I_ do :D


Which industry would this be?
automotive, for example... aerospace... any industry which produces real stuff in big buildings with smoking chimneys.

most CAE/CAD software originally came from unix and is now being ported to linux because of the comparably low hardware costs. somewhere in this forum there was a post from a guy at boeing who said they want to switch to linux.

i looked around the web and found ati and nvidia cards from 30-500 euros, which are good for gamers.

but in the high end there is only nvidia with their quadro fx. the price STARTS at about 300,- for the quadro 580 xgl and goes up to 5400,- for the quadro fx 4000. there may be fewer engineers than gamers in the world, but if you compare the average price of their hardware, it may pay off for nvidia. and of course, they cannot sell a 5000-euro card without offering a driver, which must be available under linux for the reasons i gave before.

Coconut
03-16-2005, 01:27 AM
It may be true for BMW and VW where Windows is banned. In the US, the automakers are mostly windows based.

davepermen
03-16-2005, 01:28 AM
Originally posted by daveperman:

Originally posted by Zak McKrakem:
Who will be next? Mark Kilgard? Jon Leech?
or me? please?
somehow it's depressing to see someone abusing the fact that no one can spell my nick correctly.. grmbl.

well, i continue to support opengl, but i'm no one big, so..

but i don't see dx as 'the dark side'. just.. the other side.

Lurker_pas
03-16-2005, 01:34 PM
but i don't see dx as 'the dark side'. just.. the other side.
Yes, the side you are on in your afterlife if you were a really bad guy:)
And now seriously.
From my (hobbyist) point of view OpenGL is great. Easy to learn, easy to use. I used a bit of DirectX once (long ago) through some Delphi components (was it DelphiX or something?), but it was not enough. I started browsing the source code. It was awful. There was no chance of me learning it. So I chose OpenGL. I heard DX has changed a bit since then, though...
DirectSound? OpenAL is also fine and integrates well. There is also DevIL. Almost everything I need for my humble purposes. I would be happy to have more time :)
I think the future of non-professional OpenGL is in ATI's and nVidia's hands - if they both implement something, it's there. And it's only the extensions (in my opinion) that OpenGL may lack. I am still waiting for ATI's FBO.
Cheers!

V-man
03-16-2005, 02:52 PM
Personally, I would like to see more games released using no part of DX and written in a portable way.
The biggest issue facing GL is PR: convincing companies to use it in favor of D3D.

Most of the guys running these companies think Windows is where the money is, and they don't need to look farther than DX. Too bad Valve switched.

3k0j
03-16-2005, 05:32 PM
Originally posted by V-man:
Too bad Valve switched.
I wouldn't worry about Valve. What else would you expect from the company that
1) was founded by happy former Microsoft employee(s)
2) trusted Microsoft's "great software" so blindly that they got hacked and had their upcoming game's code stolen
3) despite having licensed a fully functional OpenGL engine from Id Software for Half-Life 1, decided to implement a Direct3D path (which looked inferior on my system: all transparent surfaces were rendered opaque)

On the other hand, if you get their leaked code (may the gods forgive you) you'll see they must have retained some strange sentiment for the OGL way :rolleyes:

CrazyButcher
03-17-2005, 01:34 AM
as a hobbyist as well, imo where dx currently surpasses ogl is:

"really nice sdk"
there is no real ogl sdk... or help or whatever; you have to dig up everything yourself: tutorials, specs, and those oh-so-simple-to-use extension specs. dx has a really good sdk imo, and it also comes with a lot of extra tools.
one might say there is glut, but it's not updated anymore and not "official".

"no extension wrangling"
you can use shaders more easily and don't have to check for extensions, compilers... especially on older hardware (no ps1_x on gl)

gl is too much each individual company's testbed, less united. of course that comes from the way it is; democracy is always harder than dictatorship hehe

so while gl is easier in the beginning (gl 1.1, redbook), once you want to do more it gets a lot more troublesome to work with, unless you want to limit yourself to vendor X

there is no initiative to push gl to more user-friendliness or better use, cause there is no official head; everything needs to be decided in the arb, which takes too much time and seems to be in the background.
then again one should probably just accept that gl isn't meant for gaming.

knackered
03-17-2005, 03:29 AM
one should also accept that with flexibility comes complexity.
The good points in gl are impossible without the bad points you've highlighted.
Maybe it's not meant for hobbyists - it's not a "3d-game construction kit", that's for sure.

V-man
03-17-2005, 03:53 AM
Originally posted by 3k0j:

Originally posted by V-man:
Too bad Valve switched.
I wouldn't worry about Valve. What else would you expect from the company that
1) was founded by happy former Microsoft employee(s)
2) trusted Microsoft's "great software" so blindly that they got hacked and had their upcoming game's code stolen
3) despite having licensed a fully functional OpenGL engine from Id Software for Half-Life 1, decided to implement a Direct3D path (which looked inferior on my system: all transparent surfaces were rendered opaque)

On the other hand, if you get their leaked code (may the gods forgive you) you'll see they must have retained some strange sentiment for the OGL way :rolleyes:
Since their game is so popular, they should be convinced to use GL, and the rest of the industry will follow suit. A lot of people are like lemmings.
EA is the worst, because they put out so many games and I think 99% are D3D. They are the biggest game company, I think??? They pull in over 2 billion $ per year. That's enough money to do anything.

I have never seen an interview where they are asked about this stuff, except for Carmack and we all know what he said.

CrazyButcher
03-17-2005, 05:03 AM
true, knackered, the "fast access to latest tech" only works with the extensions; that's definitely a big plus. it just leaves issues when something doesn't end up as an ARB extension or no equivalent exists.
so the extension stuff and such will stay as is, because of the nature of "open"GL

marco_dup1
03-17-2005, 05:31 AM
Originally posted by CrazyButcher:
"real work" is hard to be defined so,
when you consider the games industry I would say most is done on windows/d3d.
though thx to ogl es and the new playstations this might shift a bit.

AFAIK is the market of PC games much smaller as the console one. So I don't think DX is important.

RigidBody
03-17-2005, 09:36 AM
coconut-

i tried to reply before, but it seems my post was deleted.

i didn't say there is no windows at BMW or VW. but it's only used for making fancy slides or other secretarial stuff.

CAE/CAD is done with UNIX or LINUX. and somehow i can't imagine it's different in the US. tools like msc nastran, ls-dyna or pamcrash, which are used for stiffness and crash calculations, were ported to windows some years ago, but only for small companies which can't afford to buy a 20.000,- workstation.

knackered
03-17-2005, 11:03 AM
Originally posted by CrazyButcher:
true, knackered, the "fast access to latest tech" only works with the extensions; that's definitely a big plus. it just leaves issues when something doesn't end up as an ARB extension or no equivalent exists.
so the extension stuff and such will stay as is, because of the nature of "open"GL
There's also the backwards compatibility that opengl's extension mechanism guarantees effortlessly. I can use a drawing method I programmed years ago that uses register combiners & texture shaders in the same application as drawing methods that use GLSL and VBO. This would be impossible with d3d - you have to throw the baby out with the bath water each time microsoft deigns to release a new version of dx. That means you can't use your old dx8 drawing code if you want to exploit newer dx9 features, which is madness if you think about it.
I prefer backwards compatibility to constant redesign.
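A hypothetical sketch of what that coexistence looks like in practice (the draw functions and the hasExtension() helper are assumed names, not real code): pick a drawing path once at startup, based on what the driver advertises, and every old path keeps working.

typedef void (*DrawFunc)(void);

DrawFunc pickDrawPath(void)
{
    if (hasExtension("GL_ARB_shading_language_100"))
        return drawWithGLSL;       /* newest path */
    if (hasExtension("GL_NV_register_combiners"))
        return drawWithCombiners;  /* years-old path, still honored */
    return drawFixedFunction;      /* always available */
}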

CrazyButcher
03-18-2005, 12:10 AM
you're right marco, the mass of games in total comes from consoles. and opengl/es should give gl a boost in games.

and knackered, while backwards compatibility surely is a plus, how much can you really "not reuse" in dx that you can reuse in ogl? something as vendor-specific as nv texture shaders/register combiners doesn't exist in d3d, or does it? so they get around this problem by using more standardized tech, or? however, as I know less d3d than ogl, it's more of an impression than real knowledge...

knackered
03-18-2005, 01:30 AM
oooooooooo, butcher. The innocence of youth. They virtually redesign the whole thing with each new major version. I am not exaggerating.
Microsoft are only interested in backwards compatibility for the end-user, not the developer at all. So dx6 stuff still runs fine even though you've got dx9 installed (the dx6 interfaces are hanging off the back of the dx9 runtime via COM), but dx6 stuff will not compile unless you compile against the dx6 sdk... which means you won't be able to use anything out of any later versions of dx.

V-man
03-18-2005, 06:17 AM
Game companies are probably not interested in backwards compatibility.

1- Technology-wise, I can imagine that lots of developers want OO, the way DX offers it.
2- Quite a number of them want to program a single path and that's it.
3- DX is a collection of APIs. Perfect for newbies learning game making. I don't know if companies like EA care. I don't know if they use the D3DX functions.
4- I think the biggest reason is tradition. Companies started with DX. Why should they switch?

That's some of the stuff I collected. It's just my opinion.

knackered
03-18-2005, 06:27 AM
There's no such thing as a single path in dx. Caps replace extensions, that's the only difference.

tfpsly
03-18-2005, 07:31 AM
Originally posted by CrazyButcher:
you're right marco, the mass of games in total comes from consoles.
I don't know where you live, but in my country it's more like 50%/50% (or 49%/51%, to be precise).

marco_dup1
03-18-2005, 09:01 AM
Originally posted by tfpsly:

Originally posted by CrazyButcher:
you're right marco, the mass of games in total comes from consoles.
I don't know where you live, but in my country it's more like 50%/50% (or 49%/51%, to be precise).
Look at the sales volume worldwide.

michagl
03-18-2005, 10:13 AM
i looked at dx once, i think. it looks like most high-level microsoft APIs to me, which translates personally into 'no way am i touching this'. usually microsoft has lower-level APIs though, which are less cumbersome to work with... at least for the os runtime APIs there is generally more than one way to get something done. so dx probably has a low-level equivalent too.

but it definitely looks like a hornets' nest, from the outside at least. speaking from the perspective of someone who has a personal ire for 'scheduled obsolescence' in code... i would say opengl is the pick for a strong foundation that will last through the ages with minimal upkeep. dx is maybe for something you plan to toss out back in a couple of years or constantly renovate.

dx, i figure, probably bears the mark of microsoft code in general: extremely rushed, overmanned, and developed by people who have no real personal stake in it, ie: who don't actually use and depend on the software.

personally though i just hate the way dx code looks, and have no investment in microsoft.

opengl i'm happy with though on the whole.

i could easily see directx die; opengl, no way... people would rush to raise it from the dead if its corporate sponsors fell out.

JD
03-18-2005, 03:00 PM
I started with opengl, then, not understanding its heavy state-machine reliance, I switched to d3d, and it wasn't until I needed register combiners that I switched back to gl; by then the programming paradigm had clicked and I got rid of lots of d3d code during my project's conversion to gl. Also, the way I kept in touch with gl while coding in d3d was by translating some gl code to d3d, and the more I learned about gl the more I liked it. I grew up with d3d 5, 6, 7, 8, 9 and the constant recoding of fundamental things like framework init code, which got on my nerves. Did you know that d3dx7 had the shortest init code of any version? They also have that duality going between d3d and d3dx, so if you strip away d3dx you get a pretty bare api, much less than gl. Plus, I replicated most of d3dx in my own code for control purposes, as that api has some nasty bugs. I don't want to wait for ms to fix them, if ever. Plus, d3d is over-designed and poorly documented for doing advanced things. I still remember pulling my hair out implementing the right function-call sequence for the d3d font interface and having it keep working across d3d reset calls. You know, when you lose surfaces. Bah, I needed to get this out of my system.

knackered
03-19-2005, 04:18 AM
'kin awful, isn't it?

Zak McKrakem
03-19-2005, 05:30 AM
Most of the best-selling 3D games of the last two years used OpenGL. There are also good games that use D3D, so you can make good games independent of the API. Even titles from recent months, like Chronicles of Riddick, are OpenGL, and in titles like World of Warcraft you can select OpenGL or D3D. For me, the only problem with OpenGL is ATI's lack of good support for high-end features (even when its hardware supports them), but they are improving it (they are now making the Xbox and Nintendo Revolution graphics chips, so I presume that this is the cause of the slowdown in their driver enhancements in recent months).
I have learned the last versions of D3D to help other people, and I really prefer OpenGL over D3D. But this is my personal opinion.
Anyway, I feel like there are more people switching back to OpenGL than to D3D now that we have a decent collection of ARB extensions. I also think that switching from 1.5 to 2.0 has been a good 'marketing' decision to get the attention of some developers.
Also, the rumors that the PS3 and Revolution will use OpenGL ES or an OpenGL clone as their graphics API seem to be making some developers look back at OpenGL and see that it has improved a lot since the last time they 'visited' it (probably around 1.2).
What I have found with 'newcomers' is that they get confused when they see the big list of extensions, and they don't understand why, if some of them are deprecated, obsolete, or included in later versions of the API, they are still exposed. For example, they don't know whether to use CVA, VAR or VBO, and they have to learn about all of them and ask other people to understand that VBO is the currently recommended way and that CVA is even undefined.
Maybe a way to know the real extensions for the current API version could be good. I think a function like glGetString(GL_EXTENSIONS_2), that shows only 2.x extensions, could be good (with GL_EXTENSIONS continuing to expose everything, so as not to lose backwards compatibility). This way we could stop seeing EXT_vertex_array in the extension list and having to explain to new people that it has been included in the API since 1.1 or 1.2 (I don't remember exactly).
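For reference, this is roughly the status quo being discussed, a minimal sketch of checking one name in the single big string (the token-boundary tests avoid matching, say, GL_EXT_vertex_array inside a longer extension name). The proposed GL_EXTENSIONS_2 would just be a second, filtered token passed to the same call.

#include <string.h>
#include <GL/gl.h>

/* returns nonzero if `name` appears as a whole token in GL_EXTENSIONS */
int hasExtension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);
    while (exts) {
        const char *hit = strstr(exts, name);
        if (!hit)
            return 0;
        if ((hit == exts || hit[-1] == ' ') &&
            (hit[len] == ' ' || hit[len] == '\0'))
            return 1;
        exts = hit + len;  /* substring match only; keep scanning */
    }
    return 0;
}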

knackered
03-19-2005, 06:16 AM
Don't agree. To address the problem you've highlighted (new developers being confused by multiple ways of doing things because old extensions are still supported in the drivers), you don't introduce a new way of querying extensions; you perhaps produce, and keep up to date, a document on the extension registry webpage which details the commonly used extensions. It's a documentation issue rather than a problem with the API.

CrazyButcher
03-19-2005, 09:53 AM
indeed, who hasn't visited delphigl's excellent extension usage site to look up what is in widespread use? As I previously mentioned, some sort of centralism when it comes to information (such as the dx9 sdk) would be nice to have for GL. of course there is the spec and the extension specs, but they're not that comfortable imo.

V-man
03-19-2005, 01:30 PM
Originally posted by knackered:
There's no such thing as a single path in dx. Caps replace extensions, that's the only difference.
Yes, but I said companies want to code a single path :)

Some people have the impression that you have to resort to vendor-specific extensions in GL, while in DX you don't.

Look at HL2. Does it optimize for Geforce FX?
Does it use normalization cubemap lookup instead of math?
Does it use half precision temps? (if it's possible in D3D)

Doom3 goes through that trouble.

It's not a matter of API. It's a matter of people's opinions of the APIs, and of wanting to do the minimum amount of work.

JD
03-19-2005, 10:56 PM
I make my own gl headers, so that's the only major work for me. Otherwise I rely on the gl core and sgi docs and very few extensions. Also, why aren't there cap bits in d3d from ihvs? That boggles my mind. I don't have a computer with ati installed, so I can't use that d3d caps utility to check them. Each d3d ihv driver should show supported caps. I remember pulling my hair out trying to figure out the ddraw clipper and how to set it up. The migration from d3d 8 to 9 had tons of errors in the docs, and I was constantly sending bug reports to the dx team. They did write back though, so that's good. It got so bad one time that I thought about taking the html help, decomposing it into html files, editing them by hand, and then zipping them back into a chm file. Gl docs are awesome by comparison. Plus, ihv extension docs are nifty as well, as they give you much-needed details. That VBO issue I had in gl, i.e. driving it efficiently, is the same under d3d. I noticed they had to write up proper VBO usage in the d3d docs as well. Don't get me started on d3d texture stages and the really screwed-up documentation, as well as the actual usage of that mechanism. Gl's arb_multitexture is a dream to work with. Heck, reg. combiners are easier to work with than d3d texture stages. Crossbar is an absolute savior in gl.

jonasmr
03-21-2005, 01:08 AM
Originally posted by Mark Kilgard:

Direct3D out-software engineered the ARB when it came to engineering a programmable shading language. Direct3D builds its shading language implementation into a redistributable library that you can package with your application. The library targets a (tokenized) assembly interface. So a new language feature (or compiler bug fix) can be utilized without necessitating end-user driver upgrades (and reboots) by just using the latest compiler library.
Yes! This is precisely why I had to bite the bullet and turn to d3d. On the other hand, you will never make me use Cg, because other hardware vendors will never support it. How long will it take for the ARB to realise that this was a bad choice?

Oh yeah, and another thing making me stay in the d3d world: The Debug Runtime(tm). Once you've used it, you'll never go back to the 5-enum world (with INVALID_OPERATION being used, say, 99% of the time?).

LarsMiddendorf
03-21-2005, 03:08 AM
Direct3D out-software engineered the ARB when it came to engineering a programmable shading language. Direct3D builds its shading language implementation into a redistributable library that you can package with your application.
ARB_fp/ARB_vp is obviously better than the D3D shaders, because it abstracts the instruction and register count, but an extended vendor-independent version for vs/ps30 is needed. ARB_fp/ARB_vp is a better compile target than GLSL, because GLSL is too high-level and introduces another level of indirection. Compilation does not necessarily mean another language. Modern game worlds need many different shaders, not just one or two like Doom. I personally think that shaders won't be written by hand, but instead dynamically generated from some higher-level (possibly, but not necessarily, graphical) material definition, like in UE3. You need different shaders for each light-material interaction and for each pass, and it is impossible to create just another shader or a long .fx file every time texture x should scroll. I don't see how current high-level languages can solve this problem in an elegant way. So the asm-style interface is still useful.

execom_rt
03-21-2005, 03:41 AM
Are ARB_vp/fp better than D3D shaders?

If you are using the 'extended' ARB_vp/fp, then yes, they're better.

ARB_vp/fp

pro:
- Invariant position support (smaller vertex programs, so more efficient).
- State matrices (saves some environment constants; useful for code doing skinning, for example, where you need a lot of environment variables to store the bone matrices).
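A sketch of those two pros in ARB_vertex_program form (illustrative program text, not from the post):

static const char *vpText =
    "!!ARBvp1.0\n"
    "# position invariance: clip position comes from the fixed-function\n"
    "# transform, so no MVP math here and multipass results match exactly\n"
    "OPTION ARB_position_invariant;\n"
    "# tracked GL state: no environment constants burned on the matrix\n"
    "PARAM mvinv[4] = { state.matrix.modelview.invtrans };\n"
    "TEMP n;\n"
    "DP3 n.x, mvinv[0], vertex.normal;\n"
    "DP3 n.y, mvinv[1], vertex.normal;\n"
    "DP3 n.z, mvinv[2], vertex.normal;\n"
    "MOV result.texcoord[0], n;   # eye-space normal for later stages\n"
    "MOV result.color, vertex.color;\n"
    "END\n";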

cons:
- limited macro support (m4x4 or sin, for example, you have to emulate; writing sin in a vertex shader on OpenGL is quite 'annoying', you need the nVidia extensions to get it, which are proprietary).

- no support for "older" video cards like the Radeon 9200. Yes, it's an old card, but Apple is still shipping it in their Mac Mini, so you miss a lot of features (it could do bump + specular mapping, or some nice effects, but using proprietary ATI extensions is out of the question...).

- no vertex program support on some 'indie' video cards like S3 or Matrox - no ARB vertex program, even though it is available in DirectX (which exposes vertex shader version 2.0!). Simply put, those companies cannot afford OpenGL development.
For DirectX, Microsoft already gives out specs for writing a driver, so it's easier to start with. For OpenGL, each vendor is on their own, apparently.

- no generic software TNL for video cards that don't support it in hardware (Apple does this).

For example, Apple exposes ARB_vertex_program in software on every platform, even on the ATI 128 video card. Yes, it's software emulated, but it is quite convenient and performant enough.

- no GL vertex shaders on older ATI video cards. Even the GeForce3 can do GLSL in the vertex program.
You have limited support, but you can write GLSL that just fits the specs of the GeForce3.
It saves a lot of work.

The same goes for GLSL fragment programming on the ATI 9200. For example, I write HLSL (DX9) code that fits the PS2.0 model, then I write the same code to fit PS1.4, just changing the code a bit to work around the limitations (like doing a norm. cube map lookup for normalizations...).

So basically D3D shaders are 'more' scalable (they support more hardware, in the end) than OpenGL's.

If you plan to release a game, D3D gives you more video card support, so more customers.

I guess that's the 'real' reason. However, this argument might be 'off topic' within the next year.

And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) "for security reasons". That will be a killer.

(see http://www.theinquirer.net/?article=21077 )

mikef
03-21-2005, 04:14 AM
Originally posted by execom_rt:
And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) "for security reasons". That will be a killer.
Wow, that would indeed be a killer... so new extensions would be basically impossible, except perhaps those that just expose DX10 functionality? Yikes.

Has the ARB ever certified a wrapper as a conformant OpenGL implementation? Or would they be certifying the wrapper+DX10 driver taken as a whole?

LarsMiddendorf
03-21-2005, 04:28 AM
Originally posted by mikef:

Originally posted by execom_rt:
And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) "for security reasons". That will be a killer.
Wow, that would indeed be a killer... so new extensions would be basically impossible, except perhaps those that just expose DX10 functionality? Yikes.

Has the ARB ever certified a wrapper as a conformant OpenGL implementation? Or would they be certifying the wrapper+DX10 driver taken as a whole?
I've read that only the MS default driver will be a wrapper. A good thing is that it will support OpenGL 1.5.

sqrt[-1]
03-21-2005, 04:47 AM
Originally posted by execom_rt:
Are ARB_vp/fp better than D3D shaders?
cons:
- no support for "older" video cards like the Radeon 9200. Yes, it's an old card, but Apple is still shipping it in their Mac Mini, so you miss a lot of features (it could do bump + specular mapping, or some nice effects, but using proprietary ATI extensions is out of the question...).
Yes, considering that ARB_fp is "ps 2.0", what did you expect? Also note that not all things were rosy on the D3D side in the early ps1.1-1.4 era, as there are many valid shaders that ATI/Nvidia cards cannot perform. It is only when you look at the OpenGL docs that you find out why they break (e.g. Nvidia: with 3 constant lookups in one instruction; ATI: some of the more complicated tex3x3vspec-type instructions). So you end up having to write multiple shaders to work around driver issues anyway...



- no vertex program support on some 'indie' video cards like S3 or Matrox - no ARB vertex program, even though it is available in DirectX (which exposes vertex shader version 2.0!). Simply put, those companies cannot afford OpenGL development.
For DirectX, Microsoft already gives out specs for writing a driver, so it's easier to start with. For OpenGL, each vendor is on their own, apparently.
I dunno about Matrox, but other "small" companies like S3/SiS and Intel do advertise ARB_vertex/fragment_program.



- no generic software TNL for video cards that don't support it in hardware (Apple does this).
Nvidia does too, so complain to ATI (or perhaps MS?)



If you plan to release a game, D3D gives you more video card support, so more customers.
I would seriously doubt that any PC gamer does not have a decent OpenGL implementation running (mostly thanks to the many FPSs using OpenGL). If you are targeting your game at low-end "Mom-and-Dad" players, you can almost forget high-end features anyway, as even IF they have a half-decent video card, the drivers will be so out of date that most features will break.



And finally, in Longhorn, OpenGL will be on top of DirectX 10. It will be a wrapper (GL->DX) "for security reasons". That will be a killer.
(see http://www.theinquirer.net/?article=21077 )
Don't believe everything you read. MS is just updating the default OGL driver that ships with Windows to be 1.2 and to use D3D for hardware acceleration. Once a user installs the manufacturer's drivers, OGL will be as it is now.

santyhamer
03-21-2005, 05:05 AM
Ok, here are my opinions/feelings:

· I wanna sell my soul and spirit to M$oft, because OpenGL is a non-profitable?? thing and M$oft is the devil, has da $$$, is sodoma-gomorra and can make me as rich as the ppl you mentioned above!!! Da problem: Murphy's Law... when you want to sell your soul to Mr.666, the soul bag is FULL!!! Now FEEL the 666(9?) powah! (http://www.vgln.com/community/albums/album7/pages/012_10_0001_jpg.htm) :D

· M$oft promised me that if I use XNA/DX10/WGF they will let me play Halo 97 on their Xbox 2/360!!! yay!! Sony promised me that if I use their PS3 with OGL they will send me a free copy of the new Final Fantasy 943... BUT see this (http://news.com.com/2100-1043_3-5589309.html) ouch!!!

· M$oft told me assembler/C/C++ is no good, and that it's better to use a Visual Basic-style .NET "managed" thingy... but atm .NET doesn't handle the C function pointers you need for OpenGL extensions well, so I need a very ugly 3rd-party wrapper for OpenGL... (ok, Tao (http://www.taoframework.com) is pretty decent atm, but it's a 3rd-party, not an official thingy; it could disappear one day and then what??)

Well, at this point I am using Managed DirectX 9 / Windows Graphics Foundation with C#... So now I am a very happy windozed person. I really prefer this

Disk.Format(Drives.C, Wait.Yes);

to this:

pFnFormatHardDrive = (PFNFORMATHARDDRIVE)wglGetProcAddress("ARBFormatDrive");
pFnFormatHardDrive(GL_RGBA); /* ouch, C is NOT strongly typed, so I can pass this constant instead of the correct one */
while (0 == glIsFormatCompleted()) {}
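(For comparison, the real extension-loading pattern that joke parodies looks roughly like this -- a minimal sketch, Win32 and C-style GL, error handling mostly omitted. The PFNGLACTIVETEXTUREARBPROC typedef and GL_TEXTURE1_ARB token come from the standard glext.h header, and a GL context must be current when the lookup runs:)

#include <windows.h>
#include <GL/gl.h>
#include "glext.h"  /* PFNGL...PROC typedefs and ARB tokens */

static PFNGLACTIVETEXTUREARBPROC pglActiveTextureARB = NULL;

void initMultitexture(void)
{
    /* wglGetProcAddress returns NULL if the driver doesn't expose
       the function, so the pointer must be checked before use. */
    pglActiveTextureARB = (PFNGLACTIVETEXTUREARBPROC)
        wglGetProcAddress("glActiveTextureARB");
}

void useSecondTextureUnit(void)
{
    if (pglActiveTextureARB)
        pglActiveTextureARB(GL_TEXTURE1_ARB); /* select texture unit 1 */
}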
· Longhorn (really the WinFX API) plus unmanaged C++ will be a BAD IDEA... so bye bye native GLUT, OpenGL C function pointers for extensions, etc...

See this Windows Longhorn FAQ (http://msdn.microsoft.com/longhorn/support/lhdevfaq/default.aspx). Go to General Topics -> The Basics -> "Can I use C++ to develop for Longhorn?": "If you mean Standard C++ that targets a specific chip by producing assembly code, you're going to be doing a lot of .NET interop to make it work. I recommend the other thing."

Also, see this link about Windows Graphics Foundation (http://www.extremetech.com/article2/0,1558,1629317,00.asp). No more vertex/fragment shaders!!! UNIFY them! No more device caps! IHVs forced to accept min specs! Use the GPU as a calculator! No more fixed pipeline and old things! A real HW-accelerated 3D desktop! Alpha-faded 3D windows!

Free your mind, I know you are scared.... People fear change and are not prepared, and blah blah

· I think the future is languages like Java and .NET / Linux's Mono. Assembler/C/C++ will be dead.
Oooh, see... coincidentally, Epic's Tim Sweeney is thinking about The New Programming Languages (http://archive.gamespy.com/legacy/articles/devweek_b.shtm)

OpenGL should be prepared for the revolution, because it is not atm... DX is, with MDX9/WGF. See this interview with Miguel de Icaza about Gnome 4.0 (http://www.theregister.co.uk/2002/02/01/gnome_to_be_based). He says "Gnome 4.0 should be based on .NET".

· I am VERY scared about the future ...

- PlayStation 3 uses OpenGL 2 and NVIDIA "DreamForce" with C/C++ over IBM Cell processors. See PS3 is easy to program (http://www.neoseeker.com/news/story/4385/) and IBM's Cell Processor in detail (http://www.blachford.info/computer/Cells/Cell0.html). Notice the Cell processor could accelerate Java or .NET in hardware, or even drive your car!!! Also, see how this Skynet machine can kill humanity... err... I mean, how the IBM BlueGene supercomputer (http://domino.research.ibm.com/comm/pr.nsf/pages/rsc.bluegene_2004.html) could pass petaFLOP capacity using Cell in the future!!! Also see the upcoming IBM CPUs after the Cell (http://www.xbitlabs.com/articles/editorial/display/tech-process_10.html), which are REALLY impressive (if you know chemistry, that form of carbon comes after the fullerene, which is almost a Nobel-prize thing...)
I am sure MarkJ can tell us a bit about the PS3; we have to press him until he violates the NDA and tells us all the Zion codes... err... PS3 info and NVIDIA G70 "DreamForce" GPU details...

- Xbox 2 uses XNA and .NET, ATI R500, over PowerPC processors. I am not completely sure what the hell XNA (http://www.microsoft.com/xna) is. Perhaps it's Visual Studio .NET + a good asset manager like Alienbrain, à la M$oft? Could I program the Xbox with C# some day, or will it be C/C++ only? Does Xbox 2 use a Longhorn CE PowerPC version?

- Nintendo Revolution will probably use Java, with an ATI Hollywood GPU over a PowerPC. See this Nintendo Revolution FAQ (http://cube.ign.com/articles/522/522559p1.html) for more details.

- My PC uses a Longhorn beta with VS2005 .NET over an AMD64. I program with C# 2.0 and the Managed DirectX/WGF betas.

- My mobile phone uses Java 2 Micro Edition with MIDP2/JSR184 over... a 1KHz processor???? Oh, and see, Carmack is playing with mobile phones too: http://www.armadilloaerospace.com/n.x/johnc/Recent%20Updates

Is the future a chaos where portable tools like the wonderful multiplatform 3D engine RenderWare Graphics 3.7 (http://www.renderware.com) are not possible???

We should have made the Fahrenheit thingy work years ago... Now I see the darkness of 900000 different programming languages/APIs coming through and making things even more difficult for programmers who want to port their games to multiple platforms...

· OpenGL is not better/worse than DX... it's just different... OpenGL is based on the old C-style paradigm with portability in mind, while DX9 follows an object-oriented C++ paradigm with PC/Xbox in mind, and WGF/Managed DX9 points toward the future of "managed" .NET languages and operating systems (see the sketch below).
I see evolution in M$oft... but I see only small changes in OpenGL. OpenGL HAD the initiative when Win95/WinG/Talisman appeared... and now it is OpenGL that follows and copies DX... mmmmmmm... think about this: while we were programming shaders in assembly, DX had HLSL with shader model 3, PRT (precomputed radiance transfer), fx techniques, the D3DX library using SSE, HDR, geometry instancing, a good managed .NET wrapper, a debug runtime...
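As a small, hedged sketch of that paradigm contrast: the same conceptual operation -- selecting a texture for the first texture unit -- in each API (variable names are illustrative only):

/* OpenGL: C-style free functions driving a global state machine */
glBindTexture(GL_TEXTURE_2D, texId);

// Direct3D 9: a method on a COM device interface (C++)
pDevice->SetTexture(0, pTexture);  // IDirect3DDevice9::SetTexture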

· We saw important people leave OpenGL, starting with M$oft. Now we see other people go, and others back Xbox 2 instead of PS3... "This is a war, and we (OGL) are..." in the middle...

· So wake up, Neo: welcome to the real world, and follow the M$oft Palladium rabbit, because OpenGL is a VIRUS for Windows!

· Sorry, this won't end tonight; it is INEVITABLE, Mr. Gates :D

· UBB codes for LISTs suck, and no, I AM NOT obsessed with The Matrix.... :p :p :p :p :p

Lurker_pas
03-21-2005, 07:12 AM
Wow. That's one of the most interesting posts I have read (also regarding the style :) ). Really scary links. Scary and hopeful: scary for my current habits, hopeful for future capabilities. However, I think OpenGL will adapt. OpenGL, or OpenGL ES, or OpenGL whatever. It will. I hope... Changes are indeed inevitable.

santyhamer
03-31-2005, 04:22 AM
btw... if the PS3 uses OpenGL 2.0 and Cg... does that mean the SDK will be open for homebrew? That would be nice :D

dorbie
03-31-2005, 11:31 AM
No, it means no such thing. The use of an API has no bearing on whether the SDK & development licenses will be available, or on whether you will be able to run anything not signed by Sony. The console business is one of software publishing & licensing: they control the titles, quality, and revenues by ruling platform access and publishing rights with an iron fist, and this won't change.

It's confirmed that the PS3 will make the OpenGL|ES API available to developers on the platform:

http://news.com.com/PlayStation+3+to+be+easy+on+developers%2C+Sony+vows/2100-1043_3-5606515.html

ES eliminates a bunch of cruft from the API: for example, the unpredictable legacy vertex-by-vertex immediate-mode dispatch, and vestigial stuff like accumulation buffers that nobody uses for real work. It's simpler to implement and test, and it encourages developers to do the right thing (although some of them still manage to screw up).
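To make the "cruft" concrete, a quick sketch (desktop GL 1.x syntax, illustrative coordinates) of the vertex-by-vertex immediate mode that ES drops, next to the array submission path it keeps:

/* The legacy immediate mode ES removes: one call per vertex. */
glBegin(GL_TRIANGLES);
glVertex3f(-1.0f, -1.0f, 0.0f);
glVertex3f( 1.0f, -1.0f, 0.0f);
glVertex3f( 0.0f,  1.0f, 0.0f);
glEnd();

/* The array path ES keeps: one call for the whole batch. */
static const GLfloat verts[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     0.0f,  1.0f, 0.0f,
};
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, verts);
glDrawArrays(GL_TRIANGLES, 0, 3);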

Edit: I thought they'd go with an existing version, extended, but Khronos' headline states OpenGL|ES 2.0, even though the article they link to doesn't mention a version. Version 2.0 will have much more interesting capabilities, and I'm sure they'd still add extensions. It doesn't matter a great deal, though: once they adopt the API they have a clean interface, and they can extend it till the cows come home. I don't see formal OpenGL versions and compliance as an issue for a console. If they're in the right ballpark it should be good enough; developers will much more readily adopt proprietary extensions on a console platform and avoid unsupported areas.

Wow, if this is the only option for 3D programming there may be a lot of unhappy VU microcode hackers out there :-). Variations on ARB vp and fp will have to do.