Mount Evans - Vista only?

Clearly this is up to the vendors - but I’m wondering if there will be some technical limitation that will prevent Mount Evans from running on XP. The thinking goes…

Mount Evans uses geometry shaders -> geometry shaders are a new part of DX10 -> DX10 only runs on Vista -> Complete the circle…

I confess to total cluelessness on how to actually implement drivers. I just like to complain about them when I run into problems.

I’m wondering if there will be some technical limitation that will prevent Mount Evans from running on XP.
No, there will not.

The only limitation that prevents geometry shaders from working on XP under Direct3D is Microsoft’s unwillingness to create an API that would let driver developers expose them. Which isn’t a technical limitation at all; it’s a decision by Microsoft.

GL 2.1 already has geometry shaders exposed through extensions, which work just fine on XP, so clearly the concept is possible.
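For instance, an application can simply probe for the extension at runtime; a minimal sketch in C against a GL 2.1 context (the helper name here is my own, nothing official):

```c
#include <string.h>
#include <GL/gl.h>

/* Returns non-zero if the driver advertises geometry shader support.
   In GL 2.1 the extension list is a single space-separated string. */
int has_geometry_shaders(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, "GL_EXT_geometry_shader4") != NULL;
}
```

The OS never enters into it; XP, Vista and Linux all take the same path.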

Mount Evans uses geometry shaders -> geometry shaders are a new part of DX10 -> DX10 only runs on Vista -> Complete the circle…

The mistake is the assumption that geometry shaders are only a part of DX10.

DX10/Mount Evans requires geometry shaders and exposes an API to use them (obviously).
OpenGL does not require geometry shaders but exposes an API to use them if available.

Therefore, there is no reason why Mount Evans shouldn’t be available on XP (Linux, MacOS, etc.), as it does not have a dependency on DX10.

You can even use geometry shaders on Linux! And WinXP drivers with geometry shader support have been available since the release of the 8800 cards…
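For the curious, here is roughly what driving the extension looks like from the API side; a sketch, not a complete setup, assuming GLEW (or equivalent) has loaded the entry points and that program and gs_source already exist:

```c
#include <GL/glew.h>

/* Create and attach a geometry shader via GL_EXT_geometry_shader4.
   'gs_source' is a #version 120 shader with
   '#extension GL_EXT_geometry_shader4 : require' at the top. */
GLuint gs = glCreateShader(GL_GEOMETRY_SHADER_EXT);
glShaderSource(gs, 1, &gs_source, NULL);
glCompileShader(gs);
glAttachShader(program, gs);

/* This extension configures primitive types and output limits through
   program parameters rather than shader syntax. */
glProgramParameteriEXT(program, GL_GEOMETRY_INPUT_TYPE_EXT, GL_TRIANGLES);
glProgramParameteriEXT(program, GL_GEOMETRY_OUTPUT_TYPE_EXT, GL_TRIANGLE_STRIP);
glProgramParameteriEXT(program, GL_GEOMETRY_VERTICES_OUT_EXT, 3);

glLinkProgram(program);
```

Nothing here touches a Direct3D or OS-specific code path.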

Originally posted by Foxbat:
Mount Evans uses geometry shaders -> geometry shaders are a new part of DX10 -> DX10 only runs on Vista -> Complete the circle…

You’ve been registered on these forums since 2002 and you still don’t know what OpenGL is?

Yes, Mount Evans is another form of GL. You seem to be assuming it is DX10.

Even worse is that you think geometry shaders can’t work on any other OS. It’s a technique, an algorithm, math; it can certainly be made to work on anything.

You’ve been registered on these forums since 2002 and you still don’t know what OpenGL is?
This is actually completely understandable. There’s the possibility that someone new to OpenGL is simply unfamiliar with the available extensions, but more likely it comes down to marketing.

Do you remember the day that OpenGL introduced the geometry shader extension? You don’t, because the extension simply came with the GeForce 8 driver.

Contrast that with DX10 marketing: The GeForce 8 series is the first DX10-compatible card! Wow! It makes it sound like only DX10 makes these features available.

How could OpenGL extensions be marketed? They just don’t sound sexy: The GeForce 8 series supports the OpenGL GL_EXT_geometry_shader4 extension! Huh? While the average user may not understand what comprises DX10, they definitely don’t know why they should care about this extension.

OpenGL just doesn’t have the marketing force to compete with the DirectX branding.

The only thing that prevents DX10 from running on WinXP is the VRAM virtualization required by the new driver model.

OpenGL ME should work fine on Windows XP… and, in fact, will probably push Microsoft to release a DX10b for XP :stuck_out_tongue:

Originally posted by santyhamer:
…VRAM virtualization…
Apparently that isn’t a requirement anymore either.

Regards
elFarto

Originally posted by santyhamer:
OpenGL ME should work ok on windows XP… and, in fact, probably will make Microsoft to release DX10b for XP :stuck_out_tongue:
The only reason Microsoft would consider such a move would be if OpenGL ME (or any version, really) were truly a threat to its market dominance. When OpenGL 3.0 or ME comes out, if there’s not a big marketing effort, it will simply coast along like current OpenGL versions (with respect to market share).

Developers need to feel a real reason to develop in OpenGL vs. DX10 for future projects. Regardless of any subjective or objective superiority of OpenGL, there still isn’t the drive to make developers want to switch over.

Developers need to feel a real reason to develop in OpenGL vs. DX10 for future projects.
You mean, like not having to develop multiple rendering backends to cover Vista and XP? This is a fairly compelling reason.

Now granted, that’s not enough, because OpenGL implementations have a well-earned reputation for instability. That more than anything is something IHVs are going to have to be concerned about. And not mere talk either; I mean that 6 months after 3.0’s spec hits I want to see near-rock-solid drivers.

Agreed. But if we actually get working XP, Vista, Linux, Mac and PS3 drivers, that would be a real incentive.

Since MS will never implement it for the 360, developers will be forced to develop for both anyway. I don’t worry about D3D being more dominant. All I care about is that OpenGL doesn’t die. For a long time it looked like it might.

And, to be honest, OpenGL 2.x has become a pain in the ass to work with.

Jan.

Now granted, that’s not enough, because OpenGL implementations have a well-earned reputation for instability. That more than anything is something IHVs are going to have to be concerned about. And not mere talk either; I mean that 6 months after 3.0’s spec hits I want to see near-rock-solid drivers.
Honestly, I’ve had fewer issues in OpenGL as of late. Granted, I don’t have a config lab backing my OpenGL project, so it isn’t a truly fair comparison. My recent issues included:

D3D:

  • BSOD on NVidia hardware when using driver resource management.
  • Alpha test doesn’t work properly with R32F or R16F render targets on X1K cards with the latest drivers.
  • Poor performance on the GeForce FX series because Direct3D doesn’t expose 12-bit fixed-point functionality in shaders. I consider this a bug with Direct3D, even though it was really a D3D9 design consideration.

OpenGL:

  • Early Windows Vista drivers didn’t perform the alpha test when rendering to a depth-only render target on the NVidia GeForce 8 series. Adding a color render target and enabling color writes works around this problem (see the sketch below).
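
A rough sketch of that workaround, assuming an EXT_framebuffer_object setup where fbo, w and h come from the existing depth-only pass (the names are just illustrative):

```c
/* Attach a dummy color renderbuffer so the alpha test actually runs. */
GLuint colorRB;
glGenRenderbuffersEXT(1, &colorRB);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, colorRB);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, w, h);

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                             GL_RENDERBUFFER_EXT, colorRB);

/* Re-enable color writes; the color output itself is thrown away. */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
```

The extra fill-rate cost is wasted, but it sidesteps the driver bug.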

I’ve hit no ATI bugs at this time, but I don’t run on ATI cards as often.

Now, like I said, I don’t yet have a publisher’s config lab testing my OpenGL project. I’m certain there would be more issues if I did. Also, I tend to use graphics hardware in a “very ordinary” way (I intentionally try to avoid things that have a lot of emulation behind them, such as fixed function or accumulation buffers).

All of that said, WRT D3D10 and Windows XP, I’m actually planning on switching our D3D9 renderer over to OpenGL ME/LP for the next game. I personally feel that writing two renderers is an unacceptable waste of time. Not only that, I’d like to use features on shader model 2.0/3.0 cards that D3D9 simply doesn’t expose (such as sampling from depth textures without the mandatory depth test).
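For example, OpenGL lets you switch off the shadow comparison on a depth texture and read the raw depth values in a shader, something the D3D9 API never exposed. A minimal sketch (depthTex is a hypothetical GL_DEPTH_COMPONENT texture):

```c
glBindTexture(GL_TEXTURE_2D, depthTex);
/* GL_NONE disables the depth comparison, so a GLSL sampler2D returns
   the stored depth value in [0,1] instead of a compare result. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_NONE);
```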

Also, I’m willing to bet that the next version of OpenGL will be fairly stable, since there will be a lot less code between programmers and the hardware. It’s a risk, sure, but the payoff is potentially large if we can support a lot of D3D10 features in Windows XP.

Kevin B

Please don’t call it OpenGL ME. It makes it sound crap. Call it 3.1 if you can’t be bothered typing Mount Evans.

Originally posted by Korval:
You mean, like not having to develop multiple rendering backends to cover Vista and XP? This is a fairly compelling reason.
If OpenGL (the Khronos group?) can get this “objective technological superiority” featured prominently on an IHV’s product web page, I will start to believe that more people will be inspired to develop in OpenGL. Don’t underestimate marketing and market share. I know it makes sense to develop in OpenGL. You know it. But until John-Q-Programmer knows it, uptake won’t significantly increase.

Originally posted by knackered:
Please don’t call it OpenGL ME. It makes it sound crap. Call it 3.1 if you can’t be bothered typing Mount Evans.
True, I’m too lazy to always type Mount Evans. But it’s still unknown whether it will be branded OpenGL 3.1. They could always go with OpenGL Kool-aid or some other arbitrary moniker.

Originally posted by elFarto:
Apparently that isn’t a requirement anymore either.

Regards
elFarto
This source is well known for getting things wrong when it comes to graphics stuff. Supporting the WDDM 1.0 virtual VRAM model is still required for every WDDM driver, including drivers for older hardware. It looks like the requirement to support WDDM 2.x virtual video memory for D3D10.1 hardware has been dropped.

Anyway, virtual memory was never the showstopper for D3D10 on Windows XP. The necessary behaviors can be implemented directly in a driver; it’s just more work for the driver teams.

Originally posted by Jan:
Agreed. But if we actually get working XP, Vista, Linux, Mac and PS3 drivers, that would be a real incentive.
On the PS3, nobody who isn’t totally crazy would waste CPU cycles on an OpenGL wrapper.

Originally posted by knackered:
Please don’t call it OpenGL ME. It makes it sound crap. Call it 3.1 if you can’t be bothered typing Mount Evans.
I’ve been using 3.x, mostly because of the Longs Peak ‘reloaded’ that was mentioned at SIGGRAPH, which could give us a minor version bump right away.

Originally posted by Jan:
Agreed. But if we actually get working XP, Vista, Linux, Mac and PS3 drivers, that would be a real incentive.

Since MS will never implement it for the 360, developers will be forced to develop for both
I want Fahrenheit to come back mwahahaha!

Fahrenheit was a scene graph that was stillborn. Fahrenheit Low Level was never more than a twinkle in someone’s eye. They both get vaporware awards. (Only the earlier Cosmo shipped, to support the CAD middleware, which DID NOT sit on Fahrenheit Scene Graph; and if you saw the low-level draw code in Cosmo you’d weep. I snuck a look; it didn’t even have unrolled loops for vertex dispatch.) That Wikipedia article has some major problems. Performer ported to Fahrenheit?! Maybe on a PowerPoint slide somewhere.

I see NVIDIA has a scene graph now (haven’t checked it out)… If you long for those days perhaps you should take a look, or there’s Open Scene Graph. You may be shocked by the actual availability of a working API, so brace yourself.

I took a look at NVIDIA’s scene graph some time ago. It seemed to be designed more for a renderer than anything else (no real game-engine scene graph). It was interesting to see someone else’s scene graph implementation and I got a few ideas from it, but as the foundation for a real 3D application it didn’t seem suitable to me.

I don’t think we will ever see a standardized graphics API that operates at such a high level. It would restrict developers too much. I think it will always stay as it is now: simply license the engine that is most suitable for your application, or write your own using a low-level graphics API (OpenGL, D3D).

Jan.

I’ve been using 3.x, mostly because of the Longs Peak ‘reloaded’ that was mentioned at SIGGRAPH, which could give us a minor version bump right away.
That’d be OpenGL 3.reloaded, or 3.R, or for the mathematically inclined, R^3 (GL^3!).

Christ, I just can’t stop marketing.