DirectX 10 wrapper over OpenGL feasibility?



marcus256
05-05-2007, 01:34 AM
Since I don't know much about the structure of DX10, someone else with more knowledge might be able to enlighten me about the complexity of making a wrapper for DX10, on top of OpenGL (etc).

The goal would be, of course, to be able to run DX10 titles without Vista (e.g. on XP, or perhaps even under Wine).

I know DX8 is available under Wine - and it works fairly well in my experience. Perhaps that code could be used as a base?

Roderic (Ingenu)
05-05-2007, 02:40 AM
I've written an abstraction layer for my engine renderer to work with D3D10, basically adopting the D3D10 feature set/structures. It wasn't difficult, although at the time some extensions were missing, so I couldn't cover all D3D10 features with OpenGL.
(Now the required extensions are available.)

How to load the proper DLL and such might be more difficult. (Especially since D3D9 tends to have several releases of the same version, with updated DLLs under another name; that would mean updating the project as often as D3D10 gets updated.)
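For readers wondering what such a layer looks like: a minimal sketch might define a D3D10-flavoured interface and implement it on a GL backend. All names here are hypothetical illustrations, not Roderic's actual code, and the GL calls are left as comments so the sketch stays self-contained:

```cpp
#include <vector>

// Hypothetical D3D10-flavoured state description (trimmed to one field).
struct BlendDesc {
    bool blendEnable = false;
};

// Hypothetical renderer interface shaped like D3D10: immutable state objects
// created up front, then bound by handle.
class IRenderDevice {
public:
    virtual ~IRenderDevice() = default;
    virtual unsigned CreateBlendState(const BlendDesc& desc) = 0;
    virtual void OMSetBlendState(unsigned handle) = 0;
};

// OpenGL backend: a real implementation would issue GL calls in
// OMSetBlendState; they are left as comments here.
class GLRenderDevice : public IRenderDevice {
public:
    unsigned CreateBlendState(const BlendDesc& desc) override {
        states_.push_back(desc);                      // cache the immutable state
        return static_cast<unsigned>(states_.size()); // 1-based handle
    }
    void OMSetBlendState(unsigned handle) override {
        const BlendDesc& d = states_[handle - 1];
        // Real code: d.blendEnable ? glEnable(GL_BLEND) : glDisable(GL_BLEND);
        (void)d;
        current_ = handle;
    }
    unsigned current() const { return current_; }

private:
    std::vector<BlendDesc> states_;
    unsigned current_ = 0;
};
```

The D3D10 style of pre-created immutable state objects actually maps to GL fairly naturally, since each object can be expanded into a fixed set of GL state calls at bind time.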

Demirug
05-05-2007, 10:14 AM
Wine has already integrated big parts of Direct3D 9 and recently started on Direct3D 10 as a Google Summer of Code project [0].

The guys from Falling Leaf Systems are working on such a project [1], called the “Alky Project”. Don’t bother downloading the pre-alpha preview. The only thing that currently works correctly is clearing the screen. The reason for this may be that they only use the fixed-function pipeline so far.

I have played with the idea myself [2]. One of the bigger problems is the binary encoding of the shader and effect files. It is not documented at all, but the core API will only ever see these binaries, as it is still recommended to compile shaders and effects on the developer side and distribute only the binary versions.

[0] http://www.winehq.org/?issue=329
[1] http://alkyproject.blogspot.com/
[2] http://gpu-fun.spaces.live.com/blog/cns!FB5EEACE075E1350!240.entry
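For a sense of where reverse-engineering an undocumented binary format starts: dump a compiled blob as hex and look for recognizable structure (magic values, chunk tags, length fields). A small helper along those lines; nothing here assumes anything about the real encoding, the input is just whatever bytes the D3D compiler emits:

```cpp
#include <cstdint>
#include <cstdio>
#include <string>
#include <vector>

// Hex + ASCII dump, 16 bytes per row: the usual first step when staring at
// an undocumented blob such as a compiled D3D10 shader.
std::string hexDump(const std::vector<uint8_t>& blob) {
    std::string out;
    char buf[8];
    for (size_t i = 0; i < blob.size(); i += 16) {
        std::snprintf(buf, sizeof buf, "%04zx  ", i);  // row offset
        out += buf;
        for (size_t j = i; j < i + 16 && j < blob.size(); ++j) {
            std::snprintf(buf, sizeof buf, "%02x ", blob[j]);
            out += buf;
        }
        out += ' ';
        for (size_t j = i; j < i + 16 && j < blob.size(); ++j)
            out += (blob[j] >= 0x20 && blob[j] < 0x7f)
                       ? static_cast<char>(blob[j]) : '.';
        out += '\n';
    }
    return out;
}
```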

Korval
05-05-2007, 11:54 AM
You should wait until Longs Peak and Mt Evans come out before attempting this.

Demirug
05-05-2007, 12:18 PM
I agree that Longs Peak and Mount Evans should make the resource handling easier. But there are enough parts of such a wrapper that don’t depend on the resource handling at all.
You could already implement:
- the effect framework
- the shader reflection
- state block management
- the binary SM 4 to GLSL converter (I assume that LP and ME will still use GLSL)
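To illustrate the last item: once some earlier stage has decoded the (undocumented) SM4 token stream into instruction records, the converter itself is a straightforward walk that emits GLSL statements. The opcode set and record layout below are purely made up for illustration:

```cpp
#include <sstream>
#include <string>
#include <vector>

// Illustrative decoded instruction records. The real SM4 token encoding is
// undocumented, so assume an earlier (reverse-engineered) stage produced these.
enum class Op { Mov, Add, Mul };

struct Inst {
    Op op;
    std::string dst, a, b;
};

// Emit one GLSL statement per instruction -- the easy half of the converter.
std::string toGLSL(const std::vector<Inst>& prog) {
    std::ostringstream glsl;
    for (const Inst& i : prog) {
        switch (i.op) {
        case Op::Mov: glsl << i.dst << " = " << i.a << ";\n"; break;
        case Op::Add: glsl << i.dst << " = " << i.a << " + " << i.b << ";\n"; break;
        case Op::Mul: glsl << i.dst << " = " << i.a << " * " << i.b << ";\n"; break;
        }
    }
    return glsl.str();
}
```

The hard part, as Demirug and ZbuffeR note, is the decoding stage in front of this, not the GLSL emission itself.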

ZbuffeR
05-05-2007, 12:19 PM
LP or ME would not change anything about the "bigger problem" of reverse-engineering these binary shaders.

EDIT: Demirug was faster than me :-)

marcus256
05-06-2007, 03:53 AM
From what I understand, part of the Alky project is to reverse-engineer the shader code - has it already been done?

Nice to see that the D3D10 project has been approved as a Google Summer of Code project! Perhaps some nice things will happen. :)

Demirug
05-06-2007, 05:54 AM
Originally posted by marcus256:
From what I understand, part of the Alky project is to reverse-engineer the shader code - has it already been done?
The pre-alpha version doesn’t show any sign of this. But if they want to be successful they will have to master this step.

Korval
05-06-2007, 11:24 AM
- the effect framework
Isn't this part of D3DX, which is an external library and therefore doesn't need to be emulated? It just makes D3D calls, so it would make D3D wrapper calls.


- state block management
That is something LP would help with, as it already allows for this kind of thing.

Demirug
05-06-2007, 12:28 PM
Originally posted by Korval:

- the effect framework
Isn't this part of D3DX, which is an external library and therefore doesn't need to be emulated? It just makes D3D calls, so it would make D3D wrapper calls.
With D3D10 the effect framework has moved into the core. Therefore a (complete) wrapper needs to contain it. A number of apps will probably run without it, as its use is optional.


Originally posted by Korval:

- state block management
That is something LP would help with, as it already allows for this kind of thing.
This would not help, as the state blocks work on top of the D3D device. After you have applied a state block, the getters of the device object need to return the D3D10 objects that were stored in that state block. Additionally, you would need some functions to manipulate the state block masks.
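Demirug's description can be sketched in a few lines: a state block captures a masked subset of device state, and applying it writes that subset back so the device's getters return the stored objects. Hypothetical names and a trimmed-down device, not the real D3D10 signatures:

```cpp
#include <cstdint>

// Trimmed-down model of a D3D10-style device: two state slots stand in for
// the full pipeline state (names are hypothetical).
struct Device {
    uint32_t blendState = 0;
    uint32_t depthState = 0;
};

// A state block snapshots a masked subset of device state; after Apply()
// the device getters must return exactly the objects that were stored.
struct StateBlock {
    bool captureBlend = false;   // the state block "mask"
    bool captureDepth = false;
    uint32_t blendState = 0;
    uint32_t depthState = 0;

    void Capture(const Device& d) {
        if (captureBlend) blendState = d.blendState;
        if (captureDepth) depthState = d.depthState;
    }
    void Apply(Device& d) const {
        if (captureBlend) d.blendState = blendState;
        if (captureDepth) d.depthState = depthState;
    }
};
```

Because this bookkeeping lives entirely above the rendering API, it is the same amount of work whether the backend is current OpenGL or LP/ME.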

marcus256
05-07-2007, 02:54 PM
Demirug, nice work with your wrapper!

I actually did a Glide wrapper on top of OpenGL some years ago (Remember Glide?..). It was never really finished (who would need it anyway...), but I could actually play a few games.

I guess wrapping DX10 is a completely different and more complex issue, but some things are most likely similar - such as not being able to make a 1:1 mapping, at least not with full hardware support.

Brolingstanz
05-07-2007, 03:50 PM
Remember Glide?
Wow, I thought everyone was dead :-D

Hehe... developing with Glide was bittersweet. On the one hand it was wonderful to have acceleration at all, yet on the other it was often sensitive to how polygons were clipped and projected, and sessions tended to end rather unceremoniously in the obligatory 3-fingered salute ;-)

I think the only game I ever played with direct Glide support was Unreal, which also supported GL and DX, but if memory serves Glide was the most stable of the 3 (until my card went up in smoke).

marcus256
05-09-2007, 12:21 PM
Leghorn,

Yes, I remember overclocking my Voodoo3 until I got rather wicked snowy triangles in Unreal. Really cool game at the time, btw. The best reason to buy a Voodoo card.

...unspeakably good thing that Carmack came along and pushed OpenGL over Glide (if I remember correctly Direct3D wasn't even invented at that time?).

knackered
05-09-2007, 12:46 PM
at the time of the voodoo3??? course it was invented - version 3 or something. D3D was even supported on my old Matrox Mystique, the card with the worst poly rasterizing acceleration ever made.
glide was horrible, I used the minigl driver for my voodoo2. getprocaddress,getprocaddress,getprocaddress.....
All thanks to carmack.

dorbie
05-10-2007, 12:52 AM
Yup it was invented, and if Carmack hadn't stood up to Alex St. John's unmitigated bull**** we'd all be stuck with D3D and it wouldn't have tried very hard to improve much over version 4 (which if memory serves was the big revision to clean things up).

D3D as you may know started out as RenderMorphics' software engine, which was a rather decent retained mode software engine. Microsoft bought the company and unfortunately simply exposed the internals, the rest is history.

dorbie
05-10-2007, 12:58 AM
The current D3D crisis is that DirectX 10 is only deployable on Vista, so it's basically dead in the water, thanks to an artificial strategy of Microsoft's to promote Vista by holding XP back: nobody is going to use D3D10 as the primary target until people migrate from XP (which just isn't happening). Worse, the Vista desktop is actually based on DirectX 9.0, so even on Vista you're not guaranteed 10.0; you'll pretty much find 9.0-only "Vista Capable" platforms in a LOT of systems out there.

So now we get to wait and see where the dust settles on the latest episode of Microsoft's control freakery. It would suck if you'd just invested in DirectX 10 technology and it can't even be exposed on most gaming platforms in the market because Microsoft is screwing everyone's pooch.

At least OpenGL will not be held back by Microsoft's marketing. The hardware vendors have not yet been robbed of wglGetProcAddress.

ScottManDeath
05-14-2007, 05:43 AM
But so far, OpenGL was held back by the ISV/IHV, so no advantage here...

Zengar
05-14-2007, 06:51 AM
I am very curious whether AMD/ATI will support the new SM4 extensions on their new cards... I haven't seen any notice on their website.

yooyo
05-14-2007, 06:59 AM
AFAIK, ATI has a new OpenGL driver for Vista (built from scratch), and they plan to ship this driver for XP too, but later. When? I don't know... Maybe this year. Right now, ATI has two OpenGL drivers, one for XP and one for Vista. Both drivers have bugs (or features?), but known workarounds on XP will not work on Vista.

knackered
05-14-2007, 07:58 AM
ATI, the Vauxhall of graphics cards.

ZbuffeR
05-14-2007, 09:16 AM
Maybe they will even open-source their drivers:
http://linux.slashdot.org/linux/07/05/13/1659245.shtml

knackered
05-14-2007, 10:56 AM
great, we can fix their bugs for them.

V-man
05-15-2007, 03:52 AM
Originally posted by yooyo:
AFAIK, ATI has a new OpenGL driver for Vista (built from scratch), and they plan to ship this driver for XP too, but later. When? I don't know... Maybe this year. Right now, ATI has two OpenGL drivers, one for XP and one for Vista. Both drivers have bugs (or features?), but known workarounds on XP will not work on Vista.
I saw a couple of sources say they have recoded their GL driver for Vista from the ground up, so I think it's normal to find different bugs and performance.

knackered
05-15-2007, 06:55 AM
catalyst the unsinkable.

zed
05-15-2007, 02:27 PM
**** its iceberg season as well :(

reminds me of a joke i recently heard at work, concerning billgates and his kid

bg - son since your tenth birthday is coming up ill buy you anything you wish, just name it
son - well dad ive always wanted since i was little a micky mouse outfit
so bill bought him ATI!

-----------------------

personally i think ati are going the intel graphics method, high volume low cost parts == good profit. graphics cards are getting a bit stupid anyways, look at the latest benchmarks, theyre forced to benchmark at least 1600x1200 with high AA/AF to not become CPU limited.
what with them using 200+ watts + the size of a small sedan, its time to take the foot off the accelerator pedal (not that they will)

knackered
05-15-2007, 03:20 PM
let me get this straight, you're saying graphics cards are now too fast?
I'm certainly not cpu limited - the last app I did was definitely vertex shader limited. We were lucky enough to test it on the latest nv quadro, and the leap in performance in that vs limited app was staggering - so up went the vertex count (to get close to vertex-per-pixel).

V-man
05-15-2007, 03:35 PM
Originally posted by zed:
look at the latest benchmarks, theyre forced to benchmark at least 1600x1200 with high AA/AF to not become CPU limited.
Somehow that doesn't make any sense.
With low resolution (640x480), you will get CPU limited and the GPU will be idle.

1600x1200 with high AA/AF makes it GPU limited.
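The arithmetic behind this is simple: the per-frame fragment workload scales with resolution times the AA sample count, while CPU work per frame stays roughly constant. A quick back-of-the-envelope helper:

```cpp
#include <cstdint>

// Samples shaded per frame = width * height * AA samples: this is the load
// that grows with resolution, while CPU work per frame stays roughly flat.
uint64_t samplesPerFrame(uint64_t width, uint64_t height, uint64_t aaSamples) {
    return width * height * aaSamples;
}
```

640x480 without AA is about 0.3 million samples per frame; 1600x1200 with 4x AA is 25 times that, so the GPU side of the frame dominates long before the CPU notices a difference.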

zed
05-16-2007, 12:43 AM
yes thats also what im saying
take for instance anandtech
theyre benchmarking at 1280x1024 1600x1200 1920x1200 + 2560x1600!
the smallest resolution is larger than 720p!
they have to choose such a large res or else you wont see a difference between the cards

btw check out battlefield2 2560x1600 + 4xAA gf8800ultra = 109fps (over 100fps or even 60fps is enuf for most ppl)

Brolingstanz
05-17-2007, 05:01 AM
Marcus, you had snowy triangles, I had a permanent christmas tree in the upper left corner of the screen. Between us, we had a pretty sad nativity scene.

V-man
05-17-2007, 08:25 AM
Originally posted by zed:
yes thats also what im saying
take for instance anandtech
theyre benchmarking at 1280x1024 1600x1200 1920x1200 + 2560x1600!
the smallest resolution is larger than 720p!
they have to choose such a large res or else you wont see a difference between the cards

btw check out battlefield2 2560x1600 + 4xAA gf8800ultra = 109fps (over 100fps or even 60fps is enuf for most ppl)
Ah, so it just keeps the GPU busy but the CPU stays the same. Nice stuff.