
View Full Version : Dual monitor with OpenGL ?



DelNeto
07-22-2002, 04:14 PM
Hi,

Do you know of a portable way (maybe a library) to drive dual or multiple monitors with OpenGL (something the ARB should think about)? I'm developing an application (a server) that must map events to several monitors (four at the moment) at the same time. It's a security system and must run full screen, not windowed.

Thanks

DelNeto

V-man
07-22-2002, 04:34 PM
You should say what platforms you want to port to. There are cards that have multiple outputs to monitors. And then there is the case of multiple video cards, each with one or more outputs.

I don't know if the latter is resolved on Windows yet. I don't have any PCI cards either, so I can never test.

V-man

DelNeto
07-22-2002, 06:26 PM
OK, thanks for the reply.

My test machines are:

First one.
AGP SLOT - Nvidia GeForce 4 MX460.
This card has dual outputs, but I also need to test the non-dual-card case.

Second one.

PCI SLOT - Nvidia TNT 2 MODEL 64.

Main target OSs - Windows and Linux.

Korval
07-22-2002, 08:44 PM
In order to have OpenGL rendered across multiple monitors supplied by multiple cards, you'd have to fall back to software rendering in Windows. Unfortunately, the OpenGL ICD model does not allow a program to choose which ICD to bind a particular render context to. Because each OpenGL implementation (GeForce4 and TNT2) is different, you can't actually choose which one you use. The primary card is the one that is used in this case; the other card is not and cannot be used by your program for OpenGL purposes under Windows.

Asgard
07-22-2002, 10:15 PM
Originally posted by Korval:
Unfortunately, the OpenGL ICD model does not allow a program to choose which ICD to bind a particular render context to.

Have you tested this? I thought it should at least be possible to render to a window with its own context on one card and to a second window with different context on the second card.
Since, for example, DescribePixelFormat accepts an HDC as a parameter, it should be possible to get the pixel formats supported by each adapter, create a window on each adapter, set the pixel format, and create an appropriate context.
Or is it only possible to have one ICD installed in the system and that's why it doesn't work?

Regards.

knackered
07-22-2002, 10:36 PM
Originally posted by Asgard:
Or is it only possible to have one ICD installed in the system and that's why it doesn't work?

That's correct.
I assume it's because when your app loads opengl32.dll at runtime, that dll queries the primary adapter for opengl support in its attach-to-process section in DllMain. I would imagine it would be simple for Microsoft to sort this out, but I believe *only* Microsoft could sort this out - and they ain't gonna.

Asgard
07-22-2002, 11:31 PM
Hm, that's not good. I wish MS would improve WGL a bit...not only regarding multiple adapter/head support.

Anyhow, I'm very interested in using multiple adapters for an application I'm working on. How can you use multiple adapters under Linux? Say I have two different graphics cards from two different vendors. Can that work under X? I guess there would at least be a problem with having two GLX extension modules... but I'm not a Linux expert, so maybe it can work without problems?
And if it does work, how do you tell the X server to create a window on a specific adapter?

Cheers.

opla
07-23-2002, 01:39 AM
I had some problems using OpenGL on multiple monitors as well (with different graphics cards):
- Win2K (maybe NT4 and XP too): OpenGL will work only if the window is on the primary monitor ("use this device as the primary monitor" in display properties).
- Win98: multiple monitors don't work at all; if 2 monitors are in use, no ICD is available. You have to disable one monitor to get an ICD.
- Linux: I never tried to use OpenGL on it, but I know it's possible to export the display to another computer. Maybe you can export the display to the same computer, but on another monitor ... The nVidia Final Fantasy RT demo was using a cluster of Linux computers (with tiled rendering, I think).


ChrisBond
07-23-2002, 03:58 AM
Hmmm... interesting, because I've been doing debugging the last few weeks on our new portable with NVIDIA's nView (GeForce4 Go), which supports "extend my desktop to this monitor". I'll have Visual Studio on the second monitor and the OpenGL application on the primary - I've never tried the other way around (because one of our concerns is the LCD being viewable under harsh lighting conditions)... I'm on XP; as soon as I get the portable back I'll let you know if I can swap!

I'd be surprised if MS ever updated the WGL libraries... wouldn't hold my breath. Maybe try using some of the libraries that let you send OpenGL commands to another machine via TCP/IP?
-Chris

opla
07-23-2002, 04:27 AM
ChrisBond, that might work on the laptop because it's the same video card; the desktop is just expanded.
It also works with GeForce4 cards with 2 outputs: in the display properties, you just see ONE large desktop.

bpeers
07-23-2002, 11:55 PM
Some interesting stuff..
Isn't it "normal" that you can only have one ICD ? In the scenario where you have two different cards working together to form one big desktop, what would/should happen if you drag a windowed OGL app to the other monitor ? I don't see how you can 'port' the RC with everything it got to the new ICD ? :\ How does D3D handle this ?..

opla
07-24-2002, 12:28 AM
Originally posted by bpeers:
what would/should happen if you drag a windowed OGL app to the other monitor ?

The window isn't refreshed anymore. Nothing is drawn inside it.

Nakoruru
07-24-2002, 05:19 AM
Has anyone ever tried to get multimon working by linking to the opengl drivers at runtime?

This could be a solution, as long as you were able to make it easy for the customer to find and select which driver to use on the non-primary monitors (the primary monitor's ICD location is stored in a registry key, I think).

I think you would need to make sure that SetPixelFormat, SwapBuffers, and friends are loaded out of the driver and not from GDI. There are details on how to do this in the FAQ on this site.

EDIT: I just realized that to implement this idea, you would need a library similar to one I wrote that can switch between multiple sets of OpenGL functions quickly. I use it to switch between debug wrappers, which check all calls for errors, and unwrapped versions, but it could be extended to swap between two different ICDs easily enough.

Each window would already have its own RC, so you would only have to call a function to switch the function pointers out.



V-man
07-24-2002, 07:45 AM
Originally posted by bpeers:
Some interesting stuff..
Isn't it "normal" that you can only have one ICD ? In the scenario where you have two different cards working together to form one big desktop, what would/should happen if you drag a windowed OGL app to the other monitor ? I don't see how you can 'port' the RC with everything it got to the new ICD ? :\ How does D3D handle this ?..

The main idea of multiple monitors is to have a larger view of your scene. So normally you're supposed to create one fullscreen window per monitor/card.

I'm pretty sure the ICD could communicate its information to the second card and have a second OGL window created on the other card. To me this doesn't sound like a worthwhile idea, due to the "main idea" I mentioned above.
For GDI graphics, some resources will be wasted, but it's not complicated like it is for OGL or D3D. Plus, it looks neat.

V-man

DelNeto
08-06-2002, 07:11 AM
Hi, and thanks to everyone. For those of you using Windows 2000, just try the OpenGL 3D Text screensaver - it does what I want. How?

Thanks

DelNeto

Nakoruru
08-06-2002, 07:53 AM
I could not get 3D Text to work (on this computer I have a TNT and an S3), but I got 3D Flower Box to work. Apparently it creates a window the size of the entire desktop and uses Microsoft's software implementation to draw. You will not be able to get hardware acceleration to work this way, however.

I think that 3D Pipes creates two separate windows, but I would have to look at the source to see (the source for Pipes is available in MSDN; 3D Flower Box should be available as well).

vshader
08-09-2002, 08:35 AM
if anyone is interested, apparently this is implemented for Apple. i don't know from experience, but i was reading about it in the mail archive on the Linux openGL ABI page - sorry, no link; it's on the SGI site.

anyway, under AGL you can request a multi-monitor context - the AGL implementation then intercepts a lot of the GL calls and generally does jiggery-pokery so you can drag a GL window between monitors.

i think it even sorts out which extensions are implemented by all cards on the system, and only returns those from glGetString(GL_EXTENSIONS). this does mean there is an extra layer between your app and the GL driver, though. apparently it is possible to get pointers directly to the driver functions for higher performance.

however, they decided not to allow this possibility for Linux - if you are interested, i think the discussion was provoked by the question of whether extension function pointers returned by glXGetProcAddress() should be context-dependent like they are from wglGetProcAddress(). there seems to be no reason in theory why such a multi-monitor scheme couldn't be implemented on Windows because extension pointers are context-dependent and RCs take an HDC parameter... but i don't expect MS to bother.

V-man
08-09-2002, 09:17 AM
MS won't change anything about GL, that's for sure. Why are there still remarks about MS and GL? GL2 should offer an alternative to the ICD mechanism.

Whether it will... I don't know, since I haven't dug deep into the docs on that.

V-man

vshader
08-10-2002, 06:52 AM
V-man - i seem to remember the GL2 docs say something about moving all the context-related and buffer-swapping code into core GL - the way they specified it, it looked like the funcs will still be called wgl, glX, agl etc., just that they're part of the GL2 spec, not part of an additional library.

ha- just found this:



5 Standardization of Non-Core Features
Certain operations that are fundamental to OpenGL were left to be defined as part of the
window-specific “glue” libraries for different operating environments. This has made it harder to
specify these operations and has hampered application portability.
We propose that certain key operations be defined as part of the OpenGL 2.0 specification, even
if they have window-specific bindings. These operations include:
• CreateContext
• GetCurrentContext
• GetCurrentDC/Drawable
• MakeCurrent
• StreamSwapBuffers
• AsyncSwapBuffers
Certain requirements should also be defined in the OpenGL specification for creating a drawing
surface (window) and choosing the format of attached buffers. These changes would give
architectural control of key features back to the OpenGL ARB.


and this:



Synchronization
A number of new functions to support synchronization and increased parallelism are defined in
the white paper Asynchronous OpenGL. These functions are required as part of Pure OpenGL
2.0.
SyncAlloc, SyncFree, SyncWait, SyncWaitMultiple,
wglConvertSyncToEvent/wglConvertEventToSync (and something
equivalent for GLX), FlushStream, Fence, QueryFence,
StreamSelect, StreamCopyContext, QueryFenceBackground,
wgl/glxGetVerticalBlankPulse, FenceWait,
wgl/glxGetVerticalBlankRate, wgl/glxGetVerticalBlankCount,
StreamSwapBuffers, wgl/glxAsyncSwapBuffers AllocateStreamFence,
InsertStreamFence, StreamFenceWait, DeleteStreamFence,
TriggerFence


also WGL_ARB_pixel_format is listed as "add" under the extensions lists - so it becomes core openGL too.
so requesting a multi-monitor context could be standardized - and maybe triple buffering too?

do you reckon it is possible to implement this kind of GL2 system using something similar to the current ICD approach? or will MS have to change it if they want to support GL2?




V-man
08-10-2002, 10:39 AM
Glad to hear the decision has been taken. I don't have that sheet on my drive right now. Gonna have to download them all sometime.

I just hope that everything will be regarded as being core, and not extensions. I'm not sure what this talk is about extensions with names like GL2_xxxxxxx.

I'm sure that it is possible to add on capabilities to whatever this ICD thing does. Of course, everyone will have to update their drivers.

Nice page about ICD & MCD http://www.leadtek.com/icdmcd.htm

I haven't seen anything about the details. Microsoft's version seems to have been removed.

Any clue as to what will happen to all the wgl functions? There's a whole set of them called I3D. It's all about dealing with buffers.


V-man

vshader
08-10-2002, 11:31 AM
i was just reading ARB meeting notes and the GL2 exts are a good thing - they will come out BEFORE GL2 is finalized, allowing us to use bits of GL2 from within openGL 1.2/1.3/1.4/1.5.

the new 3Dlabs boards will possibly implement the GL2_program_object, GL2_vertex_shader and GL2_fragment_shader exts as they already have an implementation of these up and running it seems.

i would hope the GL2 exts disappear from pure GL2. but the situation during transition might be something like how most of anything beyond openGL 1.1 is exposed thru ARB extensions on win32, rather than a proper 1.2+ implementation.

my worry is that GL2 will be implemented on windows in practice just as a bunch of GL2 extensions - ie layered on top of the existing lame wgl stuff. the new "core" GL2 funcs may just be calling the old lame wgl stuff for anything that involves interoperation with the window system.

or how's this - your 3d card's openGL 2.0 drivers implement the "core" GL2 window/context creation routines with DirectX calls to allocate window-system-friendly surfaces, implement triple buffering, etc.!

btw, didn't MS once claim a win32 implementation of openGL1.2 was coming in win2k service pack 2?

V-man
08-11-2002, 04:00 AM
I thought that drivers already used ddraw to create frame buffers. This seems to occur on all windows platforms. I suppose there is some advantage for using ddraw, maybe it has good memory management routines.

Yes, it was in 2001, when SP2 was to be released. Actually, they said the same for SP1 and the original release. SP3 is out now.

Check under fluff articles http://www.opengl.org/developers/faqs/technical/mslinks.htm

V-man

vshader
08-11-2002, 04:38 AM
v-man -



I thought that drivers already used ddraw to create frame buffers. This seems to occur on all windows platforms. I suppose there is some advantage for using ddraw, maybe it has good memory management routines.


i thought about it a bit after i wrote it and i'm sure you're right. i remember reading something like that in the NV VAR spec. i wonder if this has anything to do with the warning not to mix ddraw and ogl in the same app? now that ddraw is integrated into d3d, it's just kinda funny that ogl uses it, that's all.

the real issue i'm worried about, and it's a bit OT for this thread now, is whether OGL2 will work fine on win32 with no cooperation from MS. (i don't know anything about writing gl drivers, so i don't know if it will be a problem.) or will CAD app support force MS to go along... but is there anything in OGL2 that is useful for CAD apps (over current 1.1 + exts)?

btw, that link has been removed from MS site.

Nakoruru
08-12-2002, 10:44 AM
DelNeto,

Are you sure that 3D Text, Flowerbox, Pipes, and Flying Objects are implemented using OpenGL in Windows 2000?

I just did some checking, and it seems that they have been ported to Direct3D for Windows XP. I was wondering how they all seemed to have multimonitor support built in. Maybe the Windows 2000 versions are implemented using Direct3D as well. If they have a button in setup called 'Display Settings' which lets you set up each monitor, then they are Direct3D.

V-man
08-12-2002, 12:48 PM
lol, I forgot about the port to d3d. Always check with Depends or an equivalent tool.

I don't think that CAD apps will benefit much from 1.2 and above.

The most important thing for CAD is drawing lines (preferably AA lines), and the buffer region feature can be very helpful. Perhaps OGL 2 can take over ddraw's functions? Video memory management is very important. For a modeling tool, some help from the driver is essential. The window may be broken into sections (rectangles) and the buffer copied in an optimized way. That is one of the tricks they use.

Other than that, it's all about being pretty with textures and doing tricks with vertex programs - something that a few modelling & rendering packages need and have implemented. Those companies will jump at OGL 2.

V-man

Robbo
08-12-2002, 02:20 PM
We use both Direct Draw surfaces and GL RCs in the same application and a couple of weeks ago tested it across multiple monitors. We used a GF4MX on AGP and a PCI ATI Rage XL for the second card.

Some odd things were noted (of course). Firstly, if you created the application on the GeForce, everything was fine until you dragged the window onto the second monitor (the ATI card), at which point it stopped rendering (same with the DD surfaces). Vice versa was true also.

I see no reason why it should opaquely allow this kind of behaviour anyway - it just seems dumb to me and far too complicated to get right in every instance. The solution we will recommend to our customers is to follow simple `rules' when creating windows in the application (and possibly to help them along with some UI code) - it's a lot easier than rewriting drivers to get it to work.

I guess the new Matrox card can do this stuff OK anyway because it has 2 RAMDACs - it can support up to 4 monitors (I think). Even so, dual monitors is best done on a card with 2 heads.

DelNeto
08-14-2002, 07:07 AM
Nakoruru,


Thanks, and yes, it is really OpenGL. I just removed all the OpenGL / GLUT DLLs and they stopped working.


Thanks,

DelNeto