disable twinview and desktop spanning
06-11-2004, 03:41 PM
Does anyone know how to programmatically set the display settings to use only a single monitor? I have an OpenGL app that doesn't display correctly when the user spans their desktop across more than one monitor, so I need a way to disable desktop spanning and ensure that only one primary monitor is used to display the app. Any ideas?
thanks in advance!
06-12-2004, 11:20 AM
Use the GetMonitorInfo API to retrieve all the important information about the installed monitors.
06-14-2004, 04:34 AM
Install the new NVIDIA drivers and do the correct setup in the driver pages (hardware acceleration). OpenGL works fine on both screens.
06-14-2004, 09:14 AM
Yeah... I've checked out Microsoft's GetMonitorInfo API, the GetSystemMetrics stuff, and EnumDisplaySettings for monitors, but I did not see anything that allows me to change or set the monitor configuration. I can call ChangeDisplaySettings for a particular device, but it does not let me disable a particular device. Say, for example, I wanted to enable only one device as the primary display and disable all the other display devices detected on the machine. I know there has to be a way... I'm probably just overlooking it somewhere....
06-14-2004, 11:58 AM
AFAIK there is no clean way of turning, e.g., the secondary monitor on and off.
But I really don't understand your problem:
just retrieve all the info about the installed monitors and set your window coordinates and viewport coordinates based on that data.
That should get rid of your problem, because
- you know which monitor is the primary
- you know the area this monitor displays
I mean, if your app has problems when it overlaps onto another monitor, why don't you just forbid that? By the way, I agree with yooyo: never encountered such problems anyway.
Only with some Matrox cards did I lose hardware acceleration in those cases. Ah, and of course there is the viewport size limit. For example, if your viewport is stretched over, let's say, three monitors, each with a horizontal resolution of 2048 pixels (6144 pixels total), and your card has a viewport limit of 4096, the remaining 2048 pixels will be undefined.
06-16-2004, 01:53 PM
I'm using two computers to run this app. It's a stereo projection, one computer per eye. If I attempt to manipulate the code at this point, a number of things could and will happen. It creates code complexity, since it makes the clients different: one will be running fullscreen and one in a half-screen borderless window. It would probably also make them run at different speeds, and could cause other unforeseen stereo issues with refresh rates or alignment or whatever.
Obviously there has to be a much simpler way of going about this than that. After all, video card makers do it with the single toggle of a radio button in the advanced display settings. I appreciate the ideas for working around this problem, but let's not make things more difficult than they have to be.
If anyone else has any ideas for programmatically switching between single monitor display and multi-monitor display, I'd really appreciate them.
06-16-2004, 07:21 PM
As you said "...with the single toggle of a radio button in the advanced display settings.".
So it is a driver / card specific setting.
Anyway, I remember seeing a code fragment some time ago that promised to disable a monitor with something like:
// set up l_dev_mode, leaving width and height at zero to force disabling
DEVMODE l_dev_mode = {0};
l_dev_mode.dmSize = sizeof(l_dev_mode);
ChangeDisplaySettingsEx("\\\\.\\DISPLAYx", &l_dev_mode, NULL, CDS_UPDATEREGISTRY, NULL);
Never checked it out, though. So assuming it works this way, just loop through all monitors except the primary one. Of course the first parameter "\\\\.\\DISPLAYx" has to be modified accordingly.