View Full Version : Swapping green channels on graphics card output

09-02-2009, 06:12 AM
Hello everyone,

for a university project I need to display the green channel of output 1 of the graphics card on output 2 and vice versa (on an NVidia GeForce 8600 GT). We have already tried to use a component cable and simply swap the green cables, but this introduced synchronisation problems we were not able to solve, so now we are trying to find a software solution. Is there a way to grab the screen's contents, swap the green channels and display the altered content? Since I'm quite new to OpenGL programming, any help will be very much appreciated.

Thanks in advance and best regards,

Bruce Wheaton
09-02-2009, 09:32 AM
Your easiest answer is to use a dual head graphics card with both outputs set to the same res. Then use two adaptors to get VGA (not component) and switch the greens - you should be fine.

In OpenGL it would have to be a program you control - I don't think you can do it to the whole desktop externally - but within your own program you could switch the color channels with a shader. You would either need to draw each screen to a buffer (FBO) and then sample both to pick the right channels, or draw the scene twice on each output - once for green, once for red & blue.


09-03-2009, 04:33 AM
Hello Bruce,

thanks for your reply.
The NVidia GeForce 8600 GT is a dual head graphic card and both outputs are set to the same resolution (1024 x 768 @ 75 Hz). The component cable we've used simply takes the signals from the dvi2vga connector and splits them up to red, green, blue, h.sync and v.sync on bnc connectors, so we already use vga output. Both outputs on the graphic card seem to have their own signal clock, even when using the same resolution, and the NVidia driver unfortunately doesn't allow you to sync both outputs.

The software solution we are looking for should "simply" take the picture shown on both outputs, including overlays, mouse pointer and everything, swap the green channels and display the altered content on the corresponding screen/output. I've read about FBOs and PBOs and I think they might be a good approach. But how can I access the screen content and grab the pixels without using an invisible window and glReadPixels(), so the data never has to leave the GPU's memory? And how can I display the result on the corresponding output? Any help will be much appreciated.

Thanks in advance and best regards,

09-03-2009, 08:42 AM
Sorry, but capturing everything, including the mouse pointer and overlays, does not seem possible...

To get the big picture: are you trying to do passive stereo with 2 LCD projectors?

09-03-2009, 09:26 AM
@ZbuffeR: Yes, we are. We are trying to avoid the loss of light caused by linear polarization filters, since LCD projectors already emit linearly polarized light. The problem is that red and blue are vertically polarized while green is horizontally polarized. So we want to put a retarder foil (a broadband lambda/4 retarder, the ITOS WP 140HE to be exact) in front of the projectors' lenses to generate circular polarization. But because of the different polarization directions, the green channel would rotate in the opposite direction; that is why we need to swap the green channels of the two projectors. If this is not possible, we would have to use regular circular polarization filters, which consist of a linear polarization filter plus a retarder foil, and that would cost about 30% of the brightness. Is there a way to get the screen content from the back buffer, manipulate it and write it to the front buffer, or something like that? Thanks in advance.
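As a side note, the handedness flip can be sketched with Jones calculus (sign conventions vary between textbooks): a quarter-wave retarder with its fast axis at 45° to the polarization axes maps the two orthogonal linear states to circular states of opposite handedness,

```latex
J_{\lambda/4,\,45^\circ} \;\propto\; \frac{1}{\sqrt{2}}
\begin{pmatrix} 1 & -i \\ -i & 1 \end{pmatrix},
\qquad
J \begin{pmatrix} 1 \\ 0 \end{pmatrix}
\propto \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ -i \end{pmatrix},
\qquad
J \begin{pmatrix} 0 \\ 1 \end{pmatrix}
\propto \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ i \end{pmatrix},
```

so red/blue and green leave the retarder circularly polarized in opposite directions, which is exactly why the green channels must be exchanged between the two projectors.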

09-04-2009, 05:15 AM
"The NVidia GeForce 8600 GT is a dual head graphic card and both outputs are set to the same resolution (1024 x 768 @ 75 Hz)"
In fact I am quite surprised that both outputs are not synchronized.
Is there any difference between these modes, regarding the sync:
- clone (even if unusable for your application) ?
- span (strong chance to work) ?
- dual view ?
And what about the different 3D optimization modes - single monitor, multi-head, and so on?
Are both card outputs of the same type, VGA/VGA or DVI/DVI? Or is it DVI/VGA?

09-04-2009, 09:19 AM
All modes have their own sync, and every mode introduces a different offset, so regardless of which mode we use, the outputs are not synced. The 3D optimization modes seem to have no effect on the sync whatsoever. Both card outputs are DVI; the same dvi2vga adaptors and the same vga2component cables are used on both.
We are thinking about "hacking" the TMDS data streams and building a hardware solution with microcontrollers, but that is kind of tricky :-) .
If anyone has any ideas on how to achieve a green channel swap in OpenGL or a similar framework, they will be much appreciated.