Adjusting vsync through code, plus general vsync questions
I am working on a system that will split a world across multiple monitors, and I am trying to keep those monitors in sync as closely as I can.
Since I am only rendering 2D, and since the machines are all adjacent to each other on a high-speed network, I am attempting a brute-force approach in which a master sends sync signals to all the slaves. The sync signal would tell each slave to swap buffers once its frame is rendered and ready. Hopefully this sync signal can run at 30-60 fps, and then everything is fine.
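To make that protocol concrete, here is a minimal sketch of the master/slave handshake over UDP. Everything here is an assumption for illustration: the port number, the message format, and the use of UDP at all. In the real renderer, the slave's wait would sit between "frame rendered" and the SwapBuffers call.

```python
import socket
import threading

SYNC_PORT = 50007     # assumed port; pick any free port on your LAN
SWAP_MSG = b"SWAP"    # assumed message meaning "swap buffers now"

def master_broadcast(slave_addrs, sock):
    """Master: tell every slave to swap its back buffer."""
    for addr in slave_addrs:
        sock.sendto(SWAP_MSG, addr)

def slave_wait_for_swap(sock):
    """Slave: block until the master's sync signal arrives.

    In the real render loop this call would go between finishing
    the frame and calling SwapBuffers, so all machines present
    their frames as close together as the network allows.
    """
    data, _ = sock.recvfrom(16)
    return data == SWAP_MSG

# Localhost demo: one master, one slave.
slave_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
slave_sock.bind(("127.0.0.1", SYNC_PORT))
master_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

result = []
t = threading.Thread(
    target=lambda: result.append(slave_wait_for_swap(slave_sock)))
t.start()
master_broadcast([("127.0.0.1", SYNC_PORT)], master_sock)
t.join()
print(result[0])  # the slave saw the sync signal and may now swap
```

Note this only aligns when each machine *requests* the swap; with vsync enabled, the actual swap on each machine still waits for that machine's own next vertical retrace, which is exactly the concern raised below.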
I have been doing some reading, and I believe that with vsync enabled there could be a problem: despite the speed of the network, if one machine is blocked waiting for its screen refresh, it could delay the whole group. I am aware of hardware-based solutions to this, such as NVIDIA's Quadro frame lock. What I am wondering is whether there is a way in software to force the vsync cycle to restart, so I could line the monitors up every couple of seconds, kind of like a group of people resetting their watches together. Is this possible, or is vsync timing regulated purely by hardware?
Also, the more I think about vsync, the less I feel I really understand it.
Is the timing a function of the monitor's refresh rate and refresh time?
If I purchased monitors with extremely fast response times (not refresh rates), could I minimize the problems vsync causes?
How does the graphics card sync up with the monitor's refresh timing?
Finally, if I wanted to control vsync myself in code, is there a signal I can capture within OpenGL?