Double rendering for Oculus Rift help

I am writing a program that renders a 360 image to an Oculus Rift (which just acts like an external monitor) and then uses the sensor data from the Rift so you can look around and see different parts of the image, etc. I've got to the point where I can do all of this, but only for a single view, and I need to duplicate the view on the other half of the display (so that I can see it through both eyes). I tried creating two windows and having it render to each one, but it would only render the image in the window I created second. What would be the best/simplest way to render two identical views next to each other simultaneously?

Thanks

Not really sure since I haven’t gotten a Rift yet myself, but I seem to recall reading something a while back that said you needed to create and manage two different OpenGL contexts and cameras. The views also aren’t really identical. Just like a pair of eyes, there is a slight offset between the two views, but I’m assuming you know that. At any rate, there’s got to be a ton of examples online.
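For the offset part, here's a minimal sketch of what I mean, assuming you're using GLM for the matrix math. The half-IPD value is just a placeholder; the actual interpupillary distance comes from the SDK/headset configuration.

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Shift the shared head view sideways to get one eye's view matrix.
// Pass an offset of opposite sign for each eye (the exact sign depends
// on your view-matrix convention).
glm::mat4 eyeView(const glm::mat4& headView, float eyeOffset)
{
    return glm::translate(glm::mat4(1.0f),
                          glm::vec3(eyeOffset, 0.0f, 0.0f)) * headView;
}

// e.g.  glm::mat4 leftView  = eyeView(headView, +0.032f);  // half of ~64 mm IPD (assumed)
//       glm::mat4 rightView = eyeView(headView, -0.032f);
```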

The Oculus acts as a simple, very wide monitor. You create a fullscreen window, render the left eye view to the left half of the screen, and render the right eye view to the right half.
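Something along these lines, as a rough sketch. `renderScene` is a hypothetical stand-in for whatever currently draws your 360 image with a given view matrix; the two view matrices can even be identical to start with.

```cpp
#include <GL/gl.h>
#include <glm/glm.hpp>

// Stand-in for the existing drawing code.
void renderScene(const glm::mat4& view);

// Draw the same scene twice into one fullscreen window:
// left eye on the left half, right eye on the right half.
void renderStereoFrame(int windowWidth, int windowHeight,
                       const glm::mat4& leftView, const glm::mat4& rightView)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    const int halfWidth = windowWidth / 2;

    glViewport(0, 0, halfWidth, windowHeight);          // left half
    renderScene(leftView);

    glViewport(halfWidth, 0, halfWidth, windowHeight);  // right half
    renderScene(rightView);
}
```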

The images for the eyes need to have some barrel distortion applied, to counteract the lenses built into the device. Please consult the documentation. There are already sample shaders and explanations of the effect available online.
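For reference, a stripped-down fragment shader in the spirit of the SDK samples, embedded here as a C++ string. The `lensCenter` and `distortionK` uniforms are placeholders; the real values come from the headset configuration described in the documentation.

```cpp
// Renders one eye's image (already drawn to a texture) with radial
// barrel distortion applied.
const char* kDistortionFrag = R"(
#version 120
uniform sampler2D sceneTex;   // eye image rendered to a texture
uniform vec2 lensCenter;      // lens centre in texture coordinates
uniform vec4 distortionK;     // k0..k3 polynomial coefficients

void main()
{
    // Vector from the lens centre to this fragment.
    vec2 theta = gl_TexCoord[0].xy - lensCenter;
    float rSq = dot(theta, theta);

    // Radial polynomial: samples further from the lens centre are pushed
    // outward more, counteracting the pincushion distortion of the lens.
    vec2 warped = theta * (distortionK.x
                         + distortionK.y * rSq
                         + distortionK.z * rSq * rSq
                         + distortionK.w * rSq * rSq * rSq);

    gl_FragColor = texture2D(sceneTex, lensCenter + warped);
}
)";
```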

cool thanks guys