glViewport and Framebuffer

Hi all!
I ran into an “error” that took me some time to “understand”.
I tried to create a second camera that looks at my main camera, just for debugging view frustum culling.
What I did was: call glViewport(0, 0, screenWidth, screenHeight) and render my main camera, then call glViewport(screenWidth/2, screenHeight/2, screenWidth/2, screenHeight/2) and render my second camera into that viewport.
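In code, this is roughly what the two passes looked like (renderMainCamera and renderDebugCamera are just placeholders for my actual draw calls, and a valid GL context is assumed):

    // Pass 1: main camera over the full render target.
    glViewport(0, 0, screenWidth, screenHeight);
    renderMainCamera();

    // Pass 2: debug camera in the top-right quadrant of the same target.
    glViewport(screenWidth / 2, screenHeight / 2,
               screenWidth / 2, screenHeight / 2);
    renderDebugCamera();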
The result was nearly right, except that my second camera only rendered into a quarter of that viewport, and the rest of it was black.
So if I sketch it:


 __________________
|        |    |  C|
|        |    |___|
|        |  B     |
|        |________|
|                 |
|   A             |
|                 |
|_________________|

A represents the region where my main camera was rendered, B the region that stayed black, and C the region where my second camera actually was rendered.
After a few tries, I used glViewport(0, 0, screenWidth/2, screenHeight/2) instead and it worked: no more black region. It seems to be related to the fact that I was doing offscreen rendering with an FBO.
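For context, here is roughly what the working offscreen pass looks like now (FBO and texture creation omitted; debugFbo and renderDebugCamera are placeholder names):

    // Render the debug camera into the FBO, with the viewport anchored at
    // the attachment's origin and matching its size.
    glBindFramebuffer(GL_FRAMEBUFFER, debugFbo);
    glViewport(0, 0, screenWidth / 2, screenHeight / 2);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderDebugCamera();
    glBindFramebuffer(GL_FRAMEBUFFER, 0);   // back to the default framebuffer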
My question is: can someone explain to me what happened? Why did I get this strange behavior?

Thanks!

Based on your previous description, B should have been where your second camera was rendered, so the above statement is surprising. You might post a picture.

Also make sure you don’t have any fragment tests enabled that might be “clipping away” parts of your viewport rendering (e.g. GL_SCISSOR_TEST).
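For example, a stale scissor rectangle left over from an earlier pass silently discards fragments outside it. A quick sanity check before the second pass could look something like this (using the same screenWidth/screenHeight as in your code):

    // Make sure no leftover scissor box is clipping the second viewport.
    if (glIsEnabled(GL_SCISSOR_TEST)) {
        glDisable(GL_SCISSOR_TEST);
        // ...or keep it enabled and make it cover the whole second viewport:
        // glScissor(screenWidth / 2, screenHeight / 2,
        //           screenWidth / 2, screenHeight / 2);
    }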