Stereo issues with a high number of vertices

Hi,

I’m working on an OpenGL point cloud renderer for large vertex sets, and on our target platform I’ve run into problems I cannot explain.

I implemented a level-of-detail algorithm that segments the point cloud via an octree and creates a vertex array for each node. Since the point density varies a lot across the data, some nodes still contain more than 2 million vertices, while most have far fewer, more like 200k. I’m currently rendering about 100 vertex arrays (GL_POINTS) from GL_ARRAY_BUFFER objects. I enabled GL_PROGRAM_POINT_SIZE so I can control the point size dynamically in the vertex shader, based on LOD level and viewer distance.
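For reference, the per-vertex size rule I use is roughly the following (this is an illustrative C sketch of the logic, not my actual shader; in GLSL it just writes the result to gl_PointSize, and the base size and clamp limits here are made-up values):

```c
#include <assert.h>

/* Sketch of the point-size rule: coarser LOD nodes draw bigger
 * points, and size falls off with viewer distance.  Constants and
 * the clamp range are illustrative, not the real implementation. */
static float point_size_px(float base_px, int lod_level, float view_dist)
{
    float size = base_px * (float)(1 << lod_level) / view_dist;
    /* Clamp to a supported point-size range (query
     * GL_ALIASED_POINT_SIZE_RANGE in real code). */
    if (size < 1.0f)  size = 1.0f;
    if (size > 64.0f) size = 64.0f;
    return size;
}
```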

Now I see very weird flickering in stereo mode. (I’m using freeglut to initialize my window with the GLUT_STEREO flag set, and in my display callback I simply draw into GL_BACK_LEFT and GL_BACK_RIGHT with a different projection matrix passed to the shader for each eye.) It looks as if, in some frames, not all vertex arrays are drawn: octants of my octree are definitely missing, as if the frame were never completely rendered. Sometimes the octants don’t even seem to be missing but rather shifted a bit, as if they were transformed incorrectly. This happens frequently, about twice a second, in both the left and right frames [1]. Since everything is perfectly fine in mono mode, and everything is single-threaded, I think I can rule out problems in my LOD implementation; that part is also well tested. After some time (pretty much at random) it starts working again, mostly after I log out and back in, until the glitch eventually returns. I also get occasional driver crashes, with error 3 or 8 returned.
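To make the callback structure clear, here is a stubbed-out sketch of what my display function does per frame (the GL/GLUT calls are replaced by stubs that just record the call order, so this compiles standalone; helper names like set_projection and draw_point_cloud are placeholders, not my real functions):

```c
#include <string.h>

/* Stubbed sketch of a quad-buffered stereo display callback:
 * render the scene once per eye into its own back buffer, then
 * swap both buffers at once.  Stubs only record the call order. */

static char trace[256];
static void record(const char *s) { strcat(trace, s); strcat(trace, ";"); }

#define GL_BACK_LEFT  0x0402
#define GL_BACK_RIGHT 0x0403

static void glDrawBuffer(int buf)
{ record(buf == GL_BACK_LEFT ? "drawbuf(LEFT)" : "drawbuf(RIGHT)"); }
static void glClear(void)                   { record("clear"); }
static void set_projection(const char *eye) { record(eye); }   /* uniform upload */
static void draw_point_cloud(void)          { record("draw"); }
static void glutSwapBuffers(void)           { record("swap"); }

static void display(void)
{
    glDrawBuffer(GL_BACK_LEFT);
    glClear();
    set_projection("proj_left");
    draw_point_cloud();

    glDrawBuffer(GL_BACK_RIGHT);
    glClear();
    set_projection("proj_right");
    draw_point_cloud();

    glutSwapBuffers();   /* presents left and right together */
}
```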

Our setup is a powerwall with a resolution of 6936 x 4096, driven by four 4K projectors in mosaic mode with edge blending. We are running four Quadro M6000 cards in one machine, connected via Quadro Sync. The driver is version 348.27 on Windows 8.1, but I have had this problem with every version I’ve tested so far.

Any ideas what could be the cause of this effect? I really have run out of ideas. Any hints would be highly appreciated.

Thanks a ton,
Roland

[1] https://dl.dropboxusercontent.com/u/4655442/WP_20150801_102337Z.mp4

In the absence of a “smoking gun”, I would just start disabling things until the problem goes away to try and narrow it down.

How sure are you that your frame times (CPU submit + GPU draw + SwapBuffers overhead) are below the VSync interval? What is the behavior of your system when a draw-time overrun occurs? Should the GPU just scan out the last complete frame rendered for the left and/or right eye?
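As a back-of-the-envelope check on the numbers in this thread: with active stereo at 120 Hz, both eye renders plus the swap have to finish within one refresh interval. A minimal sketch of that budget check, assuming the three costs add up roughly serially (in practice CPU submit and GPU draw overlap, so this is conservative):

```c
/* Refresh budget in milliseconds, e.g. 1000/120 ~= 8.33 ms. */
static double frame_budget_ms(double refresh_hz)
{
    return 1000.0 / refresh_hz;
}

/* Returns nonzero if the (serially summed) frame costs exceed
 * one refresh interval.  A rough, conservative check. */
static int overruns(double cpu_ms, double gpu_ms, double swap_ms,
                    double refresh_hz)
{
    return (cpu_ms + gpu_ms + swap_ms) > frame_budget_ms(refresh_hz);
}
```

So at 120 Hz any frame costing more than about 8.3 ms total will miss VSync.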

Thanks for your reply, Dark Photon.

Yeah, that’s what I most likely will have to do… unfortunately, time is running out, you know…

According to my FPS counter, the framerate is approximately at the refresh rate of the projectors (120 Hz) when v-sync is turned on, depending on the complexity of the content currently displayed; sometimes it may drop slightly below that. I have to add that this is my very first stereo OpenGL application, so I have no experience at all with these factors or how I can influence them, but shouldn’t the behaviour in case of a dropping framerate be that the projectors just repeat the last full frame? Isn’t that exactly the same as what happens in mono mode on a regular monitor? Anyway, I cannot reproduce the behaviour by intentionally forcing a framerate of, say, 45 fps by adding complexity, so I don’t think this is the issue, but I may be wrong?

That’s what I would think, but I have no experience with G-Sync or Quadros, so I don’t want to make any assumptions.

Isn’t that exactly the same as what happens in mono mode on a regular monitor?

Yes, exactly.