Counting Fragments with Occlusion Query & Atomics

Hi,

I am having some trouble using an occlusion query to count the fragments being drawn. I want to check that I am using the occlusion query correctly by comparing its result with an atomic counter that I increment manually in a fragment shader.

I am using a GL_SAMPLES_PASSED query, whose value starts at 0 and is incremented for each sample that passes the depth test. My atomic counter also starts at 0 and is incremented once each time the fragment shader executes. With depth testing and multisampling disabled I expect the two values to be equal, but they are not: the query result is consistently greater than the atomic counter.
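Roughly, the setup looks like this (a simplified sketch rather than my exact code; the shader source, buffer names, and draw call are placeholders):

```cpp
// Simplified sketch of the comparison: one GL_SAMPLES_PASSED query and one
// atomic counter, both starting at zero for the same draw call.

// Fragment shader side (GLSL, abbreviated); compilation/linking omitted.
const char* fragSrc = R"(
    #version 420 core
    layout(binding = 0, offset = 0) uniform atomic_uint fragmentCount;
    out vec4 color;
    void main() {
        atomicCounterIncrement(fragmentCount);  // one increment per invocation
        color = vec4(1.0);
    }
)";

// Atomic counter buffer, zero-initialized and bound to binding point 0.
GLuint acbo = 0;
GLuint zero = 0;
glGenBuffers(1, &acbo);
glBindBuffer(GL_ATOMIC_COUNTER_BUFFER, acbo);
glBufferData(GL_ATOMIC_COUNTER_BUFFER, sizeof(GLuint), &zero, GL_DYNAMIC_DRAW);
glBindBufferBase(GL_ATOMIC_COUNTER_BUFFER, 0, acbo);

glDisable(GL_DEPTH_TEST);
glDisable(GL_MULTISAMPLE);

// Occlusion query wrapped around the draw call.
GLuint query = 0;
glGenQueries(1, &query);
glBeginQuery(GL_SAMPLES_PASSED, query);
drawScene();                                    // placeholder for my draw calls
glEndQuery(GL_SAMPLES_PASSED);

GLuint samplesPassed = 0;
glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samplesPassed);
```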

Before I start looking for errors in my code, I want to make sure OpenGL is doing what I think it should be doing, so here is my question: does the fragment shader execute once for each fragment that passes the depth test? In other words, is there a one-to-one mapping between fragments that pass the depth test and fragment shader invocations? (In my case depth testing is disabled, so every fragment should pass.)

Thanks.

How much larger is it? Also, did you issue the appropriate memory barrier before reading back the atomic counter buffer/image?
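If the counter is read back on the CPU, I would expect something along these lines before the readback (just a sketch; "acbo" stands in for your atomic counter buffer):

```cpp
// Sketch: atomic counter writes from shaders are incoherent, so a barrier is
// needed before the buffer is read back on the CPU.
glMemoryBarrier(GL_BUFFER_UPDATE_BARRIER_BIT);   // for glGetBufferSubData / glMapBuffer
// (use GL_ATOMIC_COUNTER_BARRIER_BIT instead if another shader pass reads the counter)

GLuint fragmentCount = 0;
glBindBuffer(GL_ATOMIC_COUNTER_BUFFER, acbo);    // acbo = your atomic counter buffer
glGetBufferSubData(GL_ATOMIC_COUNTER_BUFFER, 0, sizeof(GLuint), &fragmentCount);
```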

Also, regarding “with depth-testing and multisampling disabled”: how certain are you that multisampling is actually disabled? Are you rendering to an FBO you created, or to the default framebuffer?
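A quick way to check (sketch) is to ask GL directly while the framebuffer you draw into is bound:

```cpp
// Sketch: GL_SAMPLE_BUFFERS / GL_SAMPLES describe the currently bound draw
// framebuffer; for the default framebuffer they come from the pixel format the
// window was created with, independent of glDisable(GL_MULTISAMPLE).
GLint sampleBuffers = 0, samples = 0;
glGetIntegerv(GL_SAMPLE_BUFFERS, &sampleBuffers);
glGetIntegerv(GL_SAMPLES, &samples);
// If sampleBuffers != 0, a single fragment shader invocation can account for
// several passing samples in a GL_SAMPLES_PASSED query.
```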

I appreciate your fast reply.

I ran my test program on a different machine and it works as expected. The occlusion query and atomic counter return the same result.

Now I need to check that the drivers and GLEW are up-to-date on my original machine.
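(I'll dump the version strings on both machines too, along these lines:)

```cpp
// Sketch: report which GL / driver / GLEW versions each machine is actually using.
printf("GL_VERSION  : %s\n", glGetString(GL_VERSION));
printf("GL_RENDERER : %s\n", glGetString(GL_RENDERER));
printf("GL_VENDOR   : %s\n", glGetString(GL_VENDOR));
printf("GLEW version: %s\n", glewGetString(GLEW_VERSION));
```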

Thanks.

“I appreciate your fast reply.”

Obviously not that much, since you didn’t answer any of those questions. You may be invoking undefined behavior (thus it “working” on a different machine may just be luck), but there’s no way to know whether it’s that, a code bug, or a driver bug.