Fast pixel change detection

Hi everyone,

I’m trying to implement a really fast method of pixel change detection.
This has two main parts: capturing and comparing screenshots.

For capturing:
I’m using a PBO so that glReadPixels() returns immediately instead of blocking until the transfer completes.
An extract of this part of the code is below:

    if (pboUsed) { // with PBO
        // OpenGL performs an asynchronous DMA transfer, so glReadPixels() returns immediately
        glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, pboIds[index]);
        glReadPixels(posx, posy, 500, 500, GL_BGR_EXT, GL_UNSIGNED_BYTE, 0); // posx and posy are given by the mouse position
        GLubyte* screenshot = (GLubyte*)glMapBufferARB(GL_PIXEL_PACK_BUFFER_ARB, GL_READ_ONLY_ARB);
        // screenshot now points to the captured pixels
    }

For two screenshots, I’ll basically have two pointers, GLubyte* screenshot1 and GLubyte* screenshot2.
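To avoid stalling when I map the buffer, I plan to ping-pong between the two PBOs so the transfer of the current frame overlaps with reading the previous one. Roughly like this (untested sketch; it assumes pboIds[0] and pboIds[1] were created with glGenBuffersARB and sized with glBufferDataARB(..., 500*500*3, 0, GL_STREAM_READ_ARB), and that prevCopy is a 500*500*3 byte buffer I keep in system memory between frames):

    static int index = 0;
    int nextIndex = (index + 1) % 2;

    // start the asynchronous transfer of the current frame into pboIds[index]
    glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, pboIds[index]);
    glReadPixels(posx, posy, 500, 500, GL_BGR_EXT, GL_UNSIGNED_BYTE, 0);

    // map the other PBO; it received the previous frame, so its transfer
    // has had a whole frame's worth of time to finish
    glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, pboIds[nextIndex]);
    GLubyte* screenshot = (GLubyte*)glMapBufferARB(GL_PIXEL_PACK_BUFFER_ARB, GL_READ_ONLY_ARB);
    if (screenshot)
    {
        // screenshot = previous frame, prevCopy = the frame before that
        bool changed = (memcmp(screenshot, prevCopy, 500*500*3) != 0);
        memcpy(prevCopy, screenshot, 500*500*3); // keep it for the next comparison
        glUnmapBufferARB(GL_PIXEL_PACK_BUFFER_ARB);
        // ... react to 'changed' here ...
    }

    glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, 0);
    index = nextIndex;

A third PBO could replace the memcpy into prevCopy, but this is the general idea.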

For comparison:
I tried the simple and efficient memcmp: memcmp(screenshot1, screenshot2, 500*500*3), but it doesn’t always detect differences.
And I’d like to detect any single pixel color change.
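For reference, the straightforward per-pixel fallback I’m experimenting with looks roughly like this (assuming tightly packed 500*500 BGR data; at 500*3 bytes per row there is no padding from the default GL_PACK_ALIGNMENT of 4):

    // returns true and reports the first differing pixel, false if the two captures match
    bool findFirstChange(const GLubyte* a, const GLubyte* b, int& outX, int& outY)
    {
        const int width = 500, height = 500, bpp = 3; // BGR, tightly packed
        for (int y = 0; y < height; ++y)
        {
            for (int x = 0; x < width; ++x)
            {
                const int i = (y * width + x) * bpp;
                if (a[i] != b[i] || a[i+1] != b[i+1] || a[i+2] != b[i+2])
                {
                    outX = x; // note: glReadPixels rows run bottom-up
                    outY = y;
                    return true;
                }
            }
        }
        return false;
    }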

Are there any OpenGL methods that can help me achieve this?

Thanks

Why do you want to perform the comparison with the CPU? If you do it on the GPU there is no real need for a capture (you can directly render to a texture through FBO). Also, you don’t have to do a readback or any unnecessary copy, just use a fragment shader to compare the pixel values.
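The FBO part is just a texture attached as the color buffer; something like this (sketch only, using the GL 3.0 / ARB_framebuffer_object entry points and reusing the 500x500 size from your example; with the older EXT_framebuffer_object the same calls exist with an EXT suffix):

    GLuint tex, fbo;

    // color texture that will receive the rendered frame
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 500, 500, 0, GL_BGRA, GL_UNSIGNED_BYTE, 0);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // framebuffer object that renders into the texture instead of the window
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        /* handle the error */;

    // render the scene while the FBO is bound, then switch back to the window
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

With two such textures (current and previous frame) the comparison never has to leave the GPU.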

Thanks for your reply.
You are right, it would be great if everything was done on the GPU.
I got the part about rendering to a texture through an FBO, but I have questions about the fragment shader part.
Is there any way to compare the pixels currently being rendered with pixels from a previous texture (holding the previous frame) in a fragment shader?
Or should the fragment shader take two textures as inputs?

Sorry, I’m not familiar with fragment shaders so I may need guidance here.

P.S.: I also did some research and found these threads on the Khronos OpenGL forums: “image comparison” and “Comparing two textures in gpu”.
I will try to implement these methods and see which one has the best performance.

This is exactly what you have to do. Render both images to a texture through FBO and then use both textures as input to your comparison fragment shader.
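For the shader itself, something along these lines should work as a starting point (just a sketch in GLSL 1.20 relying on the fixed-function texture coordinates; currentTex and previousTex are names I made up):

    // fragment shader source: outputs white where the two textures differ, black elsewhere
    static const char* compareFragSrc =
        "uniform sampler2D currentTex;\n"
        "uniform sampler2D previousTex;\n"
        "void main()\n"
        "{\n"
        "    vec4 a = texture2D(currentTex,  gl_TexCoord[0].xy);\n"
        "    vec4 b = texture2D(previousTex, gl_TexCoord[0].xy);\n"
        "    gl_FragColor = any(notEqual(a.rgb, b.rgb)) ? vec4(1.0) : vec4(0.0);\n"
        "}\n";

    // compile/link with glCreateShader / glShaderSource / glCompileShader / glAttachShader /
    // glLinkProgram, bind the two textures to units 0 and 1, set the two samplers with
    // glUniform1i, and draw a textured full-screen quad into a result FBO.

If you discard the unchanged fragments instead of writing black, an occlusion query can even tell you whether anything changed at all without reading the result back.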

All right, thanks aqnuep. I also have another concern. In the case above, we were capturing the whole frame. What if I need to monitor only specific portions of the frame? Let’s say the user can select three different regions of the screen, so we would have the coordinates (x, y, width, height) of these regions, like rectangles in Windows programming. Can I use the same approach, that is, one texture containing those three portions and the same fragment shader? How would I redirect the selected regions to the texture? And here, I need to be able to tell which specific region had its pixels changed.

Is there anyone who can help?