video buffer question

I would like to read in B&W video at 30 fps, impose a calculated color at each of the 1 bits, and then add the result onto an existing video output buffer. I am curious to know whether this can be done with an existing PC + video card + software. I would be very grateful for any suggestions.

Hi,
You probably should follow these steps:

  • convert each frame from monochrome to RGBA (best done using MMX assembly).
  • blend that with some other data in the video buffer.
  • use the texture_rectangle extension to display the two data sources, or pad the edges to a power-of-two size if you stick to normal textures.
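The first step above can be sketched in plain C++ (the MMX assembly version would do the same expansion, just several pixels at a time). This is a minimal sketch, assuming the frames arrive packed 1 bit per pixel, MSB-first, with rows padded to whole bytes; the function name and layout are my own choices, not anything from a particular capture API:

```cpp
#include <cstdint>
#include <vector>

// Expand a packed 1-bpp monochrome frame into RGBA8: each 1 bit becomes
// the given color (fully opaque), each 0 bit stays transparent black.
// Bits are assumed packed MSB-first, rows padded to whole bytes.
std::vector<uint8_t> monoToRgba(const std::vector<uint8_t>& bits,
                                int width, int height,
                                uint8_t r, uint8_t g, uint8_t b)
{
    const int stride = (width + 7) / 8;            // bytes per source row
    std::vector<uint8_t> rgba(width * height * 4, 0);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const bool set = bits[y * stride + x / 8] & (0x80 >> (x % 8));
            uint8_t* px = &rgba[(y * width + x) * 4];
            if (set) { px[0] = r; px[1] = g; px[2] = b; px[3] = 255; }
        }
    }
    return rgba;
}
```

Leaving the 0 bits as transparent black is convenient later: ORing or blending them into the destination then leaves the destination unchanged.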

Tzupy,

Thanks very much for responding to my query. I would love to get more detail about what you have in mind; could you expand somewhat on your comments? I think by ‘convert’ you just mean to impose a color palette on the B&W, but I don’t quite know what hardware you are assuming; is this frame-grabber input, FireWire, USB, what? Also, I’m not sure what you mean by ‘blend’; I want to logically OR the new frame with the one that is feeding the output buffer; how is this done? And what is the texture_rectangle extension part of? All advice and help gratefully anticipated! Thanks!

Roger

Originally posted by Tzupy:
[b]Hi,
You probably should follow these steps:

  • convert each frame from monochrome to RGBA (best done using MMX assembly).
  • blend that with some other data in the video buffer.
  • use the texture_rectangle extension to display the two data sources, or pad the edges to a power-of-two size if you stick to normal textures.[/b]

Hi,
It seems your problem is broader than I expected.
I assumed that you already have the video stream as a file and are able to access the data in each frame. I don’t know how to acquire the video; OpenGL itself can’t do that, but maybe some Windows SDK component (image acquisition) can.
Under OpenGL blending is a standard way to combine the existing color buffer with new color data. The texture rectangle extension is used to display non-power-of-two textures, like the frames of a video stream.
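If the texture rectangle extension isn’t available and you stick to normal (power-of-two) textures, the padding mentioned above amounts to rounding each dimension up. A small helper (the name is mine) sketches the idea:

```cpp
#include <cstdint>

// Round v up to the next power of two, e.g. 640 -> 1024, 480 -> 512.
// A 640x480 video frame would then live in a 1024x512 texture, with
// texture coordinates scaled by 640/1024 and 480/512 when drawing.
uint32_t nextPow2(uint32_t v)
{
    uint32_t p = 1;
    while (p < v) p <<= 1;
    return p;
}
```

The cost is the wasted texture memory in the padded border, which is why the texture rectangle extension is attractive for video frames.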
OpenGL can’t display 1-bit B&W images directly, so you have to convert them to grayscale or RGBA. If you want to blend, the source and destination must use the same pixel format (ideally RGBA with RGBA, or BGRA with BGRA).
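Note that the logical OR Roger asked about is not a classic alpha blend. In software, once both buffers are in the same RGBA format, it is just a byte-wise OR; OpenGL can also do this in hardware via glEnable(GL_COLOR_LOGIC_OP) and glLogicOp(GL_OR). A software sketch (function name is mine):

```cpp
#include <cstddef>
#include <cstdint>

// OR each byte of src into dst. Both buffers must be the same size and
// the same pixel format (e.g. both RGBA8), as noted above.
void orCombine(uint8_t* dst, const uint8_t* src, size_t bytes)
{
    for (size_t i = 0; i < bytes; ++i)
        dst[i] |= src[i];
}
```

With the colored frame prepared so that 0 bits are transparent black (all-zero bytes), ORing it in leaves the existing output untouched wherever the incoming frame is blank.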
I think you should read about the stuff above in the Microsoft Platform SDK.