Live video and texture2D

01-04-2009, 03:34 PM

My application uploads live video (720*576 PAL, interlaced) to a texture which is applied on a screen sized quad. I used texture2D instead of texture2DRect because I thought since there is non-power-of-2 support, why bother.

It worked quite well and I thought that everything was OK. However, when my live video was a news channel with the usual right to left scrolling text at the bottom, the scrolling movement was definitely less smooth than the original.

Is it possible that, by using texture coordinates between 0 and 1, there is a rounding error when accessing the texture? If you have had a similar experience, do you think I can avoid it with texture2D, or is the only solution to use the video as a texture2DRect with texture coordinates based on its pixel dimensions?

The reason why I am asking if anyone had this problem before is that it is not at all trivial to change from texture2D to texture2DRect. A lot of work would be involved and it would be very helpful for me to know if it is worth the effort.


01-04-2009, 04:58 PM
Man, just out of curiosity, how are you streaming video of that size? I've worked with FFmpeg, and the conversion from yuv420p to RGB is very slow. Are you using a special library?

Rosario Leonardi
01-04-2009, 05:44 PM
I'm using GL_TEXTURE_RECTANGLE_ARB... but I have never tried viewing a news video. :-\
What do you mean by "not smooth"? Is it slower?
Do you also stream the audio? Is the audio synchronized with the video?

Did you use standard texture or PBO?
For a PBO tutorial look here:

@RGHP: My library (with FFmpeg) runs smoothly at 800x600 (on my quad core) and I still have a lot of optimization to do (I have to add PBO support).
Currently I'm sending the pictures with a simple glTexSubImage2D:

void VideoTexture::active()
{
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, m_textureName);
    // copy the buffer into the texture
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 0, 0, m_width, m_height,
                    GL_RGB, GL_UNSIGNED_BYTE, (GLvoid*)m_buffer);
    m_changed = false;
}
In the decode function (on another thread) I simply decode the frame into m_buffer and set the m_changed flag to true.

// Decode video frame
avcodec_decode_video(pCodecCtx, pFrame, &frameFinished, packet.data, packet.size);
// Did we get a complete video frame?
if (frameFinished)
{
    // Convert the image from its native format to RGB
    sws_scale(instance->m_imageConvert, pFrame->data, pFrame->linesize,
              0, instance->m_height, data, stride);
    instance->m_changed = true;
}
I also want to try to decode the image with a shader.

I also tried with TEXTURE_2D and I didn't find any difference (apart from the UV coords).

When I'm finished, if it works well, I can post it to an open-source repository. :)

01-04-2009, 05:53 PM
This is live video, not compressed. I upload it as 24-bit RGB using alternating PBOs.

Not smooth means there are very small jumps in playback (unnoticeable in almost anything but the continuous scrolling I described). This has absolutely nothing to do with the upload which works perfectly.

01-04-2009, 10:58 PM
These jumps could be visible because of a refresh rate difference. Your monitor is 60Hz but the source is 25i. If you render at full speed (60 FPS), some source frames will be rendered 3 times per second and the others 2 times per second.

Do you use any filtering? If you use, for example, linear filtering, the interlaced image will be filtered and you can see ghosts on screen, very noticeable on horizontal scrolls. Also, any downscaling of an interlaced image will have artifacts.

By the way, how do you grab the live video? Using a TV card? Is any DirectShow stuff involved? If yes, did you fix the sync source?

01-05-2009, 03:18 AM

No, the output is 50Hz; I do not sync to the VGA monitor at all. I read back from the GPU at 50fps (twice the required speed), interlace, then transfer the finished image back to the video card. I use various cards, all of them broadcast-quality capture boards, e.g. the DeckLink from Blackmagic. The whole rendering is synced to this video card, not the GPU.

I tried every filtering method. I cannot see any difference between nearest and linear, at least not when the video is applied to a screen-sized quad (which I think is logical: neither minification nor magnification takes place).

Once again, I am quite certain there is no sync problem. Everything else I render with OpenGL and then read back looks perfectly smooth in motion.

(BTW DirectShow: I asked for help about AVI playback using DirectShow a few months ago. I remember I got a very good tip from you then, and it is working perfectly. Thanks once again.)

01-05-2009, 06:48 AM
the output is 50Hz, I do not sync at all to the VGA monitor
You mean your VGA monitor is at 50Hz?
If not, it should be.
And you should vsync too.

Anything else will show slight time jerkiness, especially visible on scrollings.

01-05-2009, 06:59 AM
Hmm, I think I did not describe the situation very well. I also have an output on the VGA but it is only for preview. I do not even care about how it looks. The real output appears on the video card, and the render is synced to that.

On this video card, playback is fine and smooth. Any graphics the application itself creates and renders using opengl, even continuously scrolling elements move perfectly fine. If I display the live video that is being captured from the input of the card, upload it as a texture and apply it on a full screen quad, I should see the video input without any change in any of the pixels on the output. But I don't. The change is hardly noticeable, but it is definitely there, because the scrolling text of a news channel shows it. I suspect it is due to rounding error because of normalized texture coordinates and I wanted to know if there is a remedy or can I expect a texture2dRect to solve this problem.

01-05-2009, 08:39 AM
> rounding error because of normalized texture coordinates
Very unlikely.
Generate a 1-pixel-wide black/white checkerboard on the output; you will see if there are coordinate problems.

Screenshot please.

01-05-2009, 11:25 AM
OK, I will try the checkerboard, although it will be impossible to check visually. Graphics like that, as an interlaced video signal (i.e. on a TV), will look like one big flash.

01-19-2009, 10:44 PM
Thanks for the tip and making me suspect that I was wrong in the first place. I tried the checkerboard, and it looked perfect. Finally I found out that there were not one but two problems:

An ordinary bug that made me think it was a texture coordinate issue, and the real culprit: a shader I made to improve hardware antialiasing.