Displaying YUV:411 (NV12) texture with OpenGL

10-23-2009, 01:24 PM
I have YUV:411 (aka NV12) frame data. How can I get OpenGL to recognize its format for glTexImage2D() rendering?

I realize that I might have to perform a colorspace transformation, but is there a way I can tell OpenGL to command the graphics hardware to perform the colorspace transformation for me?

I am programming in Cocoa/Objective-C++, and the only YUV texture support I can find is GL_APPLE_ycbcr_422, which is YUV:422 and not the format that I currently have.

Thanks a lot for all your help.

10-23-2009, 04:54 PM
According to www.fourcc.org (http://www.fourcc.org), NV12 is YUV:420, not 411.
Also, according to MSDN: http://msdn.microsoft.com/en-us/library/ms797894.aspx

A format in which all Y samples are found first in memory as an array of unsigned char with an even number of lines (possibly with a larger stride for memory alignment). This is followed immediately by an array of unsigned char containing interleaved Cb and Cr samples. If these samples are addressed as a little-endian WORD type, Cb would be in the least significant bits and Cr would be in the most significant bits with the same total stride as the Y samples. NV12 is the preferred 4:2:0 pixel format.
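Concretely, the layout the MSDN quote describes can be sketched as follows. This is a minimal illustration, not library code; the struct and helper names are made up for the example:

```cpp
#include <cstddef>

// Hypothetical helper: compute where each plane starts inside one NV12
// frame buffer. The Y plane is stride*height bytes of 8-bit luma; the
// interleaved CbCr plane follows immediately and is stride*(height/2)
// bytes, because chroma is subsampled 2x2 (4:2:0).
struct NV12Layout {
    std::size_t y_offset;    // start of the Y plane (always 0)
    std::size_t uv_offset;   // start of the interleaved CbCr plane
    std::size_t frame_size;  // total bytes for one frame
};

NV12Layout nv12_layout(std::size_t stride, std::size_t height) {
    NV12Layout l;
    l.y_offset = 0;
    l.uv_offset = stride * height;                  // CbCr follows Y directly
    l.frame_size = l.uv_offset + stride * (height / 2);
    return l;
}
```

For a 640x480 frame with no extra alignment padding, this gives a UV plane starting at byte 307200 and a total frame size of width * height * 3/2 = 460800 bytes.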

Also read this page to understand sampling:

What you have to do is upload your YUV420 image as an RGB or RGBA texture, or split Y, U and V into separate textures (choose whichever format suits you better). Note that such a texture still contains YUV values, not RGB, so you have to write a fragment shader to convert YUV to RGB. Create an FBO and attach an empty RGB texture. Then bind your YUV texture(s) and render a screen-aligned quad. In the fragment shader, sample Y, U and V at the proper texture coordinates, convert to RGB, and output the final result. After that, the texture attached to the FBO will contain the RGB version of the original YUV420 image.
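The per-pixel conversion the shader has to do can be sketched like this, assuming BT.601 video-range coefficients (your source may use BT.709 or full-range constants instead). The C++ function mirrors the math; the GLSL string shows the same conversion for the two-texture approach, where the Y plane is uploaded as a GL_LUMINANCE texture and the half-size interleaved CbCr plane as a GL_LUMINANCE_ALPHA texture. Uniform names are illustrative:

```cpp
#include <algorithm>
#include <cstdint>

// Round and clamp a float to [0, 255].
static std::uint8_t clamp255(float v) {
    return static_cast<std::uint8_t>(std::min(255.0f, std::max(0.0f, v + 0.5f)));
}

// BT.601 video-range YUV -> RGB for one pixel, the same math the
// fragment shader below performs per fragment.
void yuv_to_rgb(std::uint8_t y, std::uint8_t u, std::uint8_t v,
                std::uint8_t* r, std::uint8_t* g, std::uint8_t* b) {
    float yf = 1.164f * (y - 16);
    float uf = u - 128.0f;  // Cb
    float vf = v - 128.0f;  // Cr
    *r = clamp255(yf + 1.596f * vf);
    *g = clamp255(yf - 0.392f * uf - 0.813f * vf);
    *b = clamp255(yf + 2.017f * uf);
}

// The equivalent fragment shader (uniform names are just an example):
static const char* kFragSrc =
    "uniform sampler2D texY;   // full-size luma plane\n"
    "uniform sampler2D texUV;  // half-size interleaved CbCr plane\n"
    "void main() {\n"
    "    float y = 1.1643 * (texture2D(texY,  gl_TexCoord[0].st).r - 0.0625);\n"
    "    float u = texture2D(texUV, gl_TexCoord[0].st).r - 0.5;\n"
    "    float v = texture2D(texUV, gl_TexCoord[0].st).a - 0.5;\n"
    "    gl_FragColor = vec4(y + 1.596 * v,\n"
    "                        y - 0.392 * u - 0.813 * v,\n"
    "                        y + 2.017 * u, 1.0);\n"
    "}\n";
```

Splitting the planes into two textures this way means the GPU's bilinear filtering handles the chroma upsampling for free when the UV texture is sampled at full-frame coordinates.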

10-26-2009, 08:21 AM

I might have made a mistake with the naming convention. I just checked fourcc.org and you are right.

In response to your reply:

I was wondering if there is a way that I can tell OpenGL to make the color space conversion.

I know how to make the proper YUV:420 to RGB conversion. I already have this done, but right now I would like OpenGL to take my NV12 data and inform the hardware to make the conversion.

So do you know any way in which I can invoke the hardware for this?

10-26-2009, 11:03 AM
No... there is no built-in way to do that for NV12. You have to do the conversion on the CPU, or with shaders on the GPU.
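For the CPU path, here is a minimal sketch of a full NV12-to-RGB conversion, again assuming BT.601 video-range math and a tightly packed frame with no stride padding (the function name is made up for the example). The resulting buffer can be handed straight to glTexImage2D with GL_RGB / GL_UNSIGNED_BYTE:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Convert one tightly packed NV12 frame to packed 8-bit RGB.
// Each 2x2 block of luma pixels shares one interleaved (Cb, Cr) pair.
std::vector<std::uint8_t> nv12_to_rgb(const std::uint8_t* frame,
                                      int width, int height) {
    const std::uint8_t* yPlane  = frame;
    const std::uint8_t* uvPlane = frame + width * height;  // CbCr follows Y
    std::vector<std::uint8_t> rgb(static_cast<std::size_t>(width) * height * 3);

    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            // One (Cb, Cr) pair per 2x2 luma block; Cb comes first.
            const std::uint8_t* uv = uvPlane + (row / 2) * width + (col / 2) * 2;
            float y = 1.164f * (yPlane[row * width + col] - 16);
            float u = uv[0] - 128.0f;
            float v = uv[1] - 128.0f;
            std::uint8_t* out = &rgb[(static_cast<std::size_t>(row) * width + col) * 3];
            out[0] = static_cast<std::uint8_t>(std::min(255.0f, std::max(0.0f, y + 1.596f * v)));
            out[1] = static_cast<std::uint8_t>(std::min(255.0f, std::max(0.0f, y - 0.392f * u - 0.813f * v)));
            out[2] = static_cast<std::uint8_t>(std::min(255.0f, std::max(0.0f, y + 2.017f * u)));
        }
    }
    return rgb;
}
```

This is the simple, portable option; the shader approach above avoids the per-frame CPU cost by letting the GPU do the same arithmetic in parallel.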

10-26-2009, 11:56 AM
yooyo, you seem to be an expert on this subject. Can you provide me with some resources or some quick examples of how I can do this on the CPU or with shaders on the GPU?

I am fairly new to OpenGL and I am not too sure how I would go about doing what you described.