I have a working fragment shader for converting YUV 420 YCbCr biplanar iPhone 4 camera frames to RGB. Reading enough sample code, plus the YUV page on Wikipedia, got me what I needed:
When you say why the math, do you mean why Y is scaled, and U and V are offset by 0.5? The Y is being scaled there because the code assumes video levels are being used, where 16 is picture black (so ‘super-black’ is supported) and 235 is picture white. That math (I didn’t check the numbers, but you can) converts to a more normal 0.0 - 1.0 range. Depending on what you write the result to, it will be clamped.
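As a quick numeric sketch of that scaling (assuming the common video-range constants, with values normalized by 256 the way the shader constants 1.1643 and 0.0625 imply):

```python
def scale_luma(y):
    """Map video-range luma (16-235, normalized to 0.0-1.0) to full range.
    1.1643 is approximately 255/219, and 0.0625 is 16/256."""
    return 1.1643 * (y - 0.0625)

print(round(scale_luma(16 / 256), 4))   # picture black -> 0.0
print(round(scale_luma(235 / 256), 4))  # picture white -> ~1.0
```

Anything below 16 (super-black) comes out negative, which is why the result gets clamped by whatever you write it to.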
U and V are color difference signals, so they represent signed values with a range of approximately -0.5 to 0.5. It’s most common to offset the signed value to make an unsigned integer in the 0-255 range, with 128 meaning zero difference, so the range represents roughly -128 to 127. The math converts it back to a signed floating-point difference value.
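Putting both pieces together, a per-pixel conversion might look like the sketch below. This is not your shader, just the same math in Python, assuming the common BT.601 video-range coefficients (1.1643, 1.5958, 0.39173, 0.8129, 2.017):

```python
def yuv_to_rgb(y, u, v):
    """Convert one video-range YCbCr sample (each component 0-255)
    to RGB in 0.0-1.0, using common BT.601 video-range constants."""
    yf = 1.1643 * (y / 256 - 0.0625)  # scale 16-235 luma to 0.0-1.0
    uf = u / 256 - 0.5                # recentre chroma to approx -0.5..0.5
    vf = v / 256 - 0.5
    r = yf + 1.5958 * vf
    g = yf - 0.39173 * uf - 0.8129 * vf
    b = yf + 2.017 * uf
    clamp = lambda x: min(max(x, 0.0), 1.0)  # writing to a framebuffer clamps
    return clamp(r), clamp(g), clamp(b)
```

With u = v = 128 the chroma terms vanish, so y = 16 gives black, y = 235 gives (approximately) white, and anything in between is neutral gray, which is a handy sanity check.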
That completely answers my question! I recall at some point reading about video frames being 16 to 235, but I didn’t put 2 and 2 together. Thanks so much!
Acha! 16/256 = 0.0625. So, if you have a value of 16, then 1.1643 * (0.0625 - 0.0625) = 0. Conversely, 235/256 = 0.91796, and 1.1643 * (0.91796 - 0.0625) ~= 1.