How to display true 10-bit color

Hi all,

I have a Quadro FX1800 video card and am trying to output 10-bit color over DisplayPort.
I also have a 10-bit 1080p video capture device to measure the output.

My test pattern is a 0-to-1023 grayscale ramp. When I wrote code to test it, the range
had a problem: the levels 1020, 1021, 1022, and 1023 are all mapped to 1020
on my video capture. I tested the same setup on an FX4800 and got the same
result.
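For reference, the ramp I use can be generated and packed like this (a Python sketch; it assumes the pixels are packed in the GL_UNSIGNED_INT_2_10_10_10_REV layout, i.e. 2 alpha bits on top, then blue, green, and red in the low 30 bits):

```python
def pack_2_10_10_10_rev(r, g, b, a=3):
    """Pack 10-bit R, G, B and a 2-bit alpha into one 32-bit word
    using the GL_UNSIGNED_INT_2_10_10_10_REV layout."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

def grayscale_ramp():
    """The test pattern: levels 0..1023 with R = G = B = level."""
    return [pack_2_10_10_10_rev(v, v, v) for v in range(1024)]

ramp = grayscale_ramp()
```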

In my case, I use glReadPixels to read the framebuffer, and its range is correct.
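The readback can be verified by unpacking each 32-bit word returned by glReadPixels (a sketch, again assuming the GL_UNSIGNED_INT_2_10_10_10_REV layout):

```python
def unpack_2_10_10_10_rev(word):
    """Split one GL_UNSIGNED_INT_2_10_10_10_REV word into (r, g, b, a)."""
    r = word & 0x3FF
    g = (word >> 10) & 0x3FF
    b = (word >> 20) & 0x3FF
    a = (word >> 30) & 0x3
    return r, g, b, a
```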

Please help me solve this. Thanks!

BRs

Did you create an OpenGL context with a pixel format with 10-bits of RGB color?

Are you outputting SDI? Some values are illegal in SDI, in which case you are seeing the correct result.

In fact, ‘black’ and ‘white’ values have special definitions in a video signal, and your greyscale will probably be showing greater than and less than full range, so those missing values will not affect your ability to represent full black and full white.

Bruce

Yes, I created an OpenGL context with 10 bits of RGB color.
Most levels look perfect; only 1021, 1022, and 1023 are all mapped to 1020, on both the FX1800 and FX4800.

I have also tried this experiment on an ATI FirePro V5700. It covered the full range, but the gradient was incorrect.
For instance:
253 -> 253
254 -> 254
255 -> 255
256 -> 257
257 -> 258 …

This behavior is identical on the video capture system and when using glReadPixels.
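Both failure modes can be summarized programmatically. Here is a sketch (with the two observed mappings modeled from the numbers above) that lists every level where the captured output differs from the input:

```python
def mapping_errors(captured):
    """Given captured[i] = measured level for input level i, return the
    (input, output) pairs where the device did not pass i through."""
    return [(i, out) for i, out in enumerate(captured) if out != i]

# Model of the NVIDIA behavior: everything above 1020 clamps to 1020.
nvidia = [min(v, 1020) for v in range(1024)]
# Model of the ATI behavior: levels above 255 come out one too high.
ati = [v if v <= 255 else min(v + 1, 1023) for v in range(1024)]
```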

BRs

[QUOTE=Bruce Wheaton;1242726]Are you outputting SDI? Some values are illegal in SDI, in which case you are seeing the correct result.

In fact, ‘black’ and ‘white’ values have special definitions in a video signal, and your greyscale will probably be showing greater than and less than full range, so those missing values will not affect your ability to represent full black and full white.

Bruce[/QUOTE]

My output port is DisplayPort, and my video capture input is also DisplayPort. I checked the
black level, which is 0. If the signal were not full range and instead followed the ITU-R BT.709 specification, it would run from 64 (black) to 940 (white).
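For comparison, a narrow-range ("video level") mapping onto the BT.709 levels would place the codes like this (a sketch; a straight linear rescale from full range into 64..940):

```python
def full_to_narrow_range(v):
    """Map a full-range 10-bit code (0..1023) onto BT.709 narrow-range
    video levels, where 64 is black and 940 is white."""
    return 64 + round(v * (940 - 64) / 1023)
```

Since my captured black is 0 rather than 64, the link is clearly running full range.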

BRs

So ReadPixels is OK, but the digital values you’re getting out are not?

Hmmm. How is gamma correction handled on a digital output… Might check and verify that you’ve got it nailed to 1. (shot-in-the-dark)

Also, tidbit from the NVidia README on 10-bit for Quadros:

When displaying a depth 30 image, the color data may be dithered to lower bit depths, depending on the capabilities of the display device and how it is connected to the GPU. Some devices connected via analog VGA or DisplayPort can display the full 10 bit range of colors. Devices connected via DVI or HDMI, as well as laptop internal panels connected via LVDS, will be dithered to 8 or 6 bits per pixel. …

So, sounds like some expert knowledge required here. 30-bit depends on both the cable(s)/hub(s) connecting it to the display and the display. Google the NVidia boards.
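The dithering that README mentions can be approximated like this (an illustrative sketch of 1-D error diffusion, not NVidia's actual algorithm), showing how a panel can keep the average level while dropping 2 bits per pixel:

```python
def dither_10_to_8(row):
    """Reduce 10-bit values to 8 bits with simple 1-D error diffusion:
    each pixel's rounding error is carried into the next pixel, so the
    average output level stays close to the 10-bit input."""
    out, err = [], 0.0
    for v in row:
        x = v + err
        q = max(0, min(255, round(x / 4)))  # requantize 10-bit -> 8-bit
        err = x - q * 4                     # error carried forward
        out.append(q)
    return out
```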

[QUOTE=Dark Photon;1242739]So ReadPixels is OK, but the digital values you’re getting out are not?

Hmmm. How is gamma correction handled on a digital output… Might check and verify that you’ve got it nailed to 1. (shot-in-the-dark)

Also, tidbit from the NVidia README on 10-bit for Quadros:

So, sounds like some expert knowledge required here. 30-bit depends on both the cable(s)/hub(s) connecting it to the display and the display. Google the NVidia boards.[/QUOTE]

Thanks for your reply.
The NVIDIA README comment describes flat panels: most panels are only
6-bit or 8-bit capable, and the front-end digital IC of the display (monitor)
uses a dithering algorithm to shrink 10 bits down to 6 or 8 bits, or 8 bits
down to 6 bits on low-cost products.

In my case, I require a true 10-bit output. Here is what I have done so far to test it:

  1. Used a DisplayPort pattern generator to verify that the video capture system is okay.
  2. Checked the captured pattern; the pattern is 0 to 1023 in steps of 1, and I verified it was not affected by gamma correction.
  3. Tried different 10-bit products (FX1800, FX4800, and FirePro V5700); their problems are described above.

I can transmit a full-range digital signal from the pattern generator over a normal DisplayPort cable, so a special cable or hub should not be needed.
I also checked the cable maker's product line; it only offers interface adapters.

Thanks for your suggestions

BRs

You might need to put the monitor/OS into 30bit mode too. How far away are we from 10bpp being common? 6/8bpp does look pretty bad when you have a gradient of a few shades going across more than that many shades of pixels. 10bpp might not help much but it would at least quadruple the shades.

EDITED: Just for fun. I wonder (aloud) what an LCD with a programmable (per pixel) backlight would look like???

[QUOTE=JenLee;1242736]My output port is DisplayPort, and my video capture input is also DisplayPort. I checked the
black level, which is 0. If the signal were not full range and instead followed the ITU-R BT.709 specification, it would run from 64 (black) to 940 (white).

BRs[/QUOTE]

The black and white levels are a separate issue from prohibited values in an SDI signal. I would hazard a guess that there’s a bug in the Nvidia driver, and it’s disallowing invalid SDI values at the top end, even though your output isn’t SDI. Possibly it’s code that was created for SDI, repurposed to deal with 10-bit DisplayPort.
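For reference, 10-bit SMPTE serial interfaces reserve the code words 0-3 and 1020-1023 for timing reference (SAV/EAV) and ancillary signalling, so an SDI-legality clamp would look roughly like this sketch (hypothetical; the driver's actual logic is unknown):

```python
# In SMPTE 10-bit serial video, codes 0x000-0x003 and 0x3FC-0x3FF are
# reserved, so legal picture data is restricted to 4..1019.
def clamp_sdi_legal(v):
    """Clamp a 10-bit code to the SDI-legal picture range."""
    return max(4, min(1019, v))
```

Note that the behavior observed above (top clamped at 1020, black still reaching 0) matches this only approximately and only at the top end, which would be consistent with SDI-derived code having been modified for DisplayPort.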