10 bit rendering



timman
06-02-2016, 09:09 AM
What is the best way to render 10 bit video using OpenGL?

I want to read raw 10 bit video (P010 fourcc, for example) from a file and then produce true 10 bit output via OpenGL on a 10 bit display device.

What graphics card can I use for this task?

I tried to use the sample code from this document "www.nvidia.ru/docs/IO/40049/TB-04701-001_v02_new.pdf" with my GTX 980 Ti card, but the rendering fails with the "No 10bpc WGL_ARB_pixel_formats found!" message (10 bit mode was enabled in NVIDIA Control Center).

I tried the samples from here "www.nvidia.com/object/quadro-product-literature.html" (30 bit color sample code), but the same problem occurred (8 bit works fine).

As I understand it, the approach described in these documents works for Quadro cards but does not fit other cards. Is that true? Is there any way to produce 10 bit output using GTX graphics cards?

Is there any way to render true 10 bit image without using WGL_ARB_pixel_format extension?

Alfonse Reinheart
06-02-2016, 09:48 AM
No 10bpc WGL_ARB_pixel_formats found!

What emitted that error? It sounds like something a tool such as FreeGLUT or GLFW would emit. So, what was it?

timman
06-06-2016, 07:53 AM
Sorry for the late reply.

The problem is that the wglChoosePixelFormat call returns zero as the number of compatible formats. The code is


// Find the 10bpc ARB pixelformat
wglGetExtensionsString = (PFNWGLGETEXTENSIONSSTRINGARBPROC) wglGetProcAddress("wglGetExtensionsStringARB");
if (wglGetExtensionsString == NULL)
{
    printf("ERROR: Unable to get wglGetExtensionsStringARB function pointer!\n");
    goto Cleanup;
}

const char *szWglExtensions = wglGetExtensionsString(dummyDC);
if (strstr(szWglExtensions, " WGL_ARB_pixel_format ") == NULL)
{
    printf("ERROR: WGL_ARB_pixel_format not supported!\n");
    goto Cleanup;
}

wglGetPixelFormatAttribiv = (PFNWGLGETPIXELFORMATATTRIBIVARBPROC) wglGetProcAddress("wglGetPixelFormatAttribivARB");
wglGetPixelFormatAttribfv = (PFNWGLGETPIXELFORMATATTRIBFVARBPROC) wglGetProcAddress("wglGetPixelFormatAttribfvARB");
wglChoosePixelFormat = (PFNWGLCHOOSEPIXELFORMATARBPROC) wglGetProcAddress("wglChoosePixelFormatARB");

if ((wglGetPixelFormatAttribfv == NULL) || (wglGetPixelFormatAttribiv == NULL) || (wglChoosePixelFormat == NULL))
{
    goto Cleanup;
}

int attribsDesired[] = {
    WGL_DRAW_TO_WINDOW_ARB, 1,
    WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
    WGL_RED_BITS_ARB, 10,
    WGL_GREEN_BITS_ARB, 10,
    WGL_BLUE_BITS_ARB, 10,
    WGL_ALPHA_BITS_ARB, 2,
    WGL_DOUBLE_BUFFER_ARB, 1,
    0, 0
};

UINT nMatchingFormats;
if (!wglChoosePixelFormat(dummyDC, attribsDesired, NULL, 1, &idx30bit, &nMatchingFormats))
{
    printf("ERROR: wglChoosePixelFormat failed!\n");
    goto Cleanup;
}

if (nMatchingFormats == 0)
{
    printf("ERROR: No 10bpc WGL_ARB_pixel_formats found!\n");
    goto Cleanup;
}


This is the method recommended by NVIDIA and AMD (the AMD code is a bit different, but essentially the same). The documents I referred to in my post are seven years old.
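
For completeness, here is a small diagnostic sketch (my own addition, not part of the vendor samples; it reuses dummyDC and the wglGetPixelFormatAttribiv pointer obtained above) that lists the per-channel bit depths of every pixel format the driver exposes, so you can see whether any 10 bpc formats are reported at all:


// Diagnostic only: enumerate all pixel formats on dummyDC and print their
// color depths. Reuses the wglGetPixelFormatAttribiv pointer from above.
int countAttrib = WGL_NUMBER_PIXEL_FORMATS_ARB;
int numFormats = 0;
if (wglGetPixelFormatAttribiv(dummyDC, 1, 0, 1, &countAttrib, &numFormats))
{
    int attribs[] = {
        WGL_RED_BITS_ARB, WGL_GREEN_BITS_ARB, WGL_BLUE_BITS_ARB,
        WGL_ALPHA_BITS_ARB, WGL_ACCELERATION_ARB
    };
    int values[5];

    for (int i = 1; i <= numFormats; ++i)
    {
        if (wglGetPixelFormatAttribiv(dummyDC, i, 0, 5, attribs, values))
        {
            printf("format %3d: R%d G%d B%d A%d acceleration=0x%x\n",
                   i, values[0], values[1], values[2], values[3], values[4]);
        }
    }
}


If this prints only 8 bpc formats, that matches the failure above.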

Is there any modern way to produce 10 bit output using OpenGL?

Is there any way to display a 10 bit YUV image using OpenGL?

GClements
06-06-2016, 08:10 AM
Is there any modern way to produce 10 bit output using OpenGL?
You need hardware which supports 10-bit output.


Is there any way to display a 10 bit YUV image using OpenGL?
The bit depth of the source data has no bearing upon the bit depth of the output.

If the hardware only supports 8-bit output, you can either convert the 10-bit YUV directly to 8-bit RGB, or first convert the 10-bit YUV to 10-bit RGB (rendering to a suitable texture attached to a framebuffer object) and then convert that 10-bit RGB to 8-bit RGB as a post-process (optionally using dithering or error diffusion).
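
For illustration, here is a minimal sketch of the second approach (my example, not from either vendor document; it assumes a current GL 3.3 context with an extension loader, width/height set to the frame size, and the texture names, uniform names and BT.709 constants are illustrative): upload the P010 planes as 16-bit normalized textures (P010 stores each 10-bit sample in the high bits of a 16-bit word), convert to RGB in a fragment shader, and render into a GL_RGB10_A2 texture attached to a framebuffer object:


/* Render target: a 10-bit-per-channel RGB texture attached to an FBO,
   which keeps full precision even if the window itself is only 8 bpc. */
GLuint colorTex, fbo;
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
             GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, NULL);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

/* Fragment shader doing P010 -> RGB conversion (BT.709, limited range).
   texY is the full-resolution luma plane uploaded as GL_R16, texUV the
   half-resolution interleaved chroma plane uploaded as GL_RG16. */
const char *fragSrc =
    "#version 330 core\n"
    "uniform sampler2D texY;\n"
    "uniform sampler2D texUV;\n"
    "in vec2 uv;\n"
    "out vec4 color;\n"
    "void main() {\n"
    "    // P010 keeps its 10 significant bits in the high bits of each\n"
    "    // 16-bit word, so normalized sampling already gives [0,1] values.\n"
    "    float y = texture(texY, uv).r;\n"
    "    vec2  c = texture(texUV, uv).rg - 0.5;\n"
    "    y = (y - 16.0/255.0) * (255.0/219.0); // approx. limited->full range\n"
    "    vec3 rgb = vec3(y + 1.5748 * c.y,\n"
    "                    y - 0.1873 * c.x - 0.4681 * c.y,\n"
    "                    y + 1.8556 * c.x);\n"
    "    color = vec4(clamp(rgb, 0.0, 1.0), 1.0);\n"
    "}\n";


A later pass can then read colorTex and either present it on a 10 bpc drawable, or quantize it down to 8 bits with dithering if only an 8 bpc window is available.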

timman
06-06-2016, 08:52 AM
Thanks for the fast reply!

I have the necessary hardware, of course. I can see the difference between 8 bit and 10 bit on my 10 bit display, using Mad Video Renderer for example. Now I'm trying to write my own code that renders a 10 bit picture using OpenGL. I googled for a few days, and the only solution I found was the WGL_ARB_pixel_format extension (the NVIDIA and AMD recommendation documents from 2008-09). I compiled the samples, tried to run them, and ran into the problem I described above (wglChoosePixelFormat returns zero as the number of compatible formats).

I'm new to OpenGL, but I suppose there should be another way (one that does not use the WGL_ARB_pixel_format extension).

My task is to open 10 bit raw video (or an image -- it doesn't matter) and display it on a 10 bit display as true 10 bit output (with minimal differences from the original, if possible).

So, do you know of any way to solve my problem? Or maybe you can advise me on manuals I should read?

I've already written code for the 8-bit case (that was a simple task), so I'm interested in the specifics of 10 bit initialization.

Dark Photon
06-07-2016, 07:00 PM
I'm new to OpenGL ... My task is to open 10 bit raw video (or an image ...) and display it on a 10 bit display as true 10 bit output ...

So, do you know of any way to solve my problem? Or maybe you can advise me on manuals I should read?

Apparently some nVidia Quadro GPUs support this over DisplayPort or dual-link DVI. For details and code snippets for GL context setup, see:

* 30-Bit Color Technology for NVIDIA Quadro (https://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf)

Supposedly some AMD consumer and professional line GPUs do as well:

* AMD’s 10-bit Video Output Technology (https://www.amd.com/Documents/firepro-10-Bit-whitepaper.pdf)

A little more on nVidia's 30-bit display support from the nVidia Linux driver README file:



________________________________________________________________________________

Chapter 31. Configuring Depth 30 Displays
________________________________________________________________________________

This driver release supports X screens with screen depths of 30 bits per pixel
(10 bits per color component). This provides about 1 billion possible colors,
allowing for higher color precision and smoother gradients.

When displaying a depth 30 image, the color data may be dithered to lower bit
depths, depending on the capabilities of the display device and how it is
connected to the GPU. Some devices connected via analog VGA or DisplayPort can
display the full 10 bit range of colors. Devices connected via DVI or HDMI, as
well as laptop internal panels connected via LVDS, will be dithered to 8 or 6
bits per pixel.

To work reliably, depth 30 requires X.Org 7.3 or higher and pixman 0.11.6 or
higher.

In addition to the above software requirements, many X applications and
toolkits do not understand depth 30 visuals as of this writing. Some programs
may work correctly, some may work but display incorrect colors, and some may
simply fail to run. In particular, many OpenGL applications request 8 bits of
alpha when searching for FBConfigs. Since depth 30 visuals have only 2 bits of
alpha, no suitable FBConfigs will be found and such applications will fail to
start.
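
To make the README's last point concrete, here is a minimal sketch (my own example, not taken from the README) of a glXChooseFBConfig attribute list suited to a depth 30 X screen; the key detail is requesting only 2 alpha bits rather than the usual 8:


/* Sketch: selecting a 10-bit-per-component FBConfig on X11/GLX.
   Requesting GLX_ALPHA_SIZE 8 here would match nothing on a depth 30
   screen, because its visuals carry only 2 alpha bits. */
#include <GL/glx.h>
#include <stdio.h>

static const int attribs[] = {
    GLX_RENDER_TYPE,   GLX_RGBA_BIT,
    GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
    GLX_DOUBLEBUFFER,  True,
    GLX_RED_SIZE,      10,
    GLX_GREEN_SIZE,    10,
    GLX_BLUE_SIZE,     10,
    GLX_ALPHA_SIZE,    2,   /* not 8: depth 30 visuals only have 2 */
    None
};

GLXFBConfig pick_depth30_config(Display *dpy)
{
    int count = 0;
    GLXFBConfig *configs =
        glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);
    if (configs == NULL || count == 0)
    {
        if (configs != NULL)
            XFree(configs);
        fprintf(stderr, "No 10 bpc FBConfigs found\n");
        return NULL;
    }
    GLXFBConfig cfg = configs[0]; /* take the first match for simplicity */
    XFree(configs);
    return cfg;
}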