10 bit rendering

What is the best way to render 10-bit video using OpenGL?

I want to read raw 10-bit video (P010 fourcc, for example) from a file and then produce true 10-bit output on a 10-bit display device using OpenGL.

What graphics card can I use for this task?

I tried to use the sample code from this document “www.nvidia.ru/docs/IO/40049/TB-04701-001_v02_new.pdf” with my GTX 980 Ti card, but the rendering fails with a “No 10bpc WGL_ARB_pixel_formats found!” message (10-bit mode was enabled in the NVIDIA Control Panel).

I also tried the samples from “NVIDIA Professional Visualization Product Literature” (the 30-bit color sample code), but the same problem occurred (8-bit works fine).

As I understand it, the approach described in these documents works for Quadro cards but does not fit the others. Is that true? Is there any way to produce 10-bit output using GTX graphics cards?

Is there any way to render a true 10-bit image without using the WGL_ARB_pixel_format extension?

No 10bpc WGL_ARB_pixel_formats found!

What emitted that error? It sounds like something a toolkit such as FreeGLUT or GLFW would emit. So, what was it?

Sorry for the late reply.

The problem is that the wglChoosePixelFormat method returns zero as the number of compatible formats. The code is:


 // Find the 10bpc ARB pixelformat
    wglGetExtensionsString = (PFNWGLGETEXTENSIONSSTRINGARBPROC) wglGetProcAddress("wglGetExtensionsStringARB");
    if (wglGetExtensionsString == NULL)
    {
        printf("ERROR: Unable to get wglGetExtensionsStringARB function pointer!
");
        goto Cleanup;
    }

    const char *szWglExtensions = wglGetExtensionsString(dummyDC);
    if (strstr(szWglExtensions, " WGL_ARB_pixel_format ") == NULL) 
    {
        printf("ERROR: WGL_ARB_pixel_format not supported!
");
        goto Cleanup;
    }

    wglGetPixelFormatAttribiv = (PFNWGLGETPIXELFORMATATTRIBIVARBPROC) wglGetProcAddress("wglGetPixelFormatAttribivARB");
    wglGetPixelFormatAttribfv = (PFNWGLGETPIXELFORMATATTRIBFVARBPROC) wglGetProcAddress("wglGetPixelFormatAttribfvARB");
    wglChoosePixelFormat = (PFNWGLCHOOSEPIXELFORMATARBPROC) wglGetProcAddress("wglChoosePixelFormatARB");

    if ((wglGetPixelFormatAttribfv == NULL) || (wglGetPixelFormatAttribiv == NULL) || (wglChoosePixelFormat == NULL))
    {
        goto Cleanup;
    }

    int attribsDesired[] = {
        WGL_DRAW_TO_WINDOW_ARB, 1,
        WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
        WGL_RED_BITS_ARB, 10,
        WGL_GREEN_BITS_ARB, 10,
        WGL_BLUE_BITS_ARB, 10,
        WGL_ALPHA_BITS_ARB, 2,
        WGL_DOUBLE_BUFFER_ARB, 1,
        0,0
    };
    
    UINT nMatchingFormats;
    if (!wglChoosePixelFormat(dummyDC, attribsDesired, NULL, 1, &idx30bit, &nMatchingFormats))
    {
        printf("ERROR: wglChoosePixelFormat failed!
");
        goto Cleanup;
    }

    if (nMatchingFormats == 0)
    {
        printf("ERROR: No 10bpc WGL_ARB_pixel_formats found!
");
        goto Cleanup;
    }

It’s the method recommended by NVIDIA and AMD (the AMD code is a bit different, but it’s essentially the same). The documents I referred to in my post are 7 years old.
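
For what it’s worth, here is a small debugging sketch (reusing the same dummyDC and the wglGetPixelFormatAttribiv pointer obtained above) that should list the per-channel bit depths the driver actually exposes, so I can see whether any 10 bpc format is there at all:

    // Debugging sketch: list the per-channel bit depths of every pixel format
    // the driver exposes, to see whether any 10 bpc format exists at all.
    // Reuses dummyDC and the wglGetPixelFormatAttribiv pointer from above.
    int countAttrib = WGL_NUMBER_PIXEL_FORMATS_ARB;
    int numFormats  = 0;
    if (wglGetPixelFormatAttribiv(dummyDC, 1, 0, 1, &countAttrib, &numFormats))
    {
        int query[3] = { WGL_RED_BITS_ARB, WGL_GREEN_BITS_ARB, WGL_BLUE_BITS_ARB };
        for (int i = 1; i <= numFormats; i++)   // pixel format indices are 1-based
        {
            int bits[3] = { 0, 0, 0 };
            if (wglGetPixelFormatAttribiv(dummyDC, i, 0, 3, query, bits))
            {
                printf("format %d: R%d G%d B%d\n", i, bits[0], bits[1], bits[2]);
            }
        }
    }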

Is there a modern way to produce 10-bit output using OpenGL?

Is there any way to display a 10-bit YUV image using OpenGL?

You need hardware which supports 10-bit output.

The bit depth of the source data has no bearing upon the bit depth of the output.

If the hardware only supports 8-bit output, you can either convert 10-bit YUV to 8-bit RGB, or convert 10-bit YUV to 10-bit RGB (rendering to a suitable texture attached to a framebuffer object) then convert 10-bit RGB to 8-bit RGB as a post-process (optionally using dithering or error-diffusion).
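
For the second option, a minimal sketch of the render-target setup (assuming an FBO-capable context; width and height stand for the frame size) would look something like this:

    // Sketch: 10-bit-per-channel render target (width/height = frame size).
    GLuint colorTex, fbo;

    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, colorTex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    {
        printf("ERROR: 10-bit FBO is not complete!\n");
    }

    // Render the YUV -> RGB conversion into this FBO, then either draw it to a
    // 10 bpc default framebuffer, or run a final pass that quantizes to 8 bits
    // per channel (with dithering or error diffusion) on an 8 bpc framebuffer.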

Thanks for the fast reply!

I have the necessary hardware, of course. I can see the 8-bit vs. 10-bit difference on my 10-bit display using Mad Video Renderer, for example. Now I’m trying to write my own code that renders a 10-bit picture using OpenGL. I googled for a few days, and the only solution I found was the WGL_ARB_pixel_format extension (the NVIDIA and AMD recommendation documents from 2008-09). I compiled the samples, tried to run them, and ran into the problem I described above (wglChoosePixelFormat returns zero as the number of compatible formats).

I’m new to OpenGL, but I suppose there should be another way (one that doesn’t use the WGL_ARB_pixel_format extension).

My task is to open a 10-bit raw video (or image, it doesn’t matter) and display it on a 10-bit display as true 10-bit (with minimal differences from the original, if possible).

So, do you know any way to solve my problem? Or maybe you can advise me which manuals I should read?

I’ve written code for the 8-bit case (that was a simple task), so I’m interested in the specifics of 10-bit initialization.
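
For context, this is roughly how I plan to get the P010 planes into textures and convert them; it’s only a sketch with my own placeholder names (width, height, yPlane, uvPlane), and it assumes full-range BT.709, tightly packed planes, and a GL 3.3 context:

    // P010: plane 0 is 16-bit Y with the 10 significant bits in the high bits,
    // plane 1 is interleaved 16-bit CbCr at half resolution.
    GLuint texY, texUV;

    glGenTextures(1, &texY);
    glBindTexture(GL_TEXTURE_2D, texY);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);   // 16-bit samples, rows may not be 4-byte aligned
    glTexImage2D(GL_TEXTURE_2D, 0, GL_R16, width, height, 0,
                 GL_RED, GL_UNSIGNED_SHORT, yPlane);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenTextures(1, &texUV);
    glBindTexture(GL_TEXTURE_2D, texUV);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RG16, width / 2, height / 2, 0,
                 GL_RG, GL_UNSIGNED_SHORT, uvPlane);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Fragment shader for the YUV -> RGB pass (full-range BT.709 shown;
    // limited-range sources need the usual 64..940 / 64..960 expansion first).
    // The 65535/65472 scale compensates for P010 keeping its 10 significant
    // bits in the top of each 16-bit word.
    const char *fragSrc =
        "#version 330 core\n"
        "uniform sampler2D texY;\n"
        "uniform sampler2D texUV;\n"
        "in vec2 uv;\n"
        "out vec4 color;\n"
        "void main() {\n"
        "    float scale = 65535.0 / 65472.0;\n"
        "    float y    = texture(texY,  uv).r  * scale;\n"
        "    vec2  cbcr = texture(texUV, uv).rg * scale - 0.5;\n"
        "    vec3  rgb;\n"
        "    rgb.r = y + 1.5748 * cbcr.y;\n"
        "    rgb.g = y - 0.1873 * cbcr.x - 0.4681 * cbcr.y;\n"
        "    rgb.b = y + 1.8556 * cbcr.x;\n"
        "    color = vec4(clamp(rgb, 0.0, 1.0), 1.0);\n"
        "}\n";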

[QUOTE=timman;1282856]I’m new to OpenGL … My task is to open a 10-bit raw video (or image …) and display it on a 10-bit display as true 10-bit …

So, do you know any way to solve my problem? Or maybe you can advise me which manuals I should read? [/QUOTE]

Apparently some nVidia Quadro GPUs support this over DisplayPort or dual-link DVI. For details and code snippets for GL context setup, see:

Supposedly some AMD consumer and professional line GPUs do as well:

A little more on nVidia’s 30-bit display support from the nVidia Linux driver README file:
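
In case it helps, the GLX-side equivalent of the WGL attribute list above is a request for a 10 bpc GLXFBConfig. A rough sketch (dpy is your X Display; it assumes the driver actually exposes 30-bit visuals):

    // GLX sketch: request a 10 bpc FBConfig (dpy is the X Display).
    static const int fbAttribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RED_SIZE,      10,
        GLX_GREEN_SIZE,    10,
        GLX_BLUE_SIZE,     10,
        GLX_ALPHA_SIZE,    2,
        GLX_DOUBLEBUFFER,  True,
        None
    };

    int count = 0;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, DefaultScreen(dpy),
                                             fbAttribs, &count);
    if (configs == NULL || count == 0)
    {
        printf("No 10 bpc GLXFBConfigs found\n");
    }
    // Otherwise, create the window from the visual of configs[0] and create
    // the context with glXCreateNewContext (or glXCreateContextAttribsARB).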