How to make a 10-bit grayscale display?

hi all,
The input mammography images for our software platform are 12-bit, but we develop our MammoCAD software with Microsoft MFC and the Windows API on the Visual Studio 6.0 platform, so we only support 8-bit display.
Now I want to know how to make the software produce 10-bit display output.
I wrote this code:

PIXELFORMATDESCRIPTOR pixelDesc;
pixelDesc.nSize = sizeof(PIXELFORMATDESCRIPTOR);
pixelDesc.nVersion = 1;
pixelDesc.dwFlags = PFD_DRAW_TO_WINDOW |
	PFD_SUPPORT_OPENGL |
	PFD_DOUBLEBUFFER;		
pixelDesc.iPixelType = PFD_TYPE_RGBA;
pixelDesc.cColorBits = 32;	// 10R + 10G + 10B + 2A packed into 32 bits
pixelDesc.cRedBits = 10;
pixelDesc.cRedShift = 22;	// bits 22-31
pixelDesc.cGreenBits = 10;
pixelDesc.cGreenShift = 12;	// bits 12-21
pixelDesc.cBlueBits = 10;
pixelDesc.cBlueShift = 2;	// bits 2-11
pixelDesc.cAlphaBits = 2;
pixelDesc.cAlphaShift = 0;	// bits 0-1
pixelDesc.cAccumBits = 0;
pixelDesc.cAccumRedBits = 0;
pixelDesc.cAccumGreenBits = 0;
pixelDesc.cAccumBlueBits = 0;
pixelDesc.cAccumAlphaBits = 0;
pixelDesc.cDepthBits = 24;
pixelDesc.cStencilBits = 0;
pixelDesc.cAuxBuffers = 0;
pixelDesc.iLayerType = PFD_MAIN_PLANE;
pixelDesc.bReserved = 0;
pixelDesc.dwLayerMask = 0;
pixelDesc.dwVisibleMask = 0;
pixelDesc.dwDamageMask = 0;
m_GLPixelIndex = ChoosePixelFormat( hDC, &pixelDesc);
if (m_GLPixelIndex==0) // ChoosePixelFormat failed, so fall back to a default index.
{
	m_GLPixelIndex = 1;
	if (DescribePixelFormat(hDC, m_GLPixelIndex,
		sizeof(PIXELFORMATDESCRIPTOR), &pixelDesc)==0)
	{
		return FALSE;
	}
}
if (SetPixelFormat( hDC, m_GLPixelIndex, &pixelDesc)==FALSE)
{
	return FALSE;
}
return TRUE;

And then I call:
m_hGLContext = wglCreateContext(hDC);
What should I do next? How can I test whether I have actually achieved 10-bit output?
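
For reference, here is a rough sketch of what I think comes next, using the hDC and m_hGLContext above; I am not sure whether this is enough:

// The context must be made current before any gl* call has an effect.
if (wglMakeCurrent(hDC, m_hGLContext) == FALSE)
	return FALSE;

// Once the context is current, the framebuffer depth can be queried.
GLint redBits = 0;
glGetIntegerv(GL_RED_BITS, &redBits);   // hoping this reports 10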

Well, video cards able to do 10 bits per color are already very rare.
And screens are even rarer; most LCDs are not even fully 8 bits/color…

How to test? getpixels on a gradient, and check that the progression of values indeed has 10-bit precision.

hello ZbuffeR,
Thank you for your reply.
I have a special ATI card and a 5M monitor.

How can I get pixels on a gradient? Does the code I wrote above already produce 10-bit output? Could you give more details? Thank you!

Sorry, I meant glReadPixels:
http://www.opengl.org/sdk/docs/man/xhtml/glReadPixels.xml

First draw a quad, with black on one side and dark gray on the other side. That way you can see how precise the interpolation is.
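
For illustration, a rough sketch of such a test in old-style immediate mode (the 0.05 gray value and the 512-pixel read width are arbitrary choices):

// Draw a horizontal ramp from black to dark gray across the viewport.
glClear(GL_COLOR_BUFFER_BIT);
glBegin(GL_QUADS);
	glColor3f(0.0f, 0.0f, 0.0f);    glVertex2f(-1.0f, -1.0f);  // black edge
	glColor3f(0.0f, 0.0f, 0.0f);    glVertex2f(-1.0f,  1.0f);
	glColor3f(0.05f, 0.05f, 0.05f); glVertex2f( 1.0f,  1.0f);  // dark gray edge
	glColor3f(0.05f, 0.05f, 0.05f); glVertex2f( 1.0f, -1.0f);
glEnd();

// Read back one scanline (assuming the window is at least 512 pixels wide)
// and count how many distinct red values appear along the ramp.
unsigned short row[512 * 3];
glReadPixels(0, 0, 512, 1, GL_RGB, GL_UNSIGNED_SHORT, row);
int steps = 1;
for (int x = 1; x < 512; ++x)
	if (row[3 * x] != row[3 * (x - 1)])
		++steps;
// An 8-bit framebuffer gives only about a dozen steps for this ramp;
// a 10-bit framebuffer should give roughly four times as many.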

Accelerated 3D rendering is very different from 2D. So unless ATI has special extensions to work with a 10-bit framebuffer, it will probably not work.

Maybe someone else has already experimented with this?

Hi Sunrain,
I would suggest not using ChoosePixelFormat. Instead you can use wglChoosePixelFormatARB. This method is better supported since it is a direct interface into the OpenGL driver (instead of going through the OS like ChoosePixelFormat does). If you specify only the attributes you care most about, you are more likely to find a matching format.

Here is a code snippet:
HDC device_context; // set prior to this code
int attribs[64] = {
    WGL_SUPPORT_OPENGL_ARB, TRUE,
    WGL_DRAW_TO_WINDOW_ARB, TRUE,
    WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,
    WGL_RED_BITS_ARB, 10,
    WGL_GREEN_BITS_ARB, 10,
    WGL_BLUE_BITS_ARB, 10,
    WGL_DEPTH_BITS_ARB, 24,
    WGL_DOUBLE_BUFFER_ARB, TRUE,
    0, // zero terminates the list
};

static float fattribs[64] = {
    0.0f, // zero terminates the list
};

const int format_max = 256;
int formats[format_max];
unsigned int format_count;
wglChoosePixelFormatARB(
    device_context,
    attribs, fattribs,
    format_max, formats, &format_count);
// "format_count" suitable formats are in "formats".
if (format_count == 0) {
    // there were no suitable formats
} else {
    // For this example we just use the first match.
    // SetPixelFormat still wants a PIXELFORMATDESCRIPTOR, so fill one in
    // from the chosen format first.
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(device_context, formats[0], sizeof(pfd), &pfd);
    BOOL result = SetPixelFormat(device_context, formats[0], &pfd);
    if (result == FALSE) {
        // error can be retrieved from GetLastError()
    }
}

On the latest FireGL/FirePro cards you should have support for generic 10b rendering. First enable it under the workstation tab of the ATI Catalyst Control Center. Then you should be able to select a 10b format using the code above. Remember that you will only get 10b output if you have a native 10b display (or a packed-pixel gray-scale monitor).

To test that you are rendering to a 10b framebuffer, you can either query the color depth using glGetIntegerv(GL_RED_BITS, &nAttribs); etc., or query the pixel format directly using DescribePixelFormat. You can also test the actual result of your rendering using glReadPixels.
To test that your output is in full 10b, you can draw a shallow color-gradient quad, something like gray 0x100 on one side and 0x101 on the other. This would produce 2 shades on an 8b monitor and 4 on a 10b monitor.
Remember that your data should be in an appropriate format to get 10b output. A texture format of either FLOAT or UNSIGNED_INT_2_10_10_10_REV would work best.
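
For example, the two query paths could look something like this (device_context as in the snippet above; the variable names are just placeholders):

// Ask the live GL context how deep the framebuffer really is.
GLint red_bits = 0, green_bits = 0, blue_bits = 0;
glGetIntegerv(GL_RED_BITS,   &red_bits);
glGetIntegerv(GL_GREEN_BITS, &green_bits);
glGetIntegerv(GL_BLUE_BITS,  &blue_bits);
// All three should report 10 on a 10b framebuffer.

// Or ask GDI about the pixel format that was actually set on the window.
PIXELFORMATDESCRIPTOR pfd;
int format = GetPixelFormat(device_context);
DescribePixelFormat(device_context, format, sizeof(pfd), &pfd);
// pfd.cRedBits / cGreenBits / cBlueBits should likewise be 10.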

Thanks all!
The method offered by gnosis looks good. I will try it.

hi all,
How can I use the function “wglChoosePixelFormatARB()” mentioned above?
When I compile the program, I get an “undefined” error.
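
For reference, wglChoosePixelFormatARB is a WGL extension entry point: it is not exported by opengl32.lib and not declared in the standard headers, so the function pointer is normally fetched at run time with wglGetProcAddress once a basic OpenGL context is current. A minimal sketch, with the typedef matching the one in wglext.h:

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)(
	HDC hdc, const int *piAttribIList, const FLOAT *pfAttribFList,
	UINT nMaxFormats, int *piFormats, UINT *nNumFormats);

// A (temporary) context must already be current, otherwise the pointer comes back NULL.
PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
	(PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
if (wglChoosePixelFormatARB == NULL)
{
	// the driver does not expose WGL_ARB_pixel_format
}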