wglChoosePixelFormat for WGL_ARB_render_texture

I’ve been a bit perplexed by wglChoosePixelFormatARB and how it chooses pixel formats when WGL_BIND_TO_TEXTURE_RGBA_ARB is specified. I’ve used wglChoosePixelFormatARB successfully to set up offscreen rendering and render-to-texture, as well as to set up float buffers, so I’m confident all the prior code is correct. Here’s an example of my current confusion: this set of attribs gives me a valid pixel format:

int iAttributes[MAX_ATTRIBS*2] = {
    WGL_DRAW_TO_PBUFFER_ARB, true,
    WGL_BIND_TO_TEXTURE_RGBA_ARB, true,
    WGL_TEXTURE_FORMAT_ARB, WGL_TEXTURE_RGBA_ARB,
    WGL_TEXTURE_TARGET_ARB, WGL_TEXTURE_2D_ARB,
    0, 0 };
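For context, this is roughly how I’m calling it (a minimal sketch; `hDC` is assumed to be a valid device context obtained elsewhere):

```cpp
int pixelFormat = 0;
UINT numFormats = 0;

// wglChoosePixelFormatARB(hdc, intAttribs, floatAttribs, maxFormats,
//                         formatsOut, numFormatsOut)
if (!wglChoosePixelFormatARB(hDC, iAttributes, NULL, 1,
                             &pixelFormat, &numFormats) || numFormats == 0)
{
    // No pixel format matched the requested attributes.
}
```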

If I’m less specific (that is, I don’t specify the texture target) it fails… In that case I would have expected it to at least find the same formats as in the more specific case:

int iAttributes[MAX_ATTRIBS*2] = {
    WGL_DRAW_TO_PBUFFER_ARB, true,
    WGL_BIND_TO_TEXTURE_RGBA_ARB, true,
    WGL_TEXTURE_FORMAT_ARB, WGL_TEXTURE_RGBA_ARB,
    0, 0 };

I’d never actually want to specify the format without the target, but I do want to specify the color depth. If I add the color depth, it fails too:

int iAttributes[MAX_ATTRIBS*2] = {
    WGL_DRAW_TO_PBUFFER_ARB, true,
    WGL_BIND_TO_TEXTURE_RGBA_ARB, true,
    WGL_TEXTURE_FORMAT_ARB, WGL_TEXTURE_RGBA_ARB,
    WGL_TEXTURE_TARGET_ARB, WGL_TEXTURE_2D_ARB,
    WGL_RED_BITS_ARB, 8,
    WGL_GREEN_BITS_ARB, 8,
    WGL_BLUE_BITS_ARB, 8,
    WGL_ALPHA_BITS_ARB, 8,
    0, 0 };

My ultimate goal is to render to a float-image pbuffer… By trial and error I’ve figured out that I should not combine WGL_FLOAT_COMPONENTS_NV with WGL_BIND_TO_TEXTURE_RGBA_ARB, which seems strange since you must specify WGL_FLOAT_COMPONENTS_NV to get 16-bit-per-channel formats. And curiously enough, if I specify WGL_TEXTURE_FORMAT_ARB as WGL_TEXTURE_FLOAT_RGBA_NV, then I can specify a color depth to differentiate between 16- and 32-bit floats…
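For what it’s worth, I believe the WGL_NV_float_buffer spec explains the RGBA part: float formats don’t advertise WGL_BIND_TO_TEXTURE_RGBA_ARB at all, but instead a separate bind attribute, WGL_BIND_TO_TEXTURE_RECTANGLE_FLOAT_RGBA_NV, tied to the rectangle texture target. A sketch of a choose list along those lines (assuming the driver exposes WGL_NV_float_buffer):

```cpp
// Sketch: NV float pbuffer formats bind via the FLOAT_RGBA_NV
// attribute (rectangle target), not WGL_BIND_TO_TEXTURE_RGBA_ARB.
int iAttributes[MAX_ATTRIBS*2] = {
    WGL_DRAW_TO_PBUFFER_ARB, true,
    WGL_FLOAT_COMPONENTS_NV, true,
    WGL_BIND_TO_TEXTURE_RECTANGLE_FLOAT_RGBA_NV, true,
    WGL_RED_BITS_ARB, 16,     // 16 per channel selects fp16; 32 would select fp32
    WGL_GREEN_BITS_ARB, 16,
    WGL_BLUE_BITS_ARB, 16,
    WGL_ALPHA_BITS_ARB, 16,
    0, 0 };
```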

Is there any logic to this that I’m missing?

Ah! I’d been passing the same attribute list to both wglChoosePixelFormatARB and wglCreatePbufferARB, when they are in fact two different sets.