Enumerating formats and other such things in GL

In my previous OpenGL projects I’ve always made assumptions about the availability of various capabilities, having the program throw errors if they’re not available. Now I find myself having to write a GL pathway for an existing D3D code-base and I’m struggling to work out how I can fit in with that scheme by providing the equivalent of “caps” for GL. That is to say, I want to provide an implementation of the following:

/* Virtual */
void
Adaptors3dOpenGLWin32::EnumerateInternal()
{
	// Call base enumerator to fetch available devices.

	Adaptors3dWin32::EnumerateInternal();

	// Enumerate all pixel formats.

	EnumerateAllPixelFormats();

	// Now enumerate all front/back buffer combinations.

	EnumerateBackFrontBufferCombinations();

	// Enumerate all depth/stencil formats for each combination.

	EnumerateDepthStencilFormats();

	// Enumerate all multi-sample types.

	EnumerateMultiSampleTypes();

	// Enumerate all map formats (texture formats).

	EnumerateMapFormats();

	// Enumerate capabilities.

	EnumerateCapabilities();
}

I’ve asked this question in various forms on stackoverflow, but I never seem to get a coherent answer! I’m aware there are some “take it for granted this is supported” capabilities for various GL versions, but I’m not sure how to write a coherent set of functions that will give me the data I need.

Assuming I have an abstract interface, and abstract identifiers for things like R8G8B8A8, etc., how can I populate a data structure with capability details for GL?

If my question is dumb and I need a totally different approach, that would also be a valid answer :p.

Thanks.

I’m aware there are some “take it for granted this is supported” capabilities for various GL versions, but I’m not sure how to write a coherent set of functions that will give me the data I need.

In many cases, it simply doesn’t exist. You can use WGL_ARB_pixel_format to get most of the pixel format information. But the rest simply is not available.
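For reference, the attribute-query side of that extension looks roughly like this (a minimal sketch, assuming a valid HDC, a current dummy context so wglGetProcAddress works, and wglext.h for the tokens and the function-pointer typedef):

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>  // WGL_* tokens and the function-pointer typedef

// Sketch: walk every pixel format on a device context and read back its
// properties via WGL_ARB_pixel_format.
void EnumeratePixelFormats(HDC hdc)
{
    PFNWGLGETPIXELFORMATATTRIBIVARBPROC getPixelFormatAttribiv =
        reinterpret_cast<PFNWGLGETPIXELFORMATATTRIBIVARBPROC>(
            wglGetProcAddress("wglGetPixelFormatAttribivARB"));
    if (!getPixelFormatAttribiv)
        return;  // extension not present

    // How many pixel formats does this HDC expose?
    const int countQuery = WGL_NUMBER_PIXEL_FORMATS_ARB;
    int formatCount = 0;
    getPixelFormatAttribiv(hdc, 1, 0, 1, &countQuery, &formatCount);

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, WGL_SUPPORT_OPENGL_ARB, WGL_DOUBLE_BUFFER_ARB,
        WGL_ACCELERATION_ARB,   WGL_PIXEL_TYPE_ARB,
        WGL_COLOR_BITS_ARB,     WGL_ALPHA_BITS_ARB,
        WGL_DEPTH_BITS_ARB,     WGL_STENCIL_BITS_ARB
    };
    const UINT attribCount = sizeof(attribs) / sizeof(attribs[0]);
    int values[attribCount];

    for (int i = 1; i <= formatCount; ++i)  // format indices are 1-based
    {
        getPixelFormatAttribiv(hdc, i, 0, attribCount, attribs, values);
        // Record the values into your caps structure here; e.g. skip formats
        // where WGL_ACCELERATION_ARB != WGL_FULL_ACCELERATION_ARB.
    }
}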

OpenGL does not provide a way to ask whether a particular image format is supported. You can ask for any image format, and the OpenGL implementation is required to give you that format or something close to it. The OpenGL specifications from 3.0 onward have a table of required image formats, which the implementation must support exactly as specified. But that’s about it. There’s no API to query these things.

Thanks Alfonse. In my view this is a huge problem for OpenGL. “Something close enough” or “software fall-back” is anathema when you’re trying to present a consistent interface to the user. I’m a bit shocked that the ARB, with both GL 3 and GL 4, hasn’t decided to provide something like this.

I may ditch the GL pathway entirely and stick with D3D.

I actually just read this in the FAQ:

“Another way to check for hardware acceleration is to temporarily remove or rename the ICD, so it can’t be loaded. If performance drops, it means you were hardware accelerated before.”

Sure, I can recommend that to our customers :p.

In my view this is a huge problem for OpenGL. “Something close enough” or “software fall-back” is anathema when you’re trying to present a consistent interface to the user.

In practice, it’s really not. For most applications, you will not be exposing texture formats to the user directly. So which texture formats exactly are supported and which are not is irrelevant to the user.

In general, you code for particular hardware. And hardware generally is built around OpenGL/D3D versions. OpenGL version 2.1 is the equivalent of D3D9. OpenGL 3.0-3.3 is the equivalent of D3D10. OpenGL 4.0+ is the equivalent of D3D11. So there’s not much in terms of “capabilities” to query. There are a few extensions to get D3D10.1 (aka: all ATI D3D10 cards) functionality, but that’s about it.

So if you get an OpenGL implementation version 3.3, you know what features you have. If you get 4.1, you know what features you have. If you get 2.1, you know what features you have. For the most part.
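In code, that check amounts to something like this (a small sketch; GL_MAJOR_VERSION/GL_MINOR_VERSION are only recognised by 3.0+ contexts, so older ones fall back to parsing the version string, and those tokens come from glext.h or a loader on Windows):

#include <cstdio>
#include <GL/gl.h>
#include <GL/glext.h>  // GL_MAJOR_VERSION / GL_MINOR_VERSION tokens

// Sketch: derive major.minor of the current context. Assumes a context
// is current.
void GetGLVersion(int& major, int& minor)
{
    major = minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    if (glGetError() != GL_NO_ERROR)  // pre-3.0: enums not recognised
    {
        // Fall back to parsing the "major.minor" prefix of GL_VERSION.
        const char* version =
            reinterpret_cast<const char*>(glGetString(GL_VERSION));
        std::sscanf(version, "%d.%d", &major, &minor);
    }
}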

Also, I never said anything about “software fallback.” There’s a big difference between software rendering and getting a GL_RGB8 texture when you asked for a GL_RGB4 format. Software rendering kills your performance, while the most the image format thing will do is increase your memory usage.

Or you could just stick to the list of required GL 3.0+ formats and be done with it. It’s not exactly a short list.
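To give a flavour of it, a caps table built that way is just a static list along these lines (a partial sample only; the spec’s required-format tables are the authority, and the sized-format tokens need glext.h or a loader on Windows):

#include <GL/gl.h>
#include <GL/glext.h>

// Partial sample of internal formats the GL 3.0+ specs require to be
// supported exactly as requested; see the spec's required-formats tables
// for the complete list.
static const GLenum kRequiredColorFormats[] = {
    GL_R8,    GL_RG8,    GL_RGBA8,   GL_RGB10_A2,
    GL_R16F,  GL_RG16F,  GL_RGBA16F,
    GL_R32F,  GL_RG32F,  GL_RGBA32F,
    GL_R8I,   GL_RG8I,   GL_RGBA8I,  GL_RGBA8UI,
};

static const GLenum kRequiredDepthStencilFormats[] = {
    GL_DEPTH_COMPONENT16,
    GL_DEPTH_COMPONENT24,
    GL_DEPTH_COMPONENT32F,
    GL_DEPTH24_STENCIL8,
    GL_DEPTH32F_STENCIL8,
};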

I’m not sure what kind of interface you’re trying to provide the user where you’re exposing details like what image formats are available, but the lack of such query mechanisms in OpenGL hasn’t stopped anyone else from writing code that supports both GL and D3D.

I actually just read this in the FAQ:

The FAQ is kinda old. The best way to check for hardware acceleration is to look at the vendor string. If it has “Microsoft” in it, then it isn’t hardware accelerated.
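In code terms, something like this is enough (a sketch; Microsoft’s software implementation reports “Microsoft Corporation” as the vendor and “GDI Generic” as the renderer, while a real ICD reports the hardware vendor):

#include <cstring>
#include <GL/gl.h>

// Sketch: detect the Microsoft software implementation by its vendor string.
// Assumes a context is current.
bool IsHardwareAccelerated()
{
    const char* vendor =
        reinterpret_cast<const char*>(glGetString(GL_VENDOR));
    return vendor != NULL && std::strstr(vendor, "Microsoft") == NULL;
}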

“I’m not sure what kind of interface you’re trying to provide the user where you’re exposing details like what image formats are available”

What I want to do is provide a data structure that the caller can query to find out whether or not what he wants is supported (this is a framework, rather than an end product). With D3D this is easy. In most cases he wants a specific front/back buffer combination and a specific depth/stencil buffer format. Alongside the latter, he wants to know what multi-sample levels are supported for each format, if at all, etc.

Other capabilities that should be enumerable are the texture formats he can use for 2d textures, 3d textures, cube maps and volumetric textures (if any are supported). I should also be able to tell him whether he can use shader 3, 4 or 5 as well.

At the moment this comes as a tree that the caller can “prune” (discard all 16-bit buffer formats, for example). What results after pruning is something you can present in a UI, allowing the user to select a video mode, refresh rate, AA level and so on.

It seems such a simple thing… but there’s no obvious way to do it with GL and in my view there should be. Validating the hardware capabilities is something of a prerequisite for absolutely everything else.

For what it’s worth I agree with you, but historically the OpenGL philosophy has been to abstract details of the hardware and just provide you with a software interface. This means that you really have no means of determining - directly through OpenGL - what pixel formats and/or capabilities are present (and which ones may lead to possible software emulation); in theory you just write code to a specific GL_VERSION and the driver does the Right Thing behind the scenes. In other words: “don’t sweat the hardware details, trust the driver, everything will be OK.”

Now there are advantages to this approach (as anyone who’s ever discovered the joy that is D3DERR_DEVICELOST can testify) but the biggest disadvantage to me is that you - as the developer - are not in control. You’re overly reliant on the quality of the driver. This IMO has led to the current poor quality of many OpenGL drivers and the preponderance of vendor-specific workarounds in many OpenGL apps.

The only really practical solution I can think of is to spend some time getting to know the various hardware that’s currently out there. Know its quirks and its limitations, test on as many hardware variations as possible, and don’t commit to doing anything a specific way until you have reasonable certainty that it’s going to work on a good proportion of your target hardware.

In most cases he wants a specific front/back buffer combination and a specific depth/stencil buffer format. Alongside the latter, he wants to know what multi-sample levels are supported for each format, if at all, etc.

You didn’t seem to see it the first time, so allow me to repeat: WGL_ARB_pixel_format.

Other capabilities that should be enumerable are the texture formats he can use for 2d textures, 3d textures, cube maps and volumetric textures (if any are supported).

This simply doesn’t make much sense for OpenGL. You should just return all of the possible image format enumerators that are available for the particular GL version. The specification has a list of these formats.

Remember: OpenGL is not allowed to punt on image formats. If you ask for a texture with an image format of GL_RGB4, the implementation must give something to you. It may really be GL_RGB8 behind the scenes, and you can test what you actually get, but most of the time, it’s not that important.

However, if you really, really want to know which formats are actually used by the implementation, you can create proxy textures (not regular textures, but the GL_PROXY_TEXTURE_* targets). Create one with an image format, then query what its internal format actually is. That will tell you whether the implementation used exactly what you told it to, or whether it adjusted the format to something else.
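A proxy query looks roughly like this (a sketch; a width of zero coming back is how the proxy mechanism reports a rejected combination):

#include <GL/gl.h>

// Sketch: use the proxy texture mechanism to see what an implementation
// actually does with a requested internal format. No texture object is
// created for proxy targets; only the validation runs. Assumes a current
// context.
GLint QueryActualInternalFormat(GLenum requestedFormat, GLsizei size)
{
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, requestedFormat,
                 size, size, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    GLint width = 0, actualFormat = 0;
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &width);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_INTERNAL_FORMAT, &actualFormat);

    // width == 0 means the format/size combination was rejected outright;
    // otherwise actualFormat is what the implementation would really use.
    return (width != 0) ? actualFormat : 0;
}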

But really, you could just return the list I directed you to previously.

I should also be able to tell him whether he can use shader 3, 4 or 5 as well.

It’s at this point that I should remind you that OpenGL and D3D are different. If you want to make a single interface where the user can talk to them both without knowing anything about which one he’s using, you’re going to have to write your own shader language. They do not use the same shading language, so you cannot pretend on any level that they do. Not unless you make a language that compiles to both GLSL and HLSL (though you could use Cg for this). At which point, you would have to ask what version of your made-up language to use.

Note that you can query the accepted GLSL version from a GL implementation. But GLSL versions aren’t the same thing as D3D Shader Models. And the user cannot treat them as identical.
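The query itself is trivial (a sketch; on Windows the GL_SHADING_LANGUAGE_VERSION token comes from glext.h or a loader, and it needs a 2.0+ context):

// Sketch: the GLSL version is reported separately from the GL version.
const char* glslVersion =
    reinterpret_cast<const char*>(glGetString(GL_SHADING_LANGUAGE_VERSION));
// e.g. "3.30" on a GL 3.3 implementation; parse the "major.minor" prefix
// if you need numbers to compare against.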

What results after pruning is something you can present in a UI, allowing the user to select a video mode, refresh rate, AA level and so on.

And what does this have to do with texture formats and shader versions? You’re not going to present the user with a list of texture formats to pick from. You’re not going to present the user with a list of shader versions to choose from.

These are two distinct concepts you’re talking about. Framebuffer properties are very different from dealing with implementation details.

“If you want to make a single interface where the user can talk to them both without knowing anything about which one he’s using, you’re going to have to write your own shader language”

Oh no, that’s not the intention. He’ll create an instance of the abstract interface with either D3D or GL instance factory. The idea is to abstract as much as possible to make porting easier, not to abstract absolutely everything, so GL shaders will be needed if he’s chosen the GL path and HLSL shaders if he’s chosen the D3D path.

Actually I’ve just noticed WGL_SAMPLE_BUFFERS_ARB and WGL_SAMPLES_ARB, so that can be used to find out if sampling is available for the given buffer format as well.
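So per format that should just be two more attributes in the same query, something like this (a sketch, reusing the wglGetPixelFormatAttribivARB pointer fetched for the enumeration):

// Sketch: read the multisample properties of one pixel format index.
void QueryMultisample(HDC hdc, int pixelFormat,
                      PFNWGLGETPIXELFORMATATTRIBIVARBPROC getAttribs,
                      int& sampleBuffers, int& samples)
{
    const int attribs[] = { WGL_SAMPLE_BUFFERS_ARB, WGL_SAMPLES_ARB };
    int values[2] = { 0, 0 };
    getAttribs(hdc, pixelFormat, 0, 2, attribs, values);
    sampleBuffers = values[0];  // non-zero => format is multisample-capable
    samples       = values[1];  // sample count for this format (2, 4, 8, ...)
}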

So that’s the solution: enumerate formats with WGL_ARB_pixel_format (is that different to wglChoosePixelFormatARB?), that will include back buffer/depth stencil/multisample information. Then fill in the structure with the appropriate stock “required” formats for things like textures/render targets and then anything else, like shader model, I can get by looking at the appropriate extension.

The only issue I’ll still have a problem with is how to enumerate the available modes (resolutions/refresh-rates) and how to set them up. Presumably I can do this with EnumDisplaySettings/ChangeDisplaySettings, and enumerate devices in the same way.
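Something along these lines, I’d guess (a sketch of the Win32 side, which is independent of GL; applying a chosen mode is then a ChangeDisplaySettings call):

#include <windows.h>
#include <vector>

// Sketch: enumerate display modes for the primary device.
struct DisplayMode { DWORD width, height, bitsPerPel, refreshRate; };

std::vector<DisplayMode> EnumerateDisplayModes()
{
    std::vector<DisplayMode> modes;
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
    {
        DisplayMode mode = { dm.dmPelsWidth, dm.dmPelsHeight,
                             dm.dmBitsPerPel, dm.dmDisplayFrequency };
        modes.push_back(mode);
    }
    return modes;
}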

Then fill in the structure with the appropriate stock “required” formats for things like textures/render targets

Since you’re building an abstraction, you should consider taking the OpenGL model and applying it to D3D. That is, the user can just ask for whatever format from either API, and you give them the closest thing to what they ask for.

After all, that’s what they will have to do anyway. If they want an RGB4 texture but the hardware doesn’t support it, they’ll have to convert it to an RGB8 texture. This way, you do the work for them and make your abstraction easier to use.
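As a purely illustrative sketch (the AbstractFormat identifiers here are made-up stand-ins for whatever your framework actually exposes), the policy can be as simple as a mapping that always promotes rather than fails:

#include <GL/gl.h>

// Hypothetical abstract format identifiers; stand-ins only.
enum AbstractFormat { FormatR8G8B8A8, FormatR4G4B4A4, FormatR5G6B5 };

// Sketch of the "closest match" policy: map each abstract format to a GL
// internal format, letting the implementation promote low-precision
// requests rather than rejecting them.
GLenum ToGLInternalFormat(AbstractFormat format)
{
    switch (format)
    {
    case FormatR8G8B8A8: return GL_RGBA8;
    case FormatR4G4B4A4: return GL_RGBA4;  // may be stored as RGBA8 anyway
    case FormatR5G6B5:   return GL_RGB5;   // likewise promoted if unsupported
    default:             return GL_RGBA8;  // safe fallback
    }
}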