Compatibility verification

Hi,

  1. How can I find the minimum OpenGL version required to run my app?

Is there any utility that can intercept all the gl calls while my app is running, and log the minimum required OpenGL version for each call? For example:

Function | OpenGL version required

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) | 1.5
glEnable(GL_SCISSOR_TEST) | 1.3

  2. Is there any way I could tell my OpenGL 3.3 card to “behave” like, say, a 1.3 card? That is, how can I “downgrade” the OpenGL support on my card? This is for testing compatibility, of course.

Thanks,
Gil.

Theoretically you could use WGL_ARB_create_context and ask for a 1.3 context, but AFAIK the implementation is free to give you any context that is a superset of the one you asked for, and I don't think any vendor bothers to give you exactly that. In effect, no, I don't think it's possible to emulate older contexts precisely.

To get a rough estimate of your requirements, you could make GLIntercept logs of your application (I'm guessing it's too big for you to just know them) and parse the function names against those in the spec files; it's pretty straightforward to write such a script.

The GL 2.1 man pages usually list the GL version required for each function. If no version is listed, the function is GL 1.0.

It isn’t enough to list all the functions; the minimum OpenGL version can also differ by function arguments. That is, a function that works in 1.3 may not work with a different set of arguments (e.g. glBlendFunc).

I was trying to avoid the Sisyphean work; looks like there’s no escape?

Thank you,
Gil.

In the glBlendFunc example, you’ll find that the parameter enums that aren’t native to GL1.0 also have the version of their introduction into OpenGL listed in the “Notes” section.

While the man pages will provide this information, you’re still going to have to assert() in a wrapper function for each GL function to check that you aren’t exceeding a testing-target OpenGL version. If you haven’t wrapped the OpenGL functions to begin with, it might just be easier to test on older hardware or drivers.

It’s much easier to just parse the trace from GLIntercept. It tells you what enums you’re using.

Granted, what’s restricted by version are enum+function combinations. But you can at least pick out hot-spots (glTexImage*, glBlendFunc, etc) and write special logic to check for these. The hardest thing is coming up with the actual list of what was legal for which version.

In answer to the second question: if you are trying to test how your app will look on different cards, you can fake the OpenGL version and extension strings with GLIntercept (the extension override plugin). (This assumes that you check the appropriate strings and versions before using the features - it does not actually disable features.)

In answer to the first question on finding the minimum OpenGL version needed: download and use GLIntercept, then go to the install directory and you will find a GLFunctions\gliIncludes.h file.
Delete some extensions from it, then re-run and compare the logs against the previous run.

If the logs are the same, you know you did not use any of the deleted extensions. If they differ, you know you made use of that version/extension, so add it back to the list.

Keep doing this until you have a min list of OpenGL versions and extensions needed.

(This will obviously not catch extensions that do not log anything - like the GLSL extensions, the non-power-of-two extension, etc.)

GLIntercept can also be configured to give you a summary log of all the OpenGL functions called, and how many times they were called (the function stats plugin). Use this as a starting point to look up which functions belong to which extensions.

Actually, given more thought: if this is your app, can’t you just start hacking away at your OpenGL include header file to exclude extension sections (assuming C/C++)? If you fail to compile, you know that extension is used.

There is more to extensions than just new functions and enums. The hard part is detecting something like NPOT usage, or the use of different enums with glBlendFunc or glBlendEquation - enums that aren’t necessarily defined by that particular extension. There are probably 3-4 different extensions that define and use the GL_ADD enum.

Or you can get Mesa3D (find an older version), run your code, and check glGetError(). Of course, compiling that thing on Windows is a big PITA.
It is a software implementation, and very slow.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.