Can I list used extensions automatically?

I am creating an application which uses a lot of different techniques (FBO, shaders etc…). Is there a way to find out what extensions that are required to run the application?

I suppose this list would look slightly different depending on whether it’s being run on nvidia or ATI.

I am using GLEW.

If you write an application, it is entirely up to you which extensions you use. So I see no point in your question…

Ok, to clarify a bit…

Let’s say that I write an application that uses a fairly recent feature, like FBOs or even VBOs. How do I know exactly which extensions are required for using a feature like FBO? I use GLEW and therefore I don’t need to bind all the required extensions explicitly.

For instance, it wouldn’t work on a very old GeForce 3 card, of course, and I would only notice that when I try to run the application. The question is: how can I come up with a list of hardware requirements?

Is there a way to see or log all the extensions that are being asked for when running a certain app?

If your application wants to use an extension, then it should call glewIsSupported(“extension”). If it doesn’t, that’s a programming error and you should fix it first.
If you use glewIsSupported consistently, you can simply search for all occurrences of that call in your code.
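If the code follows that convention, the search itself can be scripted. Here is a minimal sketch in Python (the sample source text below is made up for illustration) that pulls out every extension string passed to glewIsSupported:

```python
import re

# Matches glewIsSupported("GL_...") calls; assumes the extension
# name is passed as a plain string literal, not a variable.
CALL_RE = re.compile(r'glewIsSupported\s*\(\s*"([^"]+)"\s*\)')

def checked_extensions(source_text):
    """Return the set of extension strings the code explicitly checks for."""
    return set(CALL_RE.findall(source_text))

# Hypothetical C source, just to demonstrate the regex.
sample = '''
if (!glewIsSupported("GL_EXT_framebuffer_object"))
    die("FBO not available");
if (glewIsSupported( "GL_ARB_vertex_buffer_object" ))
    use_vbo = true;
'''
print(sorted(checked_extensions(sample)))
```

A real script would walk the source tree and run this over every .c/.cpp file; extensions checked via a variable rather than a literal would slip through.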

Another solution would be to download and parse glext.h, searching for all #define directives ("#define GL_") and function declarations ("#define <functionName> GLEW_GET_FUN"), and see whether any of these tokens/functions occur in your code. Such a script/application shouldn’t be too difficult to produce.
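A rough sketch of that idea in Python; the header excerpt below is a tiny made-up stand-in, and a real script would read the actual glext.h and walk the whole source tree:

```python
import re

# A tiny stand-in for glext.h; in practice read the real header file.
GLEXT_SNIPPET = '''
#define GL_FRAMEBUFFER_EXT 0x8D40
#define GL_ARRAY_BUFFER_ARB 0x8892
GLAPI void APIENTRY glBindFramebufferEXT (GLenum, GLuint);
'''

TOKEN_RE = re.compile(r'#define\s+(GL_\w+)')   # token #defines
FUNC_RE = re.compile(r'\b(gl\w+)\s*\(')        # function declarations

def tokens_used(header_text, source_text):
    """Which header tokens/functions actually appear in the source?"""
    names = set(TOKEN_RE.findall(header_text)) | set(FUNC_RE.findall(header_text))
    return {n for n in names if re.search(r'\b%s\b' % re.escape(n), source_text)}

# Hypothetical application source.
code = 'glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);'
print(sorted(tokens_used(GLEXT_SNIPPET, code)))
```

The output names still have to be mapped back to extension names by their suffix or by looking them up in the header's sections.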

Ok, I thought you wanted to make sure you hadn’t forgotten to check for an extension you refer to in your application (e.g. you have glUseProgramObjectARB somewhere in your program, but you forgot to check for ARB_shader_objects). But after reading your second post I see that you know which functions/tokens you use; you just don’t know what the corresponding extensions are called?
Just look at glew.h - each extension has its own section - you can see its name in the comment at the beginning of every section, and if it’s core functionality you’ll see which version of OpenGL to require.

Ok. Thanks. So there is no simple tool or anything I can use for this purpose? Something like an OpenGL debugger?

There are tools like GLIntercept that can write all OpenGL operations to a log file, but they will report only the functions you actually called while running your application, not every extension your application can use. If your application is constructed in such a way that it always exercises all of its extensions, you can try it. Still, searching for extensions in the log file would probably be no different from searching for them in your source code.

Ok, so basically extensions can be recognized by the suffix ARB/EXT/NV/ATI etc… You can find all functions/tokens with such suffixes and google them… Usually the first hit is the extension specification. Keep in mind that while there are many extensions, only a few of them are important. FBO, for example, is one single extension: GL_EXT_framebuffer_object. You will most likely be using extensions like VBO (GL_ARB_vertex_buffer_object), PBO (pixel) and only several more. As you gain experience, you will recognize the functions instantly, just like you can recognize C functions from different modules.
Also, extensions are usually integrated into the core at some point (as happened with VBO and shaders). If a function or token has no suffix, it is a core function.
Still, the best way is to pay attention while you are writing your code. When you want to use an extension, read its spec in the registry - then you will also know which functions and tokens it introduces.
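That suffix rule is easy to mechanize. A small Python sketch (the suffix list and the sample code are illustrative, not exhaustive) that pulls the vendor-suffixed GL identifiers out of a piece of source:

```python
import re

# ARB/EXT/NV/ATI etc. suffixes mark extension functions and tokens;
# identifiers without one belong to the OpenGL core.
SUFFIXES = ('ARB', 'EXT', 'NV', 'ATI', 'SGIS', 'APPLE')
IDENT_RE = re.compile(r'\b(?:gl|wgl|GL_|WGL_)\w+')

def suffixed_identifiers(source_text):
    """Return extension-suffixed GL identifiers found in the source."""
    return {name for name in IDENT_RE.findall(source_text)
            if name.endswith(SUFFIXES)}

# Hypothetical snippet mixing core and extension calls.
code = '''
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);   /* core: no suffix */
'''
print(sorted(suffixed_identifiers(code)))
```

Each name in the result can then be googled (or looked up in the extension registry) to find the extension that introduces it.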

Originally posted by k_szczech:
There are tools like GLIntercept that can write all OpenGL operations to a log file, but they will report only the functions you actually called while running your application, not every extension your application can use. If your application is constructed in such a way that it always exercises all of its extensions, you can try it. Still, searching for extensions in the log file would probably be no different from searching for them in your source code.
You can use a plugin to GLIntercept that will give you a function call summary of all OpenGL calls made to the gliLog.txt file on shutdown.

Simply add:

FunctionStats = ("GLFuncStats\GLFuncStats.dll");

To the “Plugins” section of the gliConfig.ini file.

Example output:
======= OpenGL function call statistics ==========
Total GL calls: 14417
Number of frames: 341 Average: 39 calls/frame (excluding first frame count of 1150)

======= OpenGL function calls by call count ==========
glVertex2f … 4092
glUseProgram … 1364
glTexCoord2f … 1364
glBegin … 1023
glEnd … 1023
wglGetProcAddress … 737
glBindFramebufferEXT … 684
glBindTexture … 684
glColor3f … 682
glClear … 682
wglGetCurrentDC … 378
wglGetCurrentContext … 348
wglSwapBuffers … 341
glDisable … 341
glEnable … 341
glGetString … 258
wglGetExtensionsStringARB … 30
glAttachShader … 4
glShaderSource … 4
glCompileShader … 4
glGetShaderInfoLog … 4
glCreateShader … 4
glGetError … 3
wglChoosePixelFormat … 2
glCreateProgram … 2
glMatrixMode … 2
glViewport … 2
glLinkProgram … 2
wglMakeCurrent … 1
glCheckFramebufferStatusEXT … 1
glTranslatef … 1
glGenTextures … 1
glGetIntegerv … 1
glTexImage2D … 1
wglSetPixelFormat … 1
glFramebufferTexture2DEXT … 1
glGenFramebuffersEXT … 1
wglDescribePixelFormat … 1
wglCreateContext … 1
glMultMatrixd … 1

======= OpenGL function calls by name ==========
glAttachShader … 4
glBegin … 1023
glBindFramebufferEXT … 684
glBindTexture … 684
glCheckFramebufferStatusEXT … 1
glClear … 682
glColor3f … 682
glCompileShader … 4
glCreateProgram … 2
glCreateShader … 4
glDisable … 341
glEnable … 341
glEnd … 1023
glFramebufferTexture2DEXT … 1
glGenFramebuffersEXT … 1
glGenTextures … 1
glGetError … 3
glGetIntegerv … 1
glGetShaderInfoLog … 4
glGetString … 258
glLinkProgram … 2
glMatrixMode … 2
glMultMatrixd … 1
glShaderSource … 4
glTexCoord2f … 1364
glTexImage2D … 1
glTranslatef … 1
glUseProgram … 1364
glVertex2f … 4092
glViewport … 2
wglChoosePixelFormat … 2
wglCreateContext … 1
wglDescribePixelFormat … 1
wglGetCurrentContext … 348
wglGetCurrentDC … 378
wglGetExtensionsStringARB … 30
wglGetProcAddress … 737
wglMakeCurrent … 1
wglSetPixelFormat … 1
wglSwapBuffers … 341

sqrt[-1]: Thanks for the tip. May be very useful someday.

So… in the list above I could, for instance, google wglGetExtensionsStringARB and hopefully find the name of the extension that needs to be supported. That’s neat. But doing that for every entry would probably generate a long(?) list of extensions.
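That filtering step can also be scripted: given a call summary in the format shown above, keep only the suffixed entry points. A Python sketch (the embedded log excerpt is just a few lines copied from the summary):

```python
import re

# A few lines in the format of the GLFuncStats summary above.
LOG = '''
glVertex2f ... 4092
glBindFramebufferEXT ... 684
wglGetExtensionsStringARB ... 30
glCheckFramebufferStatusEXT ... 1
'''

def extension_calls(log_text):
    """Pick out the suffixed (extension) entry points from the call summary."""
    names = re.findall(r'^\s*(\w+)\s*\.', log_text, re.MULTILINE)
    return sorted(n for n in names if re.search(r'(ARB|EXT|NV|ATI)$', n))

print(extension_calls(LOG))
```

Three suffixed names out of the whole summary is typical: the extension list is usually much shorter than the raw call list suggests.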

Let’s say I want to write a hardware requirement specification to tell the users exactly what is needed to run a certain app. I have my long list of needed extensions. How do I know which of these extensions are worth mentioning, and which are generally supported on 99% of modern graphics hardware? Only through experience?

Great help so far. Thank you!

You can look at delphi3d.net under hardware reports.

Depending on what hardware you want to support, you can probably just say: Needs OpenGL 2.0 + FBO.

Most users will not know what you are talking about, so you can say:

Radeon 9550 + or Geforce FX/6/7/8 (or better) with recent drivers.

The above specs will give you OpenGL 2.0 with FBO.
(If you want to get detailed, then yes look up the function entry points)

I suppose so.

I have been working with OpenGL for many years, but I still find the extension system rather complex and hard to get my head around.

I am creating an application which is supposed to run on both desktops and portables. What would you say are the most common drawbacks with graphics hardware on laptops? Obviously performance is generally not as good as on desktops, but in terms of features?

Let’s say I buy a recent GeForce xxxxx and a laptop with GeForce Go xxxxx. What features are typically missing on the Go card?

Even though it is quite hard to buy an insufficient desktop graphics card today, there is a higher risk that something important is missing if you buy a laptop instead.

From what I understand, the GeForce Go series has the same features, just lower performance.

Check out:
http://www.delphi3d.net/hardware/viewreport.php?report=1493
http://www.delphi3d.net/hardware/viewreport.php?report=1619

You will probably have few problems with Nvidia/ATI. Most of your problems will come from integrated chipsets (if you want to support them).

The advantage of the GeForce Go, or any other laptop GPU, is that it consumes less power. To achieve that, they may use a smaller process, fewer transistors, a lower clock rate and a lower voltage.

The drivers are uniform in features; they share a code base. That makes it easier to support all those GPU varieties.

I have also noticed cases where high-end professional cards lack extensions that are available on consumer-level cards (comparing Quadro cards with GeForce cards, for instance). You would think those expensive professional cards would be capable of doing at least what consumer-level cards can do. I don’t have a recent example, but in the past there were Quadro cards that didn’t support paletted textures while the GeForce cards did (even when the Quadro was newer than the GeForce).

Therefore it’s not always 100% watertight to say that the minimum requirement is (for instance) a GeForce 6800. But then again, that was some time ago. I haven’t checked the delphi3d lists for quite some time now.

By the way, is it true that, say, a GeForce 6800 GT supports the same extensions as a 6800 GS, XT etc.? In other words, do they differ only in speed and memory capacity?

Generally yes, all GeForce 6x00 cards share the same capabilities (but not always; IIRC the 6200 models lack floating-point texture filtering). The general trend is that each “product line” has the same features, just vastly different performance.

Paletted textures dropped out of support around the GeForce FX era, so if you were comparing a pre-GeForce-FX card with a Quadro FX or later, you would see that. (No current consumer GeForce supports paletted textures anymore.)

True, paletted textures are outdated.

Let’s try an example here…

Deep down in a RenderTexture class I find the line:

_pixelFormatAttribs.push_back(WGL_DRAW_TO_PBUFFER_ARB);

(not coded by me)

Let’s say I know that the application depends on the existence of this attribute. That isn’t actually true, but let’s assume it for the sake of the example.

How do I get from WGL_DRAW_TO_PBUFFER_ARB to a written hardware requirement?

Sorry if I tire you out. I hope that I (and others) can learn something here.