Testing OpenGL applications

I was wondering if any of you have experience testing OpenGL applications, and whether there are areas one needs to pay special attention to. I'd welcome any books on the subject, lessons learned, or other advice on how to test OpenGL applications in a professional and effective way.

I assume you’re thinking about something like unit tests for OpenGL apps. I thought about it some time ago but never got around to seriously looking into it.

One problem is that the spec never guarantees pixel-perfect results, so results on different platforms (or different drivers) can differ. You would need a tolerant comparison function to evaluate the success of a test, or design tests that are relatively immune to minor changes (e.g. only compare the interior of an object and stay away from the border). You might also have a problem running them in a window, as windows can be obscured, etc. FBOs might help, but then you’re not testing the same path as the app, which might give different results.
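A tolerant comparison function along those lines might look like the sketch below: it treats a pixel as "different" only when a channel differs by more than a tolerance, and passes the test as long as the fraction of differing pixels stays small. The function name, parameters, and thresholds are all illustrative assumptions, not from any particular framework.

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Returns true if two RGB images (same dimensions, 3 bytes per pixel) match
// within tolerance: a pixel counts as different when any channel differs by
// more than maxChannelDiff, and the comparison passes as long as the
// fraction of differing pixels stays at or below maxDiffFraction.
bool imagesRoughlyEqual(const std::vector<uint8_t>& a,
                        const std::vector<uint8_t>& b,
                        int width, int height,
                        int maxChannelDiff, double maxDiffFraction)
{
    const size_t pixels = size_t(width) * size_t(height);
    if (a.size() != b.size() || a.size() != pixels * 3)
        return false;

    size_t differing = 0;
    for (size_t p = 0; p < pixels; ++p) {
        for (int c = 0; c < 3; ++c) {
            int diff = std::abs(int(a[p * 3 + c]) - int(b[p * 3 + c]));
            if (diff > maxChannelDiff) {
                ++differing;
                break;  // one bad channel is enough to flag the pixel
            }
        }
    }
    return double(differing) / double(pixels) <= maxDiffFraction;
}
```

In practice you would read the two buffers back with glReadPixels (or load saved screenshots) and tune the two tolerances per test, since acceptable drift varies between, say, solid fills and antialiased edges.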

I know SGI’s OpenGL Shader team had some automatic tests that used the hardware driver and compared it to a gold master as well as using Mesa to get around hardware/driver differences.

But I’m not aware of a toolkit/examples that would integrate nicely into an existing unit test framework. I’d be very interested if somebody had something like this!

If anyone has any knowledge of automatic testing of OpenGL, that would be really nice. But I am also looking for guides, books, and processes on how to do manual testing in the best possible way.

http://www.gremedy.com/products.php
?!
Or, if you want to seriously debug your 3D application, just use Direct3D if you can, of course
(PIX for Windows, NVIDIA NVPerfHUD, shader debuggers).

I am looking into gremedy…

It looks to be a nice tool.
I am working as a QA on a project which is now expanding to make a 3D version of our tool. So I am also looking for procedures on how to test 3D applications, if there are any known ways to do this. Also whether there are any ways to do this in conjunction with Agile development.

One quick suggestion that has helped me in the past is to do OpenGL state checks in all functions that make GL calls (there are various utilities that will help you do this). This is helpful when debugging cross-hardware issues — but I guess you could limit these by doing a full test of state equality to begin with. This will not catch situations where different drivers react differently to the same state, but it will catch differences in default state.

Oh, and use the OpenGL error-checking functions aggressively! :slight_smile:
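A minimal sketch of what that aggressive error checking might look like is below. Note that glGetError and the GLenum type are stubbed here purely so the pattern compiles standalone; in a real app you would include the GL headers instead, and the stub state would not exist. The GL_CHECK macro and drainGLErrors helper are illustrative names, not part of any library.

```cpp
#include <cstdio>

// --- Stub of the GL error API, for illustration only. In a real app,
// --- include <GL/gl.h> and delete this block.
typedef unsigned int GLenum;
static const GLenum GL_NO_ERROR = 0;
static GLenum g_fakeError = GL_NO_ERROR;  // pretend driver-side error state
static GLenum glGetError()
{
    GLenum e = g_fakeError;
    g_fakeError = GL_NO_ERROR;  // reading the error clears it, as in real GL
    return e;
}
// --- End of stub.

// Drain the GL error queue (GL can hold several queued errors), log each
// one with context, and return how many were pending.
int drainGLErrors(const char* where)
{
    int count = 0;
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError()) {
        std::fprintf(stderr, "GL error 0x%X after %s\n", err, where);
        ++count;
    }
    return count;
}

// Wrap GL calls so every one is checked; stringizes the call for the log.
#define GL_CHECK(call) do { call; drainGLErrors(#call); } while (0)
```

You would typically compile the macro down to a bare call in release builds, since glGetError can stall the pipeline.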

Self promotion:
Might I suggest taking a look at http://glintercept.nutty.org/ ?

GLIntercept can save the OpenGL function calls to a log file (text or XML). This you can use to compare two different runs or different hardware.

It can also save textures and the frame buffer after every render call. (So you can diff and compare the output of different cards)

Other things like shaders and display lists are saved. (GL error checking can also be done automatically)

GLIntercept is also config file based so you can batch run tests. (source is also available for custom stuff)

There are plugins for things like shader debugging, free cam, and many other debug utilities.

Email me if you have any questions/problems using it.

Although you can’t rely on pixel-precise results across different graphics cards/drivers, you can do some sort of statistical measurement of the differences and decide what is “significant”. ImageMagick’s “compare” tool has a “-metric” switch that lets you use various statistical methods. All of those are just generic methods which are notoriously inaccurate at measuring how the human visual system actually perceives differences, but they are better than nothing. The ideal would be a “perceptual difference metric” that takes into account human cognition factors etc. Unfortunately I don’t know of any free or reasonably priced tools for doing so.
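For a self-contained version of that idea, here is a sketch of a normalized RMSE over two grayscale buffers — the same statistic behind ImageMagick's `compare -metric RMSE`, reimplemented by hand as an assumption about what a "significance" check could look like. A test would then pass while the value stays under a threshold you pick empirically.

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Root-mean-square error between two same-sized grayscale images,
// normalized to [0, 1]: 0 means identical, 1 means maximal difference
// (e.g. all-black vs. all-white). Mismatched sizes are treated as
// maximally different.
double rmse(const std::vector<uint8_t>& a, const std::vector<uint8_t>& b)
{
    if (a.size() != b.size() || a.empty())
        return 1.0;
    double sumSq = 0.0;
    for (size_t i = 0; i < a.size(); ++i) {
        double d = (double(a[i]) - double(b[i])) / 255.0;
        sumSq += d * d;
    }
    return std::sqrt(sumSq / double(a.size()));
}
```

Plain RMSE weights every pixel equally, which is exactly the "notoriously inaccurate" part — a one-pixel shift of a sharp edge scores much worse than it looks — so the threshold has to be chosen per scene rather than globally.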