gldGetTextureLevel

Hi folks,

I am profiling my OpenGL app on a new MacBook Pro (NVIDIA 9600, OS X 10.5.5) using the Sampler application, and it seems that my app spends 20% of its time in the function gldGetTextureLevel(). Does anybody have an idea what this function is?

Cheers

Some internal driver function. It’s almost certainly not the actual culprit anyway: the driver’s symbols are stripped, so the sampler attributes time spent in nearby stripped code to the nearest exported symbol it can find.

Use OpenGL Profiler and OpenGL Driver Monitor for detailed information about where your application is spending its OpenGL time.
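If you want a quick sanity check inside the app itself, you can also bracket a suspect call with glFinish() and a CPU timer. This is only a rough sketch, assuming C and the OS X headers of that era; the function timed_texture_upload and its parameters are hypothetical stand-ins for whatever Sampler points you at, and glFinish() stalls the pipeline, so use it only for measurement, not in shipping code.

```c
#include <OpenGL/gl.h>
#include <mach/mach_time.h>
#include <stdint.h>
#include <stdio.h>

/* Convert mach_absolute_time() ticks elapsed since 'start' into seconds. */
static double seconds_since(uint64_t start)
{
    static mach_timebase_info_data_t tb;
    if (tb.denom == 0)
        mach_timebase_info(&tb);
    uint64_t elapsed = mach_absolute_time() - start;
    return (double)elapsed * tb.numer / tb.denom / 1e9;
}

/* Hypothetical example: time one texture upload in isolation. */
void timed_texture_upload(GLsizei w, GLsizei h, const void *pixels)
{
    glFinish();                          /* drain previously queued GL work  */
    uint64_t t0 = mach_absolute_time();

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);

    glFinish();                          /* wait for this call to complete   */
    printf("texture upload: %.3f ms\n", seconds_since(t0) * 1000.0);
}
```

If the time measured this way is much larger than what Sampler attributes to the call site, the cost is probably in the driver's deferred work rather than in your own code.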

For general profiling needs, Shark is much better than Sampler.
