At first glance the following question looks silly, but after trying to answer it I realized it is quite difficult (or even impossible).

How can we estimate the execution time of some function inside a shader?

I have made a sample vertex shader like this:
Code :
#version 330
out float out_val;
// someFun() is the function whose cost I want to measure
void main(void)
{
    out_val = someFun(gl_VertexID * 1e-6);
}
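someFun() above is just a placeholder for whatever implementation I want to time. Purely as a hypothetical example, its body could be something along these lines:
Code :
float someFun(float x)
{
    // arbitrary arithmetic; the constants mean nothing, the loop just
    // gives the shader a non-trivial, data-dependent amount of work
    float s = x;
    for (int i = 0; i < 64; ++i)
        s = sin(s * 1.3) + x;
    return s;
}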
I then allocate an 80 MB buffer for transform feedback and wrap the glDrawArrays() call with glQueryCounter() calls:
Code :
glQueryCounter(m_nStartTimeID, GL_TIMESTAMP);
glDrawArrays(GL_POINTS, first, count);
glQueryCounter(m_nEndTimeID, GL_TIMESTAMP);
The draw call is issued with count = 1e7.
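For context, here is a sketch of the host code around that snippet. It is a paraphrase rather than my exact code: program, tfBuffer and count are assumed to be created elsewhere, and the query objects are the m_nStartTimeID / m_nEndTimeID used above.
Code :
// Assumes 'program' was compiled from the shader above, 'tfBuffer' is the
// 80 MB buffer object and 'count' is the number of points (1e7 here).
const char* varyings[] = { "out_val" };
glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS); // before linking
glLinkProgram(program);
glUseProgram(program);

GLuint queries[2];
glGenQueries(2, queries);
GLuint m_nStartTimeID = queries[0], m_nEndTimeID = queries[1];

glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, tfBuffer);
glEnable(GL_RASTERIZER_DISCARD);              // only transform feedback output is needed

glQueryCounter(m_nStartTimeID, GL_TIMESTAMP);
glBeginTransformFeedback(GL_POINTS);
glDrawArrays(GL_POINTS, 0, count);
glEndTransformFeedback();
glQueryCounter(m_nEndTimeID, GL_TIMESTAMP);

glDisable(GL_RASTERIZER_DISCARD);

// wait until the second timestamp is available, then read both back
GLint available = 0;
while (!available)
    glGetQueryObjectiv(m_nEndTimeID, GL_QUERY_RESULT_AVAILABLE, &available);

GLuint64 tStart = 0, tEnd = 0;
glGetQueryObjectui64v(m_nStartTimeID, GL_QUERY_RESULT, &tStart);
glGetQueryObjectui64v(m_nEndTimeID,   GL_QUERY_RESULT, &tEnd);
double elapsedMs = (tEnd - tStart) * 1e-6;    // GL timestamps are in nanoseconds
An equivalent alternative would be a single GL_TIME_ELAPSED query (glBeginQuery()/glEndQuery()) around the draw call; it gives the same number with one query object.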

Can you guess what happens? The elapsed time does not depend on the complexity of the function. There is a fixed setup portion (about 14.7 µs on my laptop) and a portion that depends directly on the number of vertices (about 22.5 ms for 1e7 vertices).

Does anybody have any suggestion on measuring GLSL function execution time?

In fact, I need to compare the efficiency of several implementations, so absolute values are not important. On the other hand, I don't want to measure the execution time of the whole application with each implementation plugged in, since that is quite application-specific and subject to optimizations tied to a particular implementation.

Thank you in advance!