Originally posted by Nakoruru:
For instance, how would it get the semantic information necessary to properly resolve dPdx(sin(x))? I mean, how does it know that I have not redefined sin to be something completely different?
In the function formalism used e.g. in physics, every ordinary function like sin, lg etc. can be lifted to operate on functions in the obvious pointwise way: sin of a function f is defined by (sin(f))(x) := sin(f(x)). This is standard and used all the time; I used to work this way all the time.
Originally posted by Nakoruru:
It doesn’t seem right to me to say that everything in the fragment shader is a function of x and y. What if I create a variable float foo, and then take dPdx(foo)? foo is not a function of x and y, it’s just something I created and can have any value at all, with no relationship at all to x or y.
A value not depending on x and y is a special case of a function of x and y: a constant function, whose derivative dPdx is simply zero.
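To make this concrete, here is a small GLSL-style sketch (the use of gl_TexCoord is only illustrative):

```glsl
float foo = 3.0;            // a value with no visible dependence on x, y
float dFoo = dPdx(foo);     // foo is a constant function of x and y,
                            // so its derivative is 0.0
vec2 t = gl_TexCoord[0].st; // a value that does vary across the primitive
float dT = dPdx(t.s);       // a genuinely nonzero derivative (in general)
```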
[b]
I think that dPdx is only a way to get at the internal iterator values. It’s a good point that texture also cannot be implemented by the language itself, so that is a good argument that you can also call dPdx a function, but it needs more explaining. That is all I wanted, not a long speculation that goes off into fantasy land about how the hardware is going to take derivatives for me.
[/b]
An actual compiler implementation may indeed calculate the derivative symbolically, by considering the expression and reducing it to dPdx(gl_MultiTexCoord0) etc. But I suppose most compilers (at least at the beginning) won’t support that.
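For example (a hypothetical transformation, not anything a real compiler is known to do), the chain rule would let a compiler reduce the derivative of a composite expression to derivatives of built-in inputs:

```glsl
float t  = gl_TexCoord[0].s; // an interpolated built-in input (illustrative)
float d1 = dPdx(sin(t));     // as written by the programmer
float d2 = cos(t) * dPdx(t); // chain-rule reduction: the same value
// d1 and d2 agree up to the accuracy of the derivative approximation
```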
I think there are two aspects:
a) What do we want?
Do we want dPdx to be applicable to any value? My suggestion: yes. This is very important.
For example, suppose you write a user-defined helper function for distorting texture coordinates, maybe
vec2 myDistort(vec2 t)
{
    return vec2(t.x + sin(t.x), t.y + cos(t.y));
}
Then suppose your main function uses such modified texture coordinates:
color = myDistort(texture2(0, gl_MultiTexCoord0));
Now if you want, for example, to apply antialiasing to your texturing, you need to calculate dPdx of your modified texture coordinates, hence
dPdx(myDistort(texture2(0, gl_MultiTexCoord0)))
So I think it is very useful and very natural to calculate dPdx of arbitrary values, including results of user-defined functions. (Whether every compiler really supports this is a different issue, but it would be very useful.)
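To illustrate, the derivative of myDistort’s result can even be expanded by hand with the chain rule (myDistortDx is a hypothetical helper, written only to show the expansion):

```glsl
// d/dx (t.x + sin(t.x)) = (1 + cos(t.x)) * dPdx(t.x), and similarly for y:
vec2 myDistortDx(vec2 t)   // equals dPdx(myDistort(t))
{
    vec2 dt = dPdx(t);     // screen-space derivative of the input itself
    return vec2((1.0 + cos(t.x)) * dt.x,
                (1.0 - sin(t.y)) * dt.y);
}
```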
The second question is:
b) Ok, we want it, but is it possible to define the meaning in a unique, well-defined way?
Answer: Yes. You “simply” observe that all values ultimately depend on x and y, so formally everything is a function of x and y.
Of course, someone who knows how dPdx works does not necessarily need to know exactly how it is defined formally, or go into the details we are currently discussing. But it is good to know that it can be defined in a formally precise way.
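One way to see why this is well-defined for arbitrary values (a sketch of how an implementation might realize it; the actual mechanism is up to the hardware): fragments are typically shaded in 2x2 blocks, so the derivative of any value can be approximated by a finite difference between neighbouring fragments.

```glsl
// Sketch: for ANY expression e the shader computes, an implementation can
// approximate its screen-space derivative by differencing the value of e
// in the fragment one pixel to the right:
//
//     dPdx(e)  ~  e(x + 1, y) - e(x, y)
//
// Since this recipe only needs the value of e at neighbouring fragments,
// it is defined for every value, user-defined functions included:
float a = someUserFunction(gl_TexCoord[0].st).x; // hypothetical helper
float da = dPdx(a); // well-defined, whatever someUserFunction computes
```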
[b]
I’m getting angry, so I’m going to give up here, and wait for someone to reply to my post over at 3Dlabs.
[/b]
Don’t take it too seriously…