Ed Daenar

08-08-2013, 10:17 AM

Imagine this fragment shader:

#version 330

in vec2 uvCoord;

uniform sampler2D textureMap;

layout(location = 0) out vec4 fragment;

void main() {
    // Get a sample from a texture map
    fragment.rgb = texture(textureMap, uvCoord).rgb;

    // Get the gradient of the fragment value. Is this legal?
    fragment.rgb = vec3(1.0 - length(fwidth(fragment.rgb)));

    fragment.a = 1.0;
}

Unless I'm misunderstanding things, dFdx/dFdy return the derivative of a value in x and y respectively, in screen space. So using dFdx on a vec3, for example, returns another vec3 with the derivative of each vector component in the screen-space x direction (so, uh, right-left :P). If my understanding is correct, this reads the specified variable's value in neighboring fragments to calculate the derivative, so it shows the rate of change between the "current" fragment and its neighbors.
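As far as I know (this is an assumption about typical hardware, not spec-mandated behaviour), GPUs shade fragments in 2x2 "quads" and compute dFdx/dFdy as the difference between adjacent fragments in the quad. A rough Python sketch of that idea, with a made-up `shade` function standing in for "the value of some variable at fragment (x, y)":

```python
def shade(x, y):
    # Stand-in for the value of a shader variable at fragment (x, y);
    # here just a simple linear function of screen position.
    return 0.5 * x + 0.25 * y

def quad_derivatives(x, y):
    # Evaluate the variable at three fragments of the 2x2 quad whose
    # top-left corner is (x, y), then take forward differences.
    v00 = shade(x,     y)      # top-left
    v10 = shade(x + 1, y)      # top-right
    v01 = shade(x,     y + 1)  # bottom-left
    dfdx = v10 - v00           # coarse dFdx, shared by the whole quad
    dfdy = v01 - v00           # coarse dFdy, shared by the whole quad
    return dfdx, dfdy

dx, dy = quad_derivatives(4, 7)
print(dx, dy)  # 0.5 0.25
```

This also hints at why the result depends on *when* you call the function: the difference is taken over whatever value the variable holds in the neighboring quad fragments at that point in execution.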

The gradient magnitude can be computed as G = sqrt(dFdx*dFdx + dFdy*dFdy), which done on a vec3 results in the individual gradient magnitudes for each component. I'm visualizing this as a vector in 3D space pointing in the direction of change. If the vec3 were a color, it would be in color space. I assume this is an acceptable visualization, is it?
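One detail worth noting: fwidth() is not quite that G. The GLSL spec defines fwidth(p) as abs(dFdx(p)) + abs(dFdy(p)), a cheaper upper bound on the true per-component gradient magnitude. The difference is easy to check numerically (the derivative values below are made up for illustration):

```python
import math

# Per-component screen-space derivatives of some vec3 value
# (illustrative numbers, not from a real shader).
dfdx = [0.3, 0.0, -0.4]
dfdy = [0.4, 0.1,  0.3]

# True per-component gradient magnitude: G = sqrt(dFdx^2 + dFdy^2)
grad = [math.sqrt(x * x + y * y) for x, y in zip(dfdx, dfdy)]

# What fwidth() returns: abs(dFdx) + abs(dFdy), a cheaper upper bound.
fw = [abs(x) + abs(y) for x, y in zip(dfdx, dfdy)]

print(grad)  # ≈ [0.5, 0.1, 0.5]
print(fw)    # ≈ [0.7, 0.1, 0.7]
```

For an edge-detection look the distinction usually doesn't matter much, but fwidth will report larger values on diagonal edges than the true gradient magnitude.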

By taking the length of this vector I obtain the magnitude of the change, regardless of its direction.

So by performing the above shader instructions I'm basically asking for the magnitude of change in the colors being written to the image, which would conceptually result (assuming I'm interpreting this correctly) in a filter-like effect where high frequencies show up as black and low frequencies as white. In other words, it creates a sketch-like effect. Now, I realize this is not the ideal way of doing this effect, but that's not what I wanted to discuss.

The question is: is this even legal? The value of the fragment is still being defined at the time the dFd* functions are called, so the implementation must fetch the value currently being worked on in other fragment units. So I'm now wondering whether this wouldn't cause a lot of problems, since some fragments may not actually have neighboring fragments being processed at the time.

Indeed, it seems to work like this: changing the moment when a dFd* function is called makes it operate on the state of the value at that time.

On my test machine I've been able to apply this to any value within a shader, not only inputs, but I cannot find any information on whether this is legal or just undefined behaviour.

If anyone can spare an explanation of how the derivatives are obtained, that would help too. Not the math behind it, but the source of the data: does it come only from fragments belonging to the same primitive as the current fragment, or does it "leak" into other fragments as well?
