View Full Version : 2D Wave Ripple Algorithm Problem



Rammschnev
01-13-2017, 07:43 PM
I'm developing a 2D HTML5 game that includes a water rippling effect as a background (so it does not directly interact with game objects). I was able to successfully implement and tweak the effect in JavaScript, but the performance was much too slow to be acceptable. As this effect works by performing calculations per individual pixel in the background, I rewrote the effect to work as a series of fragment shaders. It does basically what it's supposed to do, but with some unexpected differences from the original effect. My hypothesis is that the reason must lie in something I don't understand about the way my shaders are executed (and this is the first work I've done with GL, so I expect there is a lot I don't understand).

I don't believe excerpts of my code should be necessary to answer my question, but if anyone wants to see it, I'd be happy to post the relevant bits as well as the source material explaining the algorithm that I use. In case it's relevant, I am using WebGL, not standard OpenGL.

In broad terms, the algorithm works by assigning a height value to each pixel of the image meant to represent the surface of the water and storing the values in a matrix the same size as the image. (As I'm using fragment shaders, this matrix is actually a GL texture with grayscale values representing the data.) Each frame, the updated height of a given pixel is calculated from the average of the heights of the pixels around it. The waves themselves are an emergent property that arises once you create a disturbance somewhere in the height values. There is more to the effect, but this is the piece that is not acting quite as it should.
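To make that concrete, here is a minimal CPU sketch of one common formulation of this kind of height-field update (two buffers; each cell becomes the doubled neighbour average minus its own previous value, then gets damped). The names and constants are illustrative, not my actual game code:

```javascript
// Minimal CPU sketch of the height-field update (illustrative names).
// Two buffers, `prev` and `curr`, hold the last two frames of heights.
const W = 5, H = 5;
const makeGrid = () => Array.from({ length: H }, () => new Float32Array(W));

let prev = makeGrid();
let curr = makeGrid();

function step(prev, curr, damping = 0.99) {
  const next = makeGrid();
  // Borders are left at zero for simplicity.
  for (let y = 1; y < H - 1; y++) {
    for (let x = 1; x < W - 1; x++) {
      // Doubled average of the four neighbours, minus the previous value.
      const avgTimes2 =
        (curr[y][x - 1] + curr[y][x + 1] +
         curr[y - 1][x] + curr[y + 1][x]) / 2;
      next[y][x] = (avgTimes2 - prev[y][x]) * damping;
    }
  }
  return next;
}

// Disturb the centre, then advance one frame.
curr[2][2] = 1.0;
const next = step(prev, curr);
console.log(next[2][1]); // neighbours pick up part of the wave
console.log(next[2][2]); // the disturbed cell itself springs back
```

Disturbing one cell and stepping shows the energy spreading outward to its neighbours while the disturbed cell springs back, which is where the expanding rings come from.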

It almost works as it should. There are a few subtle things that don't look quite right, but the most informative of them is the appearance of waves that, rather than expanding outward and diminishing, slowly collapse inward on themselves. Instead of the waves acting on the undisturbed parts of the water, the undisturbed water presses in on the waves from the outside until they are smothered out, essentially the effect turned inside out. These pockets of inside-out waves also tend to drift very slightly as a whole, as though they were solid objects.

I'm not aware of anything that should cause the algorithm to behave differently when executed in a GL context. As I understand it, fragment shaders perform their per-fragment calculations all at once, rather than one at a time as a CPU would, but they all read from the same data, which is not changed until the end of the frame. Tweaking values has not changed the unwanted behavior. Is there something relevant about the GL pipeline that I'm missing here?

Thanks!

GClements
01-13-2017, 11:17 PM
The biggest difference (particularly for ES) is the limited range and precision of the values. Scalar values stored in a texture will be clamped to the range [0,1] and will typically have no more than 8 bits of precision, perhaps fewer. Also, internal calculations may have as few as 10 bits of precision.

You say the data isn't being updated until the end of the frame, so presumably you're using two textures and alternating which one is source and which is destination. Using a single texture for both has undefined behaviour (moreover, it's best to explicitly unbind textures from texture units before attaching them to a framebuffer and vice versa).
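Modelling the two textures as plain arrays, the ping-pong pattern looks roughly like this (illustrative only; in real WebGL `src` would be bound to a texture unit and `dst` attached to the framebuffer):

```javascript
// Illustrative sketch of the ping-pong pattern: two "textures", one read,
// one written, with their roles swapped after every pass.
let src = new Float32Array([0.5, 0.5, 0.5]); // texture the shader reads
let dst = new Float32Array(3);               // texture being rendered into

function renderPass(src, dst) {
  // Stand-in for the fragment shader: compute an updated value per texel,
  // reading only from src and writing only to dst.
  for (let i = 0; i < src.length; i++) dst[i] = src[i] * 0.5;
}

for (let frame = 0; frame < 2; frame++) {
  renderPass(src, dst);
  [src, dst] = [dst, src]; // swap; never read and write the same texture
}
console.log(src[0]); // after two passes: 0.5 * 0.5 * 0.5 = 0.125
```

The point is that within a pass the source data is immutable; reading a texture that is simultaneously a render target is the undefined-behaviour case mentioned above.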

Rammschnev
01-14-2017, 12:10 PM
You say the data isn't being updated until the end of the frame, so presumably you're using two textures and alternating which one is source and which is destination. Using a single texture for both has undefined behaviour (moreover, it's best to explicitly unbind textures from texture units before attaching them to a framebuffer and vice versa).

Correct, I am using two different textures and alternating between them. I do use the function copyTexImage2D to copy from a framebuffer into the active texture at some points, but I will see where I can explicitly unbind textures without breaking behavior.


The biggest difference (particularly for ES) is the limited range and precision of the values. Scalar values stored in a texture will be clamped to the range [0,1] and will typically have no more than 8 bits of precision, perhaps fewer. Also, internal calculations may have as few as 10 bits of precision.

I was aware that the values were clamped, but I was not aware that precision was limited. When I adapted the algorithm, I scaled my values down to fall between 0 and 1, and since going below 0 was no longer an option, I also adjusted it so that 0.5 is treated as the neutral height rather than 0. However, I'm not sure this is what's causing the behavior, because I can see quite a lot of detail in all of the waves, including that the misbehaving waves still move up and down as they should; they just shrink rather than expand. The more I think about it, the more I suspect it's something in my code, because these waves also appear under specific conditions, when a lot of waves collide with each other at once. I'm just not sure how a problem like this could have crept in; in my head, the behavior should have come out identical. I'll update this thread if I figure anything else out.
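For what it's worth, the coarseness of 8-bit storage with the 0.5-neutral remapping I described is easy to demonstrate in isolation (illustrative sketch, not my actual code):

```javascript
// Illustrative check of 8-bit texture precision (not the actual game code).
// A signed height h in [-1, 1] is remapped to [0, 1] with 0.5 as neutral,
// then stored in an 8-bit channel the way RGBA8 would: round(v * 255) / 255.
const encode = h => h * 0.5 + 0.5;             // [-1, 1] -> [0, 1]
const store8 = v => Math.round(v * 255) / 255; // 8-bit quantization
const decode = v => (v - 0.5) * 2;             // back to a signed height

// The neutral height itself is not representable: 0.5 * 255 = 127.5, which
// rounds to 128/255, so "resting" water decodes with a small positive bias.
console.log(decode(store8(0.5)));           // ~0.0039 (1/255), not 0

// Any ripple smaller than one quantization step snaps to the same value.
console.log(decode(store8(encode(0.001)))); // also ~0.0039
```

So even the neutral height isn't stored exactly, and ripples smaller than one quantization step become indistinguishable; whether that accounts for the inside-out waves, I don't know.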