Hi all,
I've lately been working on a real-time fluid dynamics solver, and it's mostly been fun and easy to code. However, I've come up against a problem in the diffusion step.
Originally, I faked the diffusion of the velocity field using Oddlab's algorithm for Thermal Decomposition, which is basically this:
For every pixel, find the von Neumann neighbourhood
For every neighbour, find the difference between the current pixel and that neighbour
Find the biggest difference among the neighbours
Divide that difference by 2
Add that halved difference to the lowest of the neighbours (the one that gave the biggest difference)
Subtract the halved difference from the current pixel
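For concreteness, the steps above can be sketched roughly like this (this is my own reading of the steps, not code from Oddlab; boundary handling and the Jacobi-style read-from-old/write-to-new split are assumptions on my part):

```python
import numpy as np

def relax_step(field):
    """One pass of the thermal-decomposition-style relaxation described
    above. Reads from the old field and writes into a copy, so the pass
    order doesn't matter; boundary cells are skipped for simplicity."""
    out = field.copy()
    h, w = field.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = field[y, x]
            # von Neumann neighbourhood: up, down, left, right
            neighbours = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            # neighbour with the biggest (current - neighbour) difference,
            # i.e. the lowest neighbour
            lowest = max(neighbours, key=lambda p: c - field[p])
            diff = c - field[lowest]
            if diff > 0:
                # move half the biggest difference downhill
                out[lowest] += diff / 2
                out[y, x] -= diff / 2
    return out
```

Note that the transfer is symmetric (what leaves the pixel arrives at the neighbour), so the total amount in the field is conserved.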
And that has worked fine so far. However, as you can see, neither the neighbour sampling (even if I extend it beyond the von Neumann neighbourhood to include the neighbours' neighbours) nor the transferred difference takes that frame's dt into account, and as a result the algorithm becomes extremely unstable and the simulation blows up when dt is high.
So, does anybody have any ideas on how to extend that algorithm to include dt?
DP