COULD SOMEONE MOVE THIS TO THE GENERAL DISCUSSION SECTION. @_@
Hello! This is a question for all y'all who have more experience with graphics and algorithms than I do. I'm working on a project (for work, and I did get permission to ask around) that requires smoothing a large set of data.
The basics of it are that there's a data set, a 3D matrix filled with 'slowness' values, which is basically 'when moving through this cell, how many units do you move per time step?' Initially, these values are filled in with an interpolated gradient from some minimum at the top to some maximum at the bottom. Then a function is applied to this data set repeatedly: it takes the data set and another set of inputs, and updates certain cells in the matrix to new values. The updated data set is then fed back into the algorithm again and again until it converges or some maximum number of iterations is reached.
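For reference, here's roughly how I picture the setup (a minimal Python/NumPy sketch; the dimensions, slowness bounds, and names are just placeholders, not our actual code):

```python
import numpy as np

# Placeholder dimensions and slowness bounds -- not our real values.
nz, ny, nx = 100, 50, 50
slow_min, slow_max = 0.5, 2.0

# Initial guess: a linear gradient from slow_min at the top layer to
# slow_max at the bottom layer, constant within each horizontal layer.
gradient = np.linspace(slow_min, slow_max, nz)
slowness = np.broadcast_to(gradient[:, None, None], (nz, ny, nx)).copy()
```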
This is where the major problems begin. For the most part, the function expects some degree of smoothness in the data set, i.e. there is some finite but unknown allowed difference between values in adjacent cells for which the algorithm will produce meaningful data. This is because there is no guarantee that each application of the function will touch the same cells, only that it should touch cells close to those touched in the prior step. Because the starting estimate (the gradient) is so bad, the updates result in large discrepancies between updated and un-updated cells. To help combat this, a filtering operation, namely a smoothing, is applied to the entire data set before it is fed back into the algorithm. Currently, the smoothing operation is a sliding windowed mean whose window size shrinks as the iterations continue and the estimate becomes more exact. Typically it starts out with a window the size of the data set and a slide of 1 (in each direction).
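To make the smoothing step concrete, it's equivalent to something like the sketch below (using scipy.ndimage.uniform_filter as a stand-in for our own windowed-mean code; the edge handling and the exact way the window shrinks are assumptions on my part):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def smooth(slowness, window):
    """Sliding windowed mean over all three axes.

    `window` is a (wz, wy, wx) tuple; early iterations use a window as
    large as the data set itself, later iterations shrink it.
    """
    # 'nearest' edge handling is an assumption -- the real code may pad
    # or clip differently at the boundaries of the volume.
    return uniform_filter(slowness, size=window, mode='nearest')

# First-iteration smoothing: window the size of the whole volume.
slowness = np.random.rand(100, 50, 50)
smoothed = smooth(slowness, (100, 50, 50))
```

(For what it's worth, uniform_filter is documented as running a sequence of 1-D passes, one per axis, rather than a literal 3-D window loop, so even this stand-in is cheaper than a naive implementation; I mention it only as a point of comparison, not as what we currently do.)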
For the size of data set we typically work on, this takes a large portion of the runtime (think a data set that's about 100x50x50, with a sliding windowed mean of 100x50x50 and a slide of 1 in the (x,y,z) directions). This has two major problems: first, it takes a long time to perform this operation, and second, at least in early iterations there is a 'large' contribution to the value of each cell from cells that are relatively far away.
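To put rough numbers on that (my own back-of-envelope, assuming a completely naive implementation): 100x50x50 is 250,000 cells, and with a window the size of the whole volume each output cell averages over up to 250,000 neighbours, which works out to something on the order of 6x10^10 additions per smoothing pass in the early iterations.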
So, while it would be nice to figure out a better initial data set, i.e. one that results in smaller discrepancies between early applications of the function, I've been asked to work on making the filtering itself more efficient. So mainly, I'm asking about various filters: whether anyone has resources on using them in 3D (as in smoothing across depth instead of just height and width), anything on how to figure out when applying a filter will result in little-to-no change (i.e. when I can exclude a cell from having the filter applied to it), and things like that?