I am trying to calculate the moving average of a signal. The signal value (a double) is updated at random times. I am looking for an efficient way to calculate its time-weighted average over a time window.
If an approximation is acceptable and there is a minimum time between samples, you could try super-sampling. Keep an array representing evenly spaced time intervals shorter than that minimum, and in each interval store the latest sample received. The shorter the interval, the closer the average is to the true value. The super-sampling period should be no greater than half the minimum time between samples, or there is a chance of missing a sample entirely. A rough sketch is below.
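Here is a minimal C++ sketch of that idea, assuming a timer drives a `tick()` call once per super-sampling period; the class and method names (`SuperSampledAverage`, `addSample`, `tick`, `average`) are illustrative, not from the question:

```cpp
#include <vector>
#include <numeric>
#include <cstddef>

// Approximates a time-weighted moving average by sampling the signal
// at a fixed, fast rate into a ring buffer (super-sampling).
class SuperSampledAverage {
public:
    explicit SuperSampledAverage(std::size_t slots)
        : buffer_(slots, 0.0), filled_(0), next_(0), latest_(0.0) {}

    // Called whenever a real sample arrives (at random times).
    void addSample(double value) { latest_ = value; }

    // Called once per fixed super-sampling period (e.g. by a timer):
    // records the most recently received sample into the ring buffer.
    void tick() {
        buffer_[next_] = latest_;
        next_ = (next_ + 1) % buffer_.size();
        if (filled_ < buffer_.size()) ++filled_;
    }

    // Approximate time-weighted average over the buffered window.
    double average() const {
        if (filled_ == 0) return latest_;
        double sum = std::accumulate(buffer_.begin(),
                                     buffer_.begin() + filled_, 0.0);
        return sum / static_cast<double>(filled_);
    }

private:
    std::vector<double> buffer_; // one slot per super-sampling period
    std::size_t filled_;         // number of slots written so far
    std::size_t next_;           // next slot to overwrite (ring buffer)
    double latest_;              // most recent real sample value
};
```

With, say, a 10 ms tick and a 1000-slot buffer, `average()` approximates the time-weighted average over the last 10 seconds; halving the tick period doubles the memory but tightens the approximation.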