I am developing a fleet management system, and one of the tasks is to show a chart representing the fuel consumption of a vehicle (based on data that is coming from the C
Without data that actually includes the draining, this is more or less just an educated guess...
I would try a sliding average (with a window at least the size of a bump) to smooth out the bumps, but it could also destroy the draining, since we do not know what properties such signals have.
So I would try something like this:
Find the maximum bump period
If it's a ship with a tank of constant shape, then the maximum bump period is fixed: it is determined by the maximum wave size the ship is capable of withstanding and the length of the ship, scaled by the tank shape and size. If you do not know this period, you can measure it on the fly by finding a few consecutive local minima/maxima (peaks) and taking the maximum distance between them.
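As a minimal sketch of that on-the-fly measurement (the function names and sample data are made up for illustration), you can locate local maxima and take the largest spacing between consecutive ones:

```python
def local_maxima(signal):
    """Indices i where signal[i] is strictly greater than both neighbors."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]]

def max_bump_period(signal):
    """Largest distance (in samples) between consecutive local maxima,
    used as an estimate of the maximum bump period."""
    peaks = local_maxima(signal)
    if len(peaks) < 2:
        return None  # not enough peaks to estimate a period
    return max(b - a for a, b in zip(peaks, peaks[1:]))
```

On real data you would want some noise tolerance (e.g. ignore peaks below a small amplitude threshold), but the idea is the same.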
Create a function that detects draining
How to do it? I cannot say, as I have no idea what the data looks like, since you did not share it.
Process your data (buffering enough samples ahead)
So I would construct a FIFO with size equal to your sliding-average window. First fill the FIFO entirely with the starting samples, then pass your samples through it. But instead of storing/showing/plotting the raw value, use the average of all the values in the FIFO.
Here is the output for your bumpy signal before (black) and after (blue) a sliding average with a FIFO (window) size of 9 samples:
Beware that the output is time-shifted (delayed) from the original signal by half the FIFO size (I shifted it back on the plot so the plots correspond to each other).
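The FIFO averaging above can be sketched like this (the function name is mine; `collections.deque` with `maxlen` drops the oldest sample automatically):

```python
from collections import deque

def sliding_average(samples, window=9):
    """Smooth `samples` with a FIFO of fixed size.
    The FIFO is pre-filled with the first `window` samples; each new sample
    pushes the oldest one out, and the output is the mean of the FIFO.
    The output is delayed by about window // 2 samples relative to the input.
    """
    fifo = deque(samples[:window], maxlen=window)
    out = [sum(fifo) / len(fifo)]
    for s in samples[window:]:
        fifo.append(s)  # oldest sample drops out automatically
        out.append(sum(fifo) / len(fifo))
    return out
```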
If draining is detected and the sliding average destroys it
You need to temporarily disable the sliding average before the draining starts (simply by using a smaller sliding-window size). You can interpolate the size linearly from its original value down to 1 just before the draining, and back up afterwards, so the signal does not lose smoothness. However, it is possible that the sliding average preserves the draining information entirely and this part is not needed. There is no way of knowing without relevant data.
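One way to sketch that window-size interpolation (everything here is hypothetical: it assumes the drain interval `[drain_start, drain_end]` is already known from a separate detector) is a per-sample window schedule:

```python
def window_schedule(n, full_window, drain_start, drain_end):
    """Per-sample sliding-window sizes: `full_window` normally, shrinking
    linearly to 1 just before `drain_start`, staying at 1 through the
    draining, then growing back to `full_window` after `drain_end`.
    Assumes the drain interval is known from a separate detector."""
    ramp = full_window - 1  # samples over which the window shrinks/grows
    sizes = []
    for i in range(n):
        if drain_start - ramp <= i < drain_start:
            sizes.append(drain_start - i)        # shrink toward 1
        elif drain_start <= i <= drain_end:
            sizes.append(1)                      # raw samples during drain
        elif drain_end < i <= drain_end + ramp:
            sizes.append(i - drain_end + 1)      # grow back to full size
        else:
            sizes.append(full_window)
    return sizes
```

You would then average each sample over its scheduled window size instead of a fixed one.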
If you need something more advanced, and you have more detailed data, you can assume that the bumps have a specific shape (due to the shape of the tank in which the fuel oscillates). You could then FFT the signal, remove the frequencies specific to the bumps, and reconstruct it with an inverse FFT. If you want to avoid the complex domain, you could try the same with a DCT. However, for such tasks we do not have enough measured data.
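A minimal sketch of the FFT approach with NumPy, assuming you have measured the slosh oscillation's frequency band on your own data (the band limits here are placeholders):

```python
import numpy as np

def remove_bump_band(signal, sample_rate, f_lo, f_hi):
    """Zero out the frequency band [f_lo, f_hi] Hz (assumed to contain the
    slosh oscillation) and reconstruct the signal by inverse FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[(freqs >= f_lo) & (freqs <= f_hi)] = 0  # cut the bump band
    return np.fft.irfft(spectrum, n=len(signal))
```

Unlike a sliding average, this only touches the bump band, so slow trends (consumption) and the DC level pass through unchanged; a sharp drain edge still spreads energy across many frequencies, though, so it gets somewhat distorted.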
Are there ways to determine when fueling/draining is occurring? If so, you could dynamically change your algorithm at those times.
Otherwise, I would recommend using exponential smoothing.
Let d (0 <= d < 1) be the weight factor for the previous value. Then displayed_value = prev_value*d + new_value*(1-d).
With a proper weight factor, it would seem the "bumpiness" would be removed, yet at the same time the result would reflect fuel events.
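The formula above takes only a few lines (the function name and the default d = 0.9 are my own choices for illustration):

```python
def exp_smooth(samples, d=0.9):
    """Exponential smoothing: displayed = prev*d + new*(1-d), 0 <= d < 1.
    A larger d gives smoother output but a slower response to fuel events."""
    out = [samples[0]]  # seed with the first sample
    for s in samples[1:]:
        out.append(out[-1] * d + s * (1 - d))
    return out
```

Note that a large jump (refuel/drain) still shows up immediately, just damped; it converges to the new level over roughly 1/(1-d) samples.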
This isn't the only option, more of an example algorithm, but I hope you find it useful.
Small edit: I had not realized that exponential smoothing had a proper name. I had merely used the technique when displaying frame rates within games I create. So, thank you Kemper.
As I understand it, you want the small variations to disappear but the big jumps to be kept without smoothing. The moving median is probably what you are looking for: it preserves big jumps without smoothing them (its edge-preserving property).
I am not sure it is the best method for you. I would have to see your data.
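A straightforward moving-median sketch using the standard library (window size is a placeholder you would tune to your bump width):

```python
from statistics import median

def moving_median(samples, window=9):
    """Moving median: smooths small ripples but preserves step edges.
    A genuine jump survives as soon as it fills more than half the window."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(median(samples[lo:hi]))  # window is clipped at the ends
    return out
```

Run on a clean step (e.g. a sudden drain), the output reproduces the step exactly, which is what distinguishes it from a sliding average.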