Imagine that I have a set of measurements of x that are taken by many processes, x0 ... xN, at times t0 ... tN.
You could try this:
Keep a running estimator zn, updated at each event as:
zn = (zn-1 + κ) · e^(-κ·(tn - tn-1))
This will converge towards the event rate, in s⁻¹. A slightly better estimator (there is still an error depending on whether you compute the estimate just before or just after an event) is then:
wn = zn · e^(κ/(2·zn))

(zn is evaluated just before an event, so for a regular event stream it sits about κ/2 below the true rate; multiplying by e^(κ/(2·zn)) ≈ 1 + κ/(2·zn) compensates for that.)
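For concreteness, here is a minimal sketch of this estimator in Python. The class and method names (`RateEstimator`, `update`, `rate`) and the timestamps-in-seconds convention are my own choices for illustration; only the two formulas come from the description above.

```python
import math

class RateEstimator:
    """Exponentially decaying event-rate estimator, in s^-1 (illustrative sketch)."""

    def __init__(self, kappa=0.01, z0=0.0):
        self.kappa = kappa  # smoothing constant kappa, in s^-1 (smaller = smoother)
        self.z = z0         # raw estimator zn; seed z0 > 0 for faster convergence
        self.last_t = None  # timestamp of the previous event, in seconds

    def update(self, t):
        """Record one event at time t (seconds) and return the corrected rate."""
        if self.last_t is not None:
            # zn = (zn-1 + kappa) * e^(-kappa * (tn - tn-1))
            self.z = (self.z + self.kappa) * math.exp(-self.kappa * (t - self.last_t))
        self.last_t = t
        return self.rate()

    def rate(self):
        """Corrected estimate wn = zn * e^(kappa / (2 * zn))."""
        if self.z <= 0.0:
            return 0.0  # no rate information yet
        return self.z * math.exp(self.kappa / (2.0 * self.z))
```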
In your example it will converge to 2 s⁻¹ (the inverse of 500 ms). The constant κ controls the smoothing and is in s⁻¹; small values smooth more. If your events arrive roughly once per second, a value of 0.01 s⁻¹ would be a reasonable starting point.
This method has a starting bias; z0 could be set to an estimate of the rate for faster convergence. Small values of κ will keep that bias around longer.
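Continuing with the sketch above, here is a quick (again hypothetical) check of the 500 ms example:

```python
est = RateEstimator(kappa=0.01)   # kappa = 0.01 s^-1, as suggested above
for n in range(100_000):
    est.update(0.5 * n)           # one event every 500 ms
print(f"{est.rate():.3f} s^-1")   # prints ~2.000 s^-1
```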
There are much more powerful ways of analyzing Poisson-like event streams, but they often require large buffers. Frequency analysis, such as the Fourier transform, is one of them.