I have another question that I was hoping someone could help me with.
I'm using the Jensen-Shannon divergence to measure the similarity between two probability distributions.
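For reference, here is a rough sketch of the kind of implementation being discussed, assuming list-based distributions, math.log, and helper names KL_divergence and JSD (the original code isn't reproduced above, so the exact details here are assumptions):

    from math import log

    def KL_divergence(p, q):
        # Kullback-Leibler divergence D(P || Q) of two equal-length probability lists;
        # terms where p[x] is zero are skipped, since they contribute nothing.
        return sum(p[x] * log(p[x] / q[x]) for x in range(len(p))
                   if p[x] != 0.0 and p[x] != 0)

    def JSD(p, q):
        # Jensen-Shannon divergence: the average KL divergence of p and q
        # to their pointwise midpoint distribution m.
        m = [0.5 * (p[x] + q[x]) for x in range(len(p))]
        return 0.5 * KL_divergence(p, m) + 0.5 * KL_divergence(q, m)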
Get some data for distributions with known divergence and compare your results against those known values.
BTW: the sum in KL_divergence may be rewritten using the built-in zip function like this:
sum(_p * log(_p / _q) for _p, _q in zip(p, q) if _p != 0)
This does away with lots of "noise" and is also much more "pythonic". The double comparison with 0.0 and 0 is not necessary.
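Putting both suggestions together, a small sanity check against distributions with known divergence could look like this (the JSD helper and test distributions are illustrative; with math.log the results are in nats, so the expected value for completely disjoint distributions is log(2) ≈ 0.6931):

    from math import log

    def KL_divergence(p, q):
        # zip-based rewrite: terms where _p is zero are skipped
        return sum(_p * log(_p / _q) for _p, _q in zip(p, q) if _p != 0)

    def JSD(p, q):
        # Jensen-Shannon divergence via the midpoint distribution m
        m = [0.5 * (_p + _q) for _p, _q in zip(p, q)]
        return 0.5 * KL_divergence(p, m) + 0.5 * KL_divergence(q, m)

    # Identical distributions: the divergence should be exactly 0.
    print(JSD([0.5, 0.5], [0.5, 0.5]))   # 0.0
    # Completely disjoint distributions: the divergence should be log(2).
    print(JSD([1.0, 0.0], [0.0, 1.0]))   # 0.6931...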