Jensen-Shannon Divergence

醉话见心 2021-01-31 05:22

I have another question that I was hoping someone could help me with.

I'm using the Jensen-Shannon divergence to measure the similarity between two probability distributions.
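
The Jensen-Shannon divergence of p and q is the average of the KL divergences of p and q to their midpoint m = (p + q) / 2. A minimal sketch of such a computation (plain Python, natural logarithms, distributions given as equal-length sequences of probabilities; the names KL_divergence and JSD are illustrative):

    from math import log

    def KL_divergence(p, q):
        """Kullback-Leibler divergence D(p || q)."""
        total = 0.0
        for i in range(len(p)):
            if p[i] != 0:  # 0 * log(0) is taken as 0, so skip zero terms
                total += p[i] * log(p[i] / q[i])
        return total

    def JSD(p, q):
        """Jensen-Shannon divergence of p and q via their midpoint m."""
        m = [0.5 * (p_i + q_i) for p_i, q_i in zip(p, q)]
        return 0.5 * KL_divergence(p, m) + 0.5 * KL_divergence(q, m)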

5 Answers
  •  醉话见心
    2021-01-31 06:25

    Get some data for distributions with known divergence and compare your results against those known values.
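
    For example: identical distributions must give exactly 0, completely disjoint distributions must give the maximum, ln(2) nats, and an arbitrary pair can be cross-checked against SciPy, whose jensenshannon returns the Jensen-Shannon distance (the square root of the divergence). A sketch, assuming the JSD function sketched in the question above:

    from math import log, isclose
    from scipy.spatial.distance import jensenshannon

    # Identical distributions: divergence is exactly 0.
    assert isclose(JSD([0.5, 0.5], [0.5, 0.5]), 0.0, abs_tol=1e-12)

    # Completely disjoint distributions: the maximum value, ln(2) nats.
    assert isclose(JSD([1.0, 0.0], [0.0, 1.0]), log(2))

    # SciPy returns the Jensen-Shannon *distance* (the square root of
    # the divergence), so square it before comparing.
    p, q = [0.1, 0.4, 0.5], [0.8, 0.15, 0.05]
    assert isclose(JSD(p, q), jensenshannon(p, q) ** 2, rel_tol=1e-9)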

    BTW: the sum in KL_divergence may be rewritten using the zip built-in function like this:

    sum(_p * log(_p / _q) for _p, _q in zip(p, q) if _p != 0)
    

    This does away with a lot of "noise" and is also much more "Pythonic". The double comparison with 0.0 and 0 is unnecessary: 0.0 == 0 in Python, so a single _p != 0 test covers both.
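
    Dropped into a full function, the rewrite might look like this (an illustrative sketch):

    from math import log

    def KL_divergence(p, q):
        """D(p || q) for two equal-length sequences of probabilities."""
        # Zero-probability terms in p contribute nothing (0 * log 0 -> 0),
        # so the single `if _p != 0` guard is all that is needed.
        return sum(_p * log(_p / _q) for _p, _q in zip(p, q) if _p != 0)

    print(KL_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.5108 nats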
