Jensen-Shannon Divergence

醉话见心 2021-01-31 05:22

I have another question that I was hoping someone could help me with.

I'm using the Jensen-Shannon-Divergence to measure the similarity between two probability distributions.

5 Answers
  •  迷失自我
    2021-01-31 06:06

    Note that scipy's entropy, when called with two arguments as below, computes the Kullback-Leibler divergence.

    See: http://en.wikipedia.org/wiki/Jensen%E2%80%93Shannon_divergence
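
    For reference, the definition from the linked article, with $M$ the midpoint distribution:

    $$\mathrm{JSD}(P \parallel Q) = \tfrac{1}{2}\,D_{\mathrm{KL}}(P \parallel M) + \tfrac{1}{2}\,D_{\mathrm{KL}}(Q \parallel M), \qquad M = \tfrac{1}{2}(P + Q)$$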

    #!/usr/bin/env python
    from scipy.stats import entropy
    from numpy.linalg import norm
    import numpy as np

    def JSD(P, Q):
        # L1-normalize so both inputs are valid probability distributions
        _P = P / norm(P, ord=1)
        _Q = Q / norm(Q, ord=1)
        # M is the midpoint (average) distribution
        _M = 0.5 * (_P + _Q)
        # JSD(P || Q) = (KL(P || M) + KL(Q || M)) / 2,
        # where entropy(p, q) computes KL(p || q)
        return 0.5 * (entropy(_P, _M) + entropy(_Q, _M))


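    A quick usage sketch (the example values are mine, not from the question). entropy uses the natural log by default, so the result is in nats and bounded by ln 2:

    P = np.array([0.5, 0.5])
    Q = np.array([0.9, 0.1])

    print(JSD(P, Q))  # ~0.1018 nats
    print(JSD(Q, P))  # same value, since JSD is symmetric
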
    Also note that the test case in the question looks erroneous: the p distribution does not sum to 1.0.

    See: http://www.itl.nist.gov/div898/handbook/eda/section3/eda361.htm
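
    That said, since JSD above L1-normalizes its inputs, an unnormalized test vector is rescaled before the divergence is computed. A quick illustration with made-up values:

    p = np.array([0.10, 0.40, 0.35, 0.25])  # sums to 1.10, not 1.0
    print(p.sum())             # ~1.1
    print(p / norm(p, ord=1))  # the rescaled distribution JSD actually uses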
