Entropy estimation from discrete variable

Submitted by 青春壹個敷衍的年華 on 2021-02-11 16:46:52

Question


I am trying to estimate the conditional entropy of 3 variables using the pyitlib package for Python, but I am encountering an error.

My code:

import numpy as np
from pyitlib import discrete_random_variable as drv

X=[0,1,1,0,1,0,1,0,0,1,0,0]
Y=[0,1,1,0,0,0,1,0,0,1,1,0]
Z=[1,0,0,1,1,0,0,1,1,0,0,1]

a=drv.entropy_conditional('X','Y','Z')
print(a)

The error:

Warning (from warnings module):
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\lib\site-packages\numpy\lib\arraysetops.py", line 518
    mask &= (ar1 != a)
FutureWarning: elementwise comparison failed; returning scalar instead, but in the future will perform elementwise comparison
Traceback (most recent call last):
  File "C:\Users\user\Downloads\testing.py", line 8, in <module>
    a=drv.entropy_conditional('X','Y','Z')
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\lib\site-packages\pyitlib\discrete_random_variable.py", line 3495, in entropy_conditional
    fill_value_Alphabet_Y))
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\lib\site-packages\pyitlib\discrete_random_variable.py", line 4689, in _map_observations_to_integers
    Fill_values = [L.transform(np.atleast_1d(f)) for f in Fill_values]
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\lib\site-packages\pyitlib\discrete_random_variable.py", line 4689, in <listcomp>
    Fill_values = [L.transform(np.atleast_1d(f)) for f in Fill_values]
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\lib\site-packages\sklearn\preprocessing\label.py", line 257, in transform
    _, y = _encode(y, uniques=self.classes_, encode=True)
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\lib\site-packages\sklearn\preprocessing\label.py", line 110, in _encode
    return _encode_numpy(values, uniques, encode)
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\lib\site-packages\sklearn\preprocessing\label.py", line 53, in _encode_numpy
    % str(diff))
ValueError: y contains previously unseen labels: [-1]

I am trying to find the conditional mutual information between X, Y, and Z (values given above) using the formula:

I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)

The expected conditional mutual information value is 0.011.
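
For reference, this formula can be evaluated term by term with pyitlib's entropy_joint, which takes a 2D array whose rows are variables and whose columns are observations. The following is a minimal sketch under that assumption, using the data above (the names H_XZ, H_YZ, etc. are just illustrative):

import numpy as np
from pyitlib import discrete_random_variable as drv

X = np.array([0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0])
Y = np.array([0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0])
Z = np.array([1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1])

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z), all in bits (base 2 by default)
H_XZ = drv.entropy_joint(np.vstack((X, Z)))
H_YZ = drv.entropy_joint(np.vstack((Y, Z)))
H_XYZ = drv.entropy_joint(np.vstack((X, Y, Z)))
H_Z = drv.entropy(Z)

print(H_XZ + H_YZ - H_XYZ - H_Z)

pyitlib also exposes information_mutual_conditional, which should return the same quantity directly if it is available in your installed version.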


Answer 1:


The method entropy_conditional expects arrays, not strings, hence the error you receive. You can simply change the line to:

a = drv.entropy_conditional(X,Y,Z)

The pyitlib documentation for entropy_conditional describes the expected argument format.
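
For completeness, here is a minimal runnable sketch with the data from the question, passing the observation arrays rather than the strings 'X', 'Y', 'Z' (shown in the two-argument form, which computes H(X|Y) in bits by default):

import numpy as np
from pyitlib import discrete_random_variable as drv

X = np.array([0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0])
Y = np.array([0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0])

# Passing the arrays themselves avoids the "previously unseen labels" error,
# which was caused by treating the strings as observation data.
a = drv.entropy_conditional(X, Y)  # H(X|Y)
print(a)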




Answer 2:


This question appears to have been subsumed by a near-duplicate; please see this response on how to compute conditional entropy.
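
In case the linked response is not reachable, the usual way to obtain the conditional entropy H(X|Y,Z) with pyitlib is via the chain rule H(X|Y,Z) = H(X,Y,Z) - H(Y,Z). A short sketch of that approach (not necessarily identical to the linked answer):

import numpy as np
from pyitlib import discrete_random_variable as drv

X = np.array([0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0])
Y = np.array([0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0])
Z = np.array([1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1])

# Chain rule: H(X | Y, Z) = H(X, Y, Z) - H(Y, Z)
H_XYZ = drv.entropy_joint(np.vstack((X, Y, Z)))
H_YZ = drv.entropy_joint(np.vstack((Y, Z)))
print(H_XYZ - H_YZ)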



Source: https://stackoverflow.com/questions/55400346/entropy-estimation-from-discrete-variable
