```
kbriggs:~/Downloads/NPEET> python3 test.py
For a uniform distribution with width alpha, the differential entropy is log_2 alpha, setting alpha = 2
and using k=1, 2, 3, 4, 5
Traceback (most recent call last):
  File "./test.py", line 16, in <module>
    print("result:", [ee.entropy([[2 * random.random()] for i in range(1000)], k=j + 1) for j in range(5)])
  File "./test.py", line 16, in <listcomp>
    print("result:", [ee.entropy([[2 * random.random()] for i in range(1000)], k=j + 1) for j in range(5)])
  File "/home/kbriggs/Downloads/NPEET/entropy_estimators.py", line 28, in entropy
    return (const + d * np.mean(map(log, nn))) / log(base)
  File "/usr/local/lib/python3.4/dist-packages/numpy/core/fromnumeric.py", line 2909, in mean
    out=out, **kwargs)
  File "/usr/local/lib/python3.4/dist-packages/numpy/core/_methods.py", line 82, in _mean
    ret = ret / rcount
TypeError: unsupported operand type(s) for /: 'map' and 'int'
```
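This first failure is a plain Python 2 → 3 incompatibility: in Python 3, `map()` returns a lazy iterator rather than a list, so `np.mean()` ends up trying to divide a `map` object by an integer. One minimal fix, sketched under the assumption that line 28 of `entropy_estimators.py` reads exactly as the traceback shows (and that `log` there is `math.log`):

```python
# entropy(), entropy_estimators.py line 28:
# materialize the iterator so np.mean() receives a sequence
# (works under both Python 2 and Python 3)
return (const + d * np.mean(list(map(log, nn)))) / log(base)

# or, equivalently, let numpy vectorize the natural log itself:
return (const + d * np.mean(np.log(nn))) / log(base)
```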
```
kbriggs:~/Downloads/NPEET> python2 test.py
For a uniform distribution with width alpha, the differential entropy is log_2 alpha, setting alpha = 2
and using k=1, 2, 3, 4, 5
('result:', [0.95063690299507952, 0.98051458362141108, 1.0803462913574611, 1.0316551234094444, 1.0289725544677049])
```
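For reference, the target value here is log_2 alpha = log_2 2 = 1 bit, so all five estimates (k = 1, ..., 5) land within about 0.08 bits of the truth.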
```
Gaussian random variables
Conditional Mutual Information
covariance matrix
[[4 3 1]
 [3 4 1]
 [1 1 2]]
('true CMI(x:y|z)', 0.5148736716970265)
('samples used', [10, 25, 50, 100, 200])
('estimated CMI', [0.24721094773861269, 0.39550091844389834, 0.46211431227905897, 0.48994541664326197, 0.49993287186420526])
('95% conf int. (a, b) means (mean - a, mean + b) is the interval\n', [(0.32947891495120885, 0.47883105656907937), (0.42410443410041138, 0.40553319741348437), (0.29607520550148525, 0.29646667472554578), (0.17646000212101254, 0.19139043703562886), (0.1623733550388789, 0.19292824321772967)])
Mutual Information
('true MI(x:y)', 0.5963225389711981)
('samples used', [10, 25, 50, 100, 200])
('estimated MI', [0.32218586252030301, 0.54386805987295483, 0.59630897787131887, 0.60762939695898355, 0.60418593716673841])
('95% conf int.\n', [(0.42363251380820954, 0.46980791508516928), (0.46583034399247247, 0.50990786157500079), (0.35170125121037665, 0.33635610406503746), (0.23654160340493391, 0.30032100823502828), (0.2007355329953654, 0.17193438029361319)])
```
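Both "true" values can be checked against the closed-form expressions for Gaussian (conditional) mutual information; a quick sketch (numpy, base-2 logs to match the output above):

```python
import numpy as np

# Covariance of (x, y, z), as printed above.
S = np.array([[4., 3., 1.],
              [3., 4., 1.],
              [1., 1., 2.]])
det = np.linalg.det

# For jointly Gaussian variables (in bits):
#   I(X;Y)   = 0.5 * log2( det(S_xx) * det(S_yy) / det(S_xy) )
#   I(X;Y|Z) = 0.5 * log2( det(S_xz) * det(S_yz) / (det(S_z) * det(S)) )
S_xy = S[:2, :2]
mi = 0.5 * np.log2(S[0, 0] * S[1, 1] / det(S_xy))

S_xz = S[np.ix_([0, 2], [0, 2])]   # covariance of (x, z)
S_yz = S[np.ix_([1, 2], [1, 2])]   # covariance of (y, z)
cmi = 0.5 * np.log2(det(S_xz) * det(S_yz) / (S[2, 2] * det(S)))

print(mi)   # 0.59632... = 'true MI(x:y)' above
print(cmi)  # 0.51487... = 'true CMI(x:y|z)' above
```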
```
If you permute the indices of x, MI(X:Y) should be 0
('samples used', [10, 25, 50, 100, 200])
('estimated MI', [0.032435448506589186, -0.027013576228861892, -0.0048799193000058135, 0.0023174460892350754, -0.0002141277047037321])
('95% conf int.\n', [(0.28988781354141774, 0.41434201574331025), (0.24203605944116111, 0.29849816049646066), (0.18081726377075832, 0.18040335534919902), (0.15879645329878422, 0.22733498191676946), (0.13263900209136867, 0.13325413690339941)])
```
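The permutation check works because shuffling x breaks the (x_i, y_i) pairing while leaving both marginal distributions intact, so the true MI of the shuffled pair is exactly 0, and the estimates above are all consistent with that. Roughly (assuming NPEET exposes the continuous estimator as `ee.mi`; the exact call in test.py may differ):

```python
import random

random.shuffle(x)    # x_i is no longer paired with y_i
print(ee.mi(x, y))   # should fluctuate around 0
```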
```
Test of the discrete entropy estimators
For z = y xor x, with x, y uniform random binary, we should get H(x) = H(y) = H(z) = 1, I(x:y) etc. = 0, and I(x:y|z) = 1
Traceback (most recent call last):
  File "./test.py", line 116, in <module>
    print("H(x), H(y), H(z)", ee.entropyd(x), ee.entropyd(y), ee.entropyd(z))
  File "/home/kbriggs/Downloads/NPEET/entropy_estimators.py", line 114, in entropyd
    return entropyfromprobs(hist(sx), base=base)
  File "/home/kbriggs/Downloads/NPEET/entropy_estimators.py", line 149, in hist
    sx = discretize(sx)
  File "/home/kbriggs/Downloads/NPEET/entropy_estimators.py", line 280, in discretize
    return [discretize_one(x) for x in xs]
  File "/home/kbriggs/Downloads/NPEET/entropy_estimators.py", line 275, in discretize_one
    if len(x) > 1:
TypeError: object of type 'int' has no len()
```
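This second failure is independent of the first: the discrete tests pass plain ints (e.g. z = x xor y on binary samples), but `discretize_one()` assumes every sample supports `len()`. A minimal sketch of a guard; only the `if len(x) > 1:` line is visible in the traceback, so the two sequence branches below are assumptions about what the function otherwise does:

```python
def discretize_one(x):
    # The discrete tests pass plain ints, which have no len(); let
    # scalars through untouched and only inspect genuine sequences.
    if not hasattr(x, '__len__'):
        return x            # scalar sample (int, float, ...)
    if len(x) > 1:
        return tuple(x)     # assumed: multi-dim sample -> hashable key
    return x[0]             # assumed: unwrap a length-1 sample
```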