
Mutual Information

How much information one random variable carries about another random variable.


Intuition

  • A measure of the amount of information that one random variable (RV) contains about another RV
  • The reduction in uncertainty about one RV due to knowledge of another
  • The intersection of the information in X with the information in Y

Full Definition

I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)p(y)}
I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
I(X;Y) = H(X) + H(Y) - H(X,Y)

Sources:

  • Scholarpedia

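The definitions above can be checked numerically on a small joint distribution: computing I(X;Y) directly from the sum, and again from the entropy identity I(X;Y) = H(X) + H(Y) - H(X,Y), gives the same value. A minimal sketch (the joint table here is a made-up example):

```python
import numpy as np

# A small hypothetical joint distribution p(x, y).
pxy = np.array([[0.25, 0.25],
                [0.00, 0.50]])

px = pxy.sum(axis=1)  # marginal p(x)
py = pxy.sum(axis=0)  # marginal p(y)

# Direct definition: sum over cells with p(x, y) > 0.
nz = pxy > 0
mi = np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

# Entropy identity: I(X;Y) = H(X) + H(Y) - H(X,Y).
def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

mi_from_entropies = H(px) + H(py) - H(pxy.ravel())
print(mi, mi_from_entropies)  # the two values agree
```

Using log base 2 gives the result in bits; the natural log gives nats.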

Code

  1. We need a PDF estimate — e.g. bin samples of X and Y into a 2D histogram
  2. Normalize the bin counts to probability values, giving the joint probability p(x, y)
  3. Sum the joint over each axis to get the marginal distributions

pxy = bin_counts / float(np.sum(bin_counts))  # joint probability p(x, y)
px = np.sum(pxy, axis=1)  # marginal for x, summing over y
py = np.sum(pxy, axis=0)  # marginal for y, summing over x
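Putting the steps together, a minimal histogram-based estimator might look like the sketch below (the bin count of 20 and the synthetic data are assumptions for illustration; histogram estimates are biased and depend on binning):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + rng.normal(size=10_000)  # y is correlated with x

# 1. Estimate the joint PDF with a 2D histogram.
bin_counts, _, _ = np.histogram2d(x, y, bins=20)

# 2. Normalize counts to probability values.
pxy = bin_counts / np.sum(bin_counts)

# 3. Marginals from the joint.
px = np.sum(pxy, axis=1)  # marginal for x, summing over y
py = np.sum(pxy, axis=0)  # marginal for y, summing over x

# 4. I(X;Y) = sum_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ), over nonzero cells.
pxpy = np.outer(px, py)
nz = pxy > 0
mi = np.sum(pxy[nz] * np.log(pxy[nz] / pxpy[nz]))
print(mi)  # positive, since x and y are dependent
```

For independent inputs the estimate approaches zero (up to binning bias); stronger dependence drives it up.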

Supplementary

Information

Intuition

Information measures surprise: an event carries more information the less likely it is. Things that don't normally happen, happen.
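This intuition is captured by self-information (surprisal): an event with probability p carries -log p information, so rarer events carry more. A minimal sketch:

```python
import math

def surprisal(p):
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

# A fair coin flip carries 1 bit; a 1-in-1024 event carries 10 bits.
print(surprisal(0.5))       # 1.0
print(surprisal(1 / 1024))  # 10.0
```

Entropy is just the expected surprisal over a distribution, which connects this note back to the definitions above.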


