What does the Shannon mutual information measure?

The dependence between two sets of discrete random variables $\mathbf{X}$ and $\mathbf{Y}$ can be measured by their Shannon mutual information [Cover and Thomas, 1991]:
$I(\mathbf{X}: \mathbf{Y}):=\sum_{\mathbf{x}, \mathbf{y}} p(\mathbf{x}, \mathbf{y}) \log \frac{p(\mathbf{x}, \mathbf{y})}{p(\mathbf{x}) p(\mathbf{y})}$
It is always non-negative, and it equals zero exactly when $\mathbf{X}$ and $\mathbf{Y}$ are independent, i.e. when $p(\mathbf{x}, \mathbf{y}) = p(\mathbf{x}) p(\mathbf{y})$ for all $\mathbf{x}, \mathbf{y}$.
by Platinum (141,884 points)
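As a sketch of how the definition above can be evaluated numerically, the following Python function (a hypothetical helper, not from the cited text) computes the double sum directly from a joint probability table, using base-2 logarithms so the result is in bits:

```python
import numpy as np

def mutual_information(joint):
    """Shannon mutual information I(X:Y) in bits, given a joint pmf p(x, y)
    as a 2-D array whose entries sum to 1."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = joint > 0                        # convention: 0 * log 0 = 0
    ratio = joint[mask] / (px * py)[mask]   # p(x,y) / (p(x) p(y))
    return float(np.sum(joint[mask] * np.log2(ratio)))

# Two perfectly correlated fair bits share 1 bit of information:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# Two independent fair bits share none:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The two example distributions illustrate the extremes: a diagonal joint table (knowing $\mathbf{X}$ fixes $\mathbf{Y}$) gives the maximum of one bit, while a product distribution gives zero, matching the independence property noted above.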
