
InformationTheory


In information theory, log denotes log_2, so entropies are measured in bits.

Entropy

H(X) = - \sum_{x \in \mathcal{X}} p(x) \log p(x) = E_p[\log {1 \over p(x)}]
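
A minimal Python sketch of this definition (the entropy helper and the coin distributions are only illustrative):

 import math

 def entropy(p):
     """Shannon entropy H(X) in bits; p is a list of probabilities summing to 1."""
     return -sum(px * math.log2(px) for px in p if px > 0)

 print(entropy([0.5, 0.5]))  # 1.0 bit for a fair coin
 print(entropy([0.9, 0.1]))  # ~0.469 bits for a biased coin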

Joint

H(X,Y) = - \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \log p(x,y)
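
A sketch of the joint entropy in Python, with a made-up joint pmf stored as a dict keyed by (x, y):

 import math

 # Made-up joint pmf p(x, y) over X in {0, 1} and Y in {0, 1}.
 p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

 # H(X,Y) = -sum_{x,y} p(x,y) log2 p(x,y)
 H_XY = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)
 print(H_XY)  # 1.75 bits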

Conditional

H(Y|X) = \sum_{x \in \mathcal{X}} p(x) H(Y|X = x) = - \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \log p(y|x)
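
Both forms of the definition, computed on the same made-up joint pmf as above:

 import math

 def H(probs):
     return -sum(p * math.log2(p) for p in probs if p > 0)

 # Made-up joint pmf; p_x is the marginal p(x) = sum_y p(x, y).
 p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
 p_x = {0: 0.75, 1: 0.25}

 # Form 1: H(Y|X) = sum_x p(x) H(Y|X=x)
 H_YgX = sum(p_x[x] * H([p_xy[(x, y)] / p_x[x] for y in (0, 1)]) for x in (0, 1))

 # Form 2: H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x)
 H_YgX_2 = -sum(p * math.log2(p / p_x[x]) for (x, y), p in p_xy.items() if p > 0)

 print(H_YgX, H_YgX_2)  # both ~0.9387 bits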

Theorem

H(X,Y) = H(X) + H(Y|X)
H(X,Y|Z) = H(X|Z) + H(Y|X,Z)

Note: in general H(Y|X) \ne H(X|Y), but H(X) - H(X|Y) = H(Y) - H(Y|X)
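
A numerical check of the chain rule and the note above; the joint pmf is made up, but any valid pmf satisfies both identities:

 import math

 def H(probs):
     """Entropy in bits of an iterable of probabilities."""
     return -sum(p * math.log2(p) for p in probs if p > 0)

 def marginal(p_xy, axis):
     """Marginal pmf of a dict {(x, y): p} along axis 0 (X) or 1 (Y)."""
     m = {}
     for key, p in p_xy.items():
         m[key[axis]] = m.get(key[axis], 0.0) + p
     return m

 def cond_H(p_xy, given_axis):
     """Conditional entropy of the other variable given the one on given_axis."""
     m = marginal(p_xy, given_axis)
     return -sum(p * math.log2(p / m[k[given_axis]]) for k, p in p_xy.items() if p > 0)

 p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

 H_XY = H(p_xy.values())
 H_X, H_Y = H(marginal(p_xy, 0).values()), H(marginal(p_xy, 1).values())
 H_YgX, H_XgY = cond_H(p_xy, 0), cond_H(p_xy, 1)

 assert abs(H_XY - (H_X + H_YgX)) < 1e-12            # H(X,Y) = H(X) + H(Y|X)
 assert abs((H_X - H_XgY) - (H_Y - H_YgX)) < 1e-12   # H(X)-H(X|Y) = H(Y)-H(Y|X)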

Relative Entropy

D(p||q) = \sum_{x \in \mathcal{X}} p(x) \log {p(x) \over q(x)} = E_p[\log {p(x) \over q(x)}]
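
A Python sketch of relative entropy (KL divergence); the distributions p and q are arbitrary examples, and q must be positive wherever p is:

 import math

 def kl(p, q):
     """D(p||q) in bits; p and q are aligned lists of probabilities."""
     return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

 p, q = [0.5, 0.5], [0.9, 0.1]
 print(kl(p, q))  # ~0.737 bits
 print(kl(q, p))  # ~0.531 bits -- D(p||q) != D(q||p) in general
 print(kl(p, p))  # 0.0, and D(p||q) >= 0 always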

Mutual Information

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X)+H(Y)-H(X,Y) = I(Y;X)

I(X;X) = H(X)
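
A sketch computing I(X;Y) both from the entropy identity and directly as D(p(x,y) || p(x)p(y)), on a made-up correlated joint pmf:

 import math

 def H(probs):
     return -sum(p * math.log2(p) for p in probs if p > 0)

 # Made-up joint pmf with correlated X and Y; marginals are uniform.
 p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
 p_x, p_y = {0: 0.5, 1: 0.5}, {0: 0.5, 1: 0.5}

 # I(X;Y) = H(X) + H(Y) - H(X,Y)
 I_XY = H(p_x.values()) + H(p_y.values()) - H(p_xy.values())

 # Equivalently, I(X;Y) = D(p(x,y) || p(x)p(y))
 I_direct = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items() if p > 0)

 print(I_XY, I_direct)  # both ~0.278 bits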