PukiWiki contents have been moved into SONOTS Plugin (20070703)



$$\frac{n!}{m!} \leq n^{n-m}$$
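This holds for integers 0 ≤ m ≤ n, since n!/m! is a product of n−m factors, each at most n. A quick numerical sanity check (a sketch; the helper name `holds` is chosen here for illustration):

```python
from math import factorial

def holds(n, m):
    # n!/m! is the product of the (n - m) factors m+1, ..., n,
    # each at most n, hence n!/m! <= n**(n - m).
    return factorial(n) // factorial(m) <= n ** (n - m)

print(all(holds(n, m) for n in range(1, 20) for m in range(n + 1)))
```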

Markov inequality
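For a nonnegative random variable X and any a > 0, Markov's inequality states

$$\Pr(X \geq a) \leq \frac{\mathbb{E}\{X\}}{a}.$$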

Data processing inequality
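If X → Y → Z form a Markov chain (Z depends on X only through Y), then

$$I(X;Y) \geq I(X;Z),$$

so no processing of Y, deterministic or random, can increase the information it carries about X.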

Fano's inequality

$$H(X|Y) \leq H(X|\hat{X}(Y)) \leq h(\Pr(X\neq\hat{X})) + \Pr(X\neq\hat{X})\log(|\mathcal{X}|-1) \leq 1 + \Pr(X\neq\hat{X})\log|\mathcal{X}|$$

log-sum inequality
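For nonnegative numbers a_1, ..., a_n and b_1, ..., b_n,

$$\sum_{i=1}^n a_i \log\frac{a_i}{b_i} \geq \left(\sum_{i=1}^n a_i\right) \log\frac{\sum_{i=1}^n a_i}{\sum_{i=1}^n b_i},$$

with equality if and only if a_i/b_i is constant in i.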

Triangle inequality

In a normed vector space V, the triangle inequality is

$$\displaystyle \|x + y\| \leq \|x\| + \|y\| \quad \forall \, x, y \in V$$
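A numerical check with the Euclidean norm on R^3 (a sketch; the vectors and the helper name `norm` are chosen here for illustration):

```python
from math import sqrt

def norm(v):
    # Euclidean norm of a vector given as a list of floats
    return sqrt(sum(t * t for t in v))

x = [1.0, -2.0, 3.0]
y = [4.0, 0.5, -1.0]

lhs = norm([a + b for a, b in zip(x, y)])  # ||x + y||
rhs = norm(x) + norm(y)                    # ||x|| + ||y||
assert lhs <= rhs
```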

Cauchy–Schwarz inequality

The Cauchy–Schwarz inequality states that for all vectors x and y of a real or complex inner product space,

$$|\langle x,y\rangle| \leq \|x\| \cdot \|y\|.$$

(Think of <math> \langle x,y \rangle = \|x\|\|y\|\cos\theta </math>)

In Euclidean space,

$$\left(\sum_{i=1}^n x_i y_i\right)^2\leq \left(\sum_{i=1}^n x_i^2\right) \left(\sum_{i=1}^n y_i^2\right).$$
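The Euclidean form can be verified numerically (a sketch; the test vectors are chosen here for illustration):

```python
x = [1.0, 2.0, 3.0]
y = [-1.0, 0.5, 4.0]

dot = sum(a * b for a, b in zip(x, y))       # sum x_i y_i
lhs = dot ** 2                               # (sum x_i y_i)^2
rhs = sum(a * a for a in x) * sum(b * b for b in y)  # (sum x_i^2)(sum y_i^2)
assert lhs <= rhs
```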

For the inner product space of square-integrable complex-valued functions

$$\left|\int f(x) \overline{g}(x)\,dx\right|^2\leq\int \left|f(x)\right|^2\,dx \cdot \int\left|g(x)\right|^2\,dx.$$

A generalization of this is the Hölder inequality.

Expectation in probability is an inner product of this form: <math>\langle X,Y \rangle = \mathbb{E}\{X\overline{Y}\}</math>.

Jensen's inequality

If f is a convex function, then

$$ f\left(\mathbb{E}\{X\}\right) \leq \mathbb{E}\{f(X)\}.$$
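A check with the convex function f(x) = x² on a small discrete distribution (the values and probabilities are chosen here for illustration):

```python
values = [0.0, 1.0, 4.0]
probs = [0.5, 0.3, 0.2]

def expect(xs):
    # expectation of the list xs under the distribution probs
    return sum(p * v for p, v in zip(probs, xs))

f = lambda t: t * t  # convex
# Jensen: f(E[X]) <= E[f(X)]
assert f(expect(values)) <= expect([f(v) for v in values])
```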

Chebyshev inequality

Values far from the mean are unlikely, whatever the probability distribution: no more than <math>1/k^2</math> of the values are more than k standard deviations away from the mean.

Let X be a random variable with expected value <math>\mu</math> and finite variance <math>\sigma^2</math>. Then for any real number <math>k > 0</math>,

$$\Pr(\left|X-\mu\right|\geq k\sigma)\leq\frac{1}{k^2}.$$

Only the cases <math>k > 1</math> provide useful information. This can be equivalently stated as

$$\Pr(\left|X-\mu\right|\geq \alpha)\leq\frac{\sigma^2}{\alpha^2}.$$
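A check of this second form on a small discrete distribution, comparing the exact tail probability against the bound (the distribution and the threshold are chosen here for illustration):

```python
values = [-2.0, 0.0, 2.0]
probs = [0.25, 0.5, 0.25]

mu = sum(p * v for p, v in zip(probs, values))                # mean
var = sum(p * (v - mu) ** 2 for p, v in zip(probs, values))   # sigma^2

alpha = 1.5
# exact tail probability Pr(|X - mu| >= alpha)
tail = sum(p for p, v in zip(probs, values) if abs(v - mu) >= alpha)
assert tail <= var / alpha ** 2
```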