# Hoeffding’s Inequality

Let ${X_1,\dots,X_n}$ be independent zero-mean real-valued random variables with ${a_i \leq X_i \leq b_i}$ for ${i=1,\dots,n}$, where ${a_1, b_1,\dots,a_n, b_n}$ are constants, and let ${S_n= \sum\limits_{i=1}^{n} X_i}$. Then, for every ${t>0}$,

$\displaystyle \boxed{ P(|S_n| \geq t) \leq 2 \exp \left( - \frac{2t^2}{\sum\limits_{i=1}^{n} (b_i-a_i)^2} \right) }$
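As a quick sanity check (a Monte Carlo sketch, not part of any proof; the uniform distribution on ${[-1,1]}$ and the parameters below are arbitrary choices), the empirical tail probability should sit below the bound:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 50, 100_000, 10.0

# X_i uniform on [-1, 1]: zero mean, with a_i = -1, b_i = 1
X = rng.uniform(-1.0, 1.0, size=(trials, n))
S = X.sum(axis=1)

empirical = np.mean(np.abs(S) >= t)
# sum_i (b_i - a_i)^2 = n * 2^2 = 4n
bound = 2 * np.exp(-2 * t**2 / (4 * n))

print(empirical, bound)  # the Hoeffding bound should dominate the empirical tail
```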

# An inequality of the mean involving truncation

Let ${X_1,X_2,\dots}$ be i.i.d. random variables with ${\mathbb{E}[|X_i|] < \infty}$ and let ${Y_k = X_k \mathbb{I}_{(|X_k| \leq k)}}$. Then

$\displaystyle \boxed{ \mathbb{E}[|X_1|] \geq \sum_{k=1}^{\infty} \frac{var(Y_k)}{4 k^2 } }$
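The inequality can be probed numerically (a Monte Carlo sketch; the Pareto distribution with tail index 1.5, which has finite mean but infinite variance so that the truncation genuinely matters, is an arbitrary test case):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Pareto with tail index 1.5: finite mean (= 3), infinite variance
alpha = 1.5
X = (1.0 - rng.random(N)) ** (-1.0 / alpha)   # inverse-CDF sampling, X >= 1

# Left side: partial sum of var(Y_k) / (4 k^2); terms beyond k = 500 are negligible here
lhs = sum(np.var(np.where(np.abs(X) <= k, X, 0.0)) / (4 * k**2)
          for k in range(1, 501))
rhs = np.mean(np.abs(X))   # estimates E|X_1| = 3

print(lhs, rhs)  # the truncated-variance sum should stay below E|X_1|
```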

Proof:

First we prove the following useful result.

If ${X \geq 0}$ and ${ a > 0}$ then

$\displaystyle \boxed{ \mathbb{E}[X^a] = \int_{0}^{\infty} a x^{a-1} P(X >x) dx }$

$\displaystyle \begin{array}{rcl} \int_{0}^{\infty} a x^{a-1} P(X >x)\, dx &=& \int_{0}^{\infty} \int_{\Omega} a x^{a-1} \mathbb{I}_{(X>x)}\, dP\, dx \\ &=& \int_{\Omega} \int_{0}^{\infty} a x^{a-1} \mathbb{I}_{(X>x)}\, dx\, dP \\ &=& \int_{\Omega} \int_{0}^{X} a x^{a-1}\, dx\, dP = \int_{\Omega} X^a\, dP = \mathbb{E}[X^a] \end{array}$

where the interchange of integrals is justified by Tonelli's theorem, the integrand being nonnegative.

Note that the same lemma can be found in Feller, Vol. 2 (p. 150), as

$\displaystyle \mathbb{E}[X^a]= \int_{0}^{\infty} x^a F \{ dx \} = a \int_{0}^{\infty} x^{a-1} [ 1- F(x)] dx$
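As an illustration of the lemma (a numerical sketch with an arbitrary test case), take ${X \sim \text{Exp}(1)}$ and ${a=2}$, so that ${P(X>x)=e^{-x}}$ and ${\mathbb{E}[X^2]=2}$:

```python
import numpy as np

# Midpoint-rule quadrature of  ∫_0^∞ a x^{a-1} P(X > x) dx  for X ~ Exp(1), a = 2
a = 2.0
dx = 0.001
x = np.arange(dx / 2, 60.0, dx)        # midpoints; the tail beyond 60 is negligible
integral = np.sum(a * x ** (a - 1) * np.exp(-x)) * dx

print(integral)  # should be close to E[X^2] = 2
```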

# Rio’s Inequality

Let ${X}$ and ${Y}$ be two integrable real-valued random variables and let ${ Q_X(u) = \inf\{t: P(|X|>t) \leq u \}}$ be the quantile function of ${|X|}$. Then, if ${Q_X Q_Y}$ is integrable over ${ (0,1)}$, we have

$\displaystyle \boxed{|Cov(X,Y)| \leq 2 \int\limits_{0}^{2a} Q_X(u) Q_Y(u) du}$

where ${ a= a(\sigma(X), \sigma(Y)) = \sup\limits_{\substack{B \in \sigma(X) \\ C \in \sigma(Y)}} |Cov(\mathbb{I}_{B},\mathbb{I}_{C})|}$ is the ${\alpha}$-mixing coefficient between the ${\sigma}$-algebras generated by ${X}$ and ${Y}$.

Proof: Set ${X^{+} = \max(0,X)}$ and ${X^{-} = \max(0,-X)}$; then

$\displaystyle Cov(X,Y) = Cov(X^{+},Y^{+}) + Cov(X^{-},Y^{-}) - Cov(X^{+},Y^{-}) - Cov(X^{-},Y^{+})$

since ${ X = (X^{+} - X^{-})}$ and ${ Y = (Y^{+} - Y^{-})}$

note also that

$\displaystyle Cov(X^+,Y^+) = \int \int_{\mathbb{R}^{2}_{+}} [P(X>u, Y> v) - P(X>u)P(Y> v)]\, du\, dv$

which implies that

$\displaystyle |Cov(X^+,Y^+)| \leq \int \int_{\mathbb{R}^{2}_{+}} \inf (a, P(X>u),P(Y> v))\, du\, dv$
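Hoeffding's covariance identity used above can be checked by simulation (a sketch; the correlated Gaussian pair, grid size, and cutoff are arbitrary choices, and the double integral is approximated on a finite grid):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
Z = rng.standard_normal((n, 2))
X = Z[:, 0]
Y = 0.6 * Z[:, 0] + 0.8 * Z[:, 1]              # Corr(X, Y) = 0.6

# Direct estimate of Cov(X^+, Y^+)
Xp, Yp = np.maximum(X, 0.0), np.maximum(Y, 0.0)
cov_direct = np.mean(Xp * Yp) - np.mean(Xp) * np.mean(Yp)

# Grid estimate of the double integral over (0, 6)^2; the N(0,1) tail beyond 6 is negligible
m, hi = 300, 6.0
edges = np.linspace(0.0, hi, m + 1)
du = hi / m

# Joint survival S[i, j] ≈ P(X > edges[i], Y > edges[j]) via a reversed double cumsum
H, _, _ = np.histogram2d(X, Y, bins=[edges, edges])
S = H[::-1, ::-1].cumsum(axis=0).cumsum(axis=1)[::-1, ::-1] / n

# Marginal survival functions at the same grid points
SX = 1.0 - np.searchsorted(np.sort(X), edges[:-1]) / n
SY = 1.0 - np.searchsorted(np.sort(Y), edges[:-1]) / n

cov_integral = np.sum(S - SX[:, None] * SY[None, :]) * du * du

print(cov_direct, cov_integral)  # the two estimates should agree closely
```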

# Kolmogorov’s Maximal Inequality

1. Let ${X_1, X_2,\dots,X_n}$ be independent random variables with ${ \mathbb{E}[X_i]=0}$ and ${\mathbb{E}[X_i^2]< \infty }$. Set ${S_n = \sum_{i=1}^{n} X_i}$. Then ${\forall \varepsilon > 0}$

$\displaystyle \boxed{ P \left( \max_{1 \leq k \leq n} |S_k| \geq \varepsilon \right) \leq \frac{\mathbb{E}[S_n^2]}{\varepsilon^2} }$
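A quick simulation sketch of the inequality (the Rademacher steps are an arbitrary choice; they give ${\mathbb{E}[S_n^2] = n}$):

```python
import numpy as np

rng = np.random.default_rng(3)
trials, n, eps = 100_000, 100, 15.0

# Rademacher steps: mean 0, variance 1, so E[S_n^2] = n
X = rng.choice([-1.0, 1.0], size=(trials, n))
S = np.cumsum(X, axis=1)                       # S[:, k-1] = S_k, the partial sums

lhs = np.mean(np.max(np.abs(S), axis=1) >= eps)
rhs = n / eps**2

print(lhs, rhs)  # the maximal-inequality bound should dominate
```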

Proof: Let

$\displaystyle \begin{array}{rcl} A &\equiv& \{ \max_{1 \leq k \leq n} |S_k| \geq \varepsilon \} ,\\ A_k &\equiv& \{ |S_i| < \varepsilon, i=1,...,k-1,|S_k| \geq \varepsilon \}, \qquad 1 \leq k \leq n \end{array}$

Notice that ${\cup_{k=1}^{n}A_k = A}$ and