On One-Sided Kolmogorov-Type Inequalities


Let \(X_1,X_2,\ldots,X_n\) be independent random variables with \(E(X_i)=0\) and \(Var(X_i)<\infty\), \(i=1,\ldots,n\). Denote

\[S_n=\sum_{i=1}^n X_i.\]

The well-known Kolmogorov inequality states that for all \(\varepsilon> 0\), \[P\left(\max_{1\le j\le n}|S_j|\ge \varepsilon\right)\le\frac{Var(S_n)}{\varepsilon^2}.\]
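
In particular, taking \(n=1\) recovers Chebyshev's inequality:

\[P(|X_1|\ge \varepsilon)\le\frac{Var(X_1)}{\varepsilon^2}.\]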

The one-sided Kolmogorov-type inequality states that for all \(\varepsilon>0\), \[ P\left(\max_{1\le j\le n}S_j\ge \varepsilon\right)\le\frac{Var(S_n)}{\varepsilon^2+Var(S_n)}.\] We prove this inequality below, starting from its one-variable version.
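
Note that the one-sided bound is always strictly smaller than the Chebyshev-type bound \(Var(S_n)/\varepsilon^2\) and, unlike it, never exceeds \(1\). For instance, when \(Var(S_n)=\varepsilon^2\),

\[\frac{Var(S_n)}{\varepsilon^2+Var(S_n)}=\frac{1}{2},\qquad\text{whereas}\qquad \frac{Var(S_n)}{\varepsilon^2}=1.\]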

Proposition (Cantelli's inequality). Let \(X\) be a random variable with \(E(X)=0\) and \(Var(X)<\infty\). Then for all \(\varepsilon>0\), \[ P(X\ge \varepsilon)\le\frac{Var(X)}{\varepsilon^2+Var(X)}.\]
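
The bound is sharp: for \(\sigma^2=Var(X)>0\), one can check that the two-point distribution

\[P(X=\varepsilon)=\frac{\sigma^2}{\sigma^2+\varepsilon^2},\qquad P\left(X=-\frac{\sigma^2}{\varepsilon}\right)=\frac{\varepsilon^2}{\sigma^2+\varepsilon^2}\]

has \(E(X)=0\), \(Var(X)=\sigma^2\), and attains equality: \(P(X\ge\varepsilon)=\sigma^2/(\sigma^2+\varepsilon^2)\).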

Proof. Since \(E(X)=0\), we have \(E(\varepsilon-X)=\varepsilon\), and since \((\varepsilon-X)I_{X\ge\varepsilon}\le 0\), \[ \varepsilon=E(\varepsilon - X)=E\{(\varepsilon  - X)I_{X<\varepsilon}\}+E\{(\varepsilon  - X)I_{X\ge \varepsilon}\}\le E\{(\varepsilon  - X)I_{X< \varepsilon}\}.\] By the Cauchy–Schwarz inequality, we have \[ \varepsilon^2\le \left[E\{(\varepsilon  - X)I_{X<\varepsilon}\}\right]^2\le E(\varepsilon  - X)^2\,P(X<\varepsilon)=[\varepsilon^2+Var(X)][1-P(X\ge\varepsilon)].\] Therefore, \[P(X\ge \varepsilon)\le\frac{Var(X)}{\varepsilon^2+Var(X)}.\]
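
To spell out the Cauchy–Schwarz step: writing \((\varepsilon-X)I_{X<\varepsilon}\) as the product of \(U=\varepsilon-X\) and \(V=I_{X<\varepsilon}\), and using \(V^2=V\),

\[\left[E(UV)\right]^2\le E(U^2)\,E(V^2)=E(\varepsilon-X)^2\,P(X<\varepsilon),\qquad E(\varepsilon-X)^2=\varepsilon^2-2\varepsilon E(X)+E(X^2)=\varepsilon^2+Var(X),\]

where the last equality uses \(E(X)=0\).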

Proof of the one-sided Kolmogorov-type inequality. Let \(\Lambda=\{\max_{1\le j\le n}S_j \ge \varepsilon\}\) and \(\Lambda_k=\{\max_{1\le j <k}S_j < \varepsilon,\ S_k\ge\varepsilon\}\) for \(k=1,\ldots,n\), with \(\Lambda_1=\{S_1\ge\varepsilon\}\); then the \(\Lambda_k\) are disjoint and \(\Lambda =\bigcup_{k=1}^{n}\Lambda_k\). Since \(E(X_j)=0\), \(j=1,\ldots,n\), and by the independence of the random variables,\[\begin{array}{rcl}\varepsilon&=&E(\varepsilon -S_n)=E[(\varepsilon-S_n)I_{\Lambda}]+E[(\varepsilon-S_n)I_{\Lambda^c}]\\ &=&\sum_{k=1}^n E[(\varepsilon-S_n)I_{\Lambda_k}]+E[(\varepsilon-S_n)I_{\Lambda^c}]\\ &=&\sum_{k=1}^n E[\{(\varepsilon-S_k) -(S_n-S_k)\}I_{\Lambda_k}]+E[(\varepsilon-S_n)I_{\Lambda^c}]\\ &=&\sum_{k=1}^n E[(\varepsilon-S_k)I_{\Lambda_k}]-\sum_{k=1}^n E[(S_n-S_k)I_{\Lambda_k}]+E[(\varepsilon-S_n)I_{\Lambda^c}]\\ &=&\sum_{k=1}^n E[(\varepsilon-S_k)I_{\Lambda_k}]+E[(\varepsilon-S_n)I_{\Lambda^c}]\\ &\le&E[(\varepsilon-S_n)I_{\Lambda^c}].\end{array}\]
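
The last two steps deserve a word of justification. First, \(S_n-S_k=X_{k+1}+\cdots+X_n\) is independent of \(\Lambda_k\), which is determined by \(X_1,\ldots,X_k\) only, so

\[E[(S_n-S_k)I_{\Lambda_k}]=E(S_n-S_k)\,E(I_{\Lambda_k})=0.\]

Second, \(S_k\ge\varepsilon\) on \(\Lambda_k\), so \(E[(\varepsilon-S_k)I_{\Lambda_k}]\le 0\) for every \(k\).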

By the Cauchy–Schwarz inequality, we have

\[\varepsilon^2\le \{E[(\varepsilon-S_n)I_{\Lambda^c}]\}^2\le E[(\varepsilon-S_n)^2]\, P(\Lambda^c)=[\varepsilon^2+Var(S_n)][1-P(\Lambda)].\] Therefore,

\[ P\left(\max_{1\le j\le n}S_j\ge \varepsilon\right)\le\frac{Var(S_n)}{\varepsilon^2+Var(S_n)},\] which is the claimed inequality.
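
As a sanity check, here is a minimal Monte Carlo sketch of the maximal inequality, assuming a Python environment with NumPy; the standard normal increments and the parameter choices \(n\), trials, and \(\varepsilon\) are illustrative, not part of the statement.

```python
import numpy as np

# Monte Carlo check of P(max_j S_j >= eps) <= Var(S_n) / (eps^2 + Var(S_n))
# for independent mean-zero increments; here X_i ~ N(0,1), so Var(S_n) = n.
rng = np.random.default_rng(2024)
n, trials, eps = 20, 200_000, 3.0

X = rng.standard_normal((trials, n))   # each row is one sample path of increments
S = np.cumsum(X, axis=1)               # partial sums S_1, ..., S_n per path
empirical = np.mean(S.max(axis=1) >= eps)

var_Sn = float(n)                      # Var(S_n) = n for N(0,1) increments
bound = var_Sn / (eps**2 + var_Sn)
print(f"empirical P = {empirical:.4f}, one-sided bound = {bound:.4f}")
```

The printed empirical frequency should stay below the bound; with these choices the bound is \(20/(3^2+20)\approx 0.69\).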

Remark. The one-sided Kolmogorov-type inequality also holds for martingale difference sequences as well as for demimartingales, and the proofs are similar; for a martingale difference sequence, for instance, the only place independence enters is the identity \(E[(S_n-S_k)I_{\Lambda_k}]=0\), which follows from \(E(S_n-S_k\mid X_1,\ldots,X_k)=0\).