Basic Concept of Probability Distributions 6: Exponential Distribution

Date: 2023-03-08 18:41:48


PDF & CDF

The exponential probability density function (PDF) is $$f(x; \lambda) = \begin{cases}\lambda e^{-\lambda x} & x\geq0\\ 0 & x < 0 \end{cases}$$ The exponential cumulative distribution function (CDF) is $$F(x; \lambda) = \begin{cases}1 - e^{-\lambda x} & x\geq0\\ 0 & x < 0 \end{cases}$$
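
As a quick numerical check of these two formulas, here is a minimal sketch (assuming NumPy and SciPy are available; the rate $\lambda = 0.5$ is an arbitrary value chosen only for illustration). Note that `scipy.stats.expon` parameterizes the distribution by `scale` $= 1/\lambda$:

```python
import numpy as np
from scipy.stats import expon

lam = 0.5  # illustrative rate parameter (arbitrary choice)

def exp_pdf(x, lam):
    """f(x; lambda) = lambda * exp(-lambda * x) for x >= 0, else 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, lam * np.exp(-lam * x), 0.0)

def exp_cdf(x, lam):
    """F(x; lambda) = 1 - exp(-lambda * x) for x >= 0, else 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, 1.0 - np.exp(-lam * x), 0.0)

xs = np.array([-1.0, 0.0, 1.0, 2.0, 5.0])
# SciPy uses scale = 1/lambda for the exponential distribution.
print(np.allclose(exp_pdf(xs, lam), expon.pdf(xs, scale=1/lam)))  # True
print(np.allclose(exp_cdf(xs, lam), expon.cdf(xs, scale=1/lam)))  # True
```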

Proof:

$$ \begin{align*} F(x; \lambda) &= \int_{0}^{x}f(t; \lambda)\ dt\\ &= \int_{0}^{x}\lambda e^{-\lambda t}\ dt \\ &= \lambda\cdot\left(-{1\over\lambda}\right)\int_{0}^{x}e^{-\lambda t}\ d(-\lambda t)\\ &= -e^{-\lambda t}\Big|_{0}^{x}\\ &= 1 - e^{-\lambda x} \end{align*} $$ for $x \geq 0$, and in the limit $$F(\infty; \lambda) = \lim_{x\to\infty}F(x; \lambda) = 1$$
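
The derivation above can also be confirmed numerically by integrating the density directly; a small sketch assuming SciPy's `quad` (again with an illustrative rate $\lambda = 0.5$):

```python
import math
from scipy.integrate import quad

lam = 0.5  # illustrative rate

# Integrate the PDF from 0 to x and compare with the closed form 1 - exp(-lam * x).
for x in (0.5, 1.0, 3.0):
    integral, _ = quad(lambda t: lam * math.exp(-lam * t), 0, x)
    print(x, integral, 1 - math.exp(-lam * x))

# F(infinity) = 1: the density integrates to one over [0, infinity).
total, _ = quad(lambda t: lam * math.exp(-lam * t), 0, math.inf)
print(total)  # ~1.0
```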

Mean

The expected value is $$\mu = E[X] = {1\over\lambda}$$

Proof:

$$ \begin{align*} E\left[X^k\right] &= \int_{0}^{\infty}x^kf(x; \lambda)\ dx\\ &= \int_{0}^{\infty}x^k\lambda e^{-\lambda x}\ dx\\ &= -x^ke^{-\lambda x}\Big|_{0}^{\infty} + \int_{0}^{\infty}e^{-\lambda x}kx^{k-1}\ dx\quad\quad\quad\quad(\mbox{integration by parts})\\ &= 0 + {k\over \lambda}\int_{0}^{\infty}x^{k-1}\lambda e^{-\lambda x}\ dx\\ &= {k\over\lambda}E\left[X^{k-1}\right] \end{align*} $$ The integration by parts uses $$u= x^k\Rightarrow du = kx^{k-1}\ dx,\quad dv = \lambda e^{-\lambda x}\ dx\Rightarrow v = \int\lambda e^{-\lambda x}\ dx = -e^{-\lambda x}$$ $$\implies \int x^k\lambda e^{-\lambda x}\ dx = uv - \int v\ du = -x^ke^{-\lambda x} + \int e^{-\lambda x}kx^{k-1}\ dx$$ Since $E\left[X^0\right] = 1$, setting $k=1$ gives $$E[X]= {1\over\lambda}E\left[X^0\right] = {1\over\lambda}$$
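
The recursion $E[X^k] = (k/\lambda)E[X^{k-1}]$ and the resulting mean $1/\lambda$ can be verified numerically; a short sketch assuming SciPy's `quad`, with $\lambda = 2$ chosen arbitrarily:

```python
import math
from scipy.integrate import quad

lam = 2.0  # illustrative rate

def raw_moment(k, lam):
    """E[X^k] = integral_0^inf x^k * lam * exp(-lam * x) dx, computed numerically."""
    val, _ = quad(lambda x: x**k * lam * math.exp(-lam * x), 0, math.inf)
    return val

# E[X] should equal 1/lambda.
print(raw_moment(1, lam), 1 / lam)

# Check the recursion E[X^k] = (k/lambda) * E[X^(k-1)] for a few k.
for k in range(1, 5):
    print(k, raw_moment(k, lam), (k / lam) * raw_moment(k - 1, lam))
```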

Variance

The variance is $$\sigma^2 = \mbox{Var}(X) = {1\over\lambda^2}$$

Proof:

$$ \begin{align*} E\left[X^2\right] &= {2\over\lambda} E[X] \quad\quad \quad\quad (\mbox{setting}\ k=2)\\ &= {2\over\lambda^2} \end{align*} $$ Hence $$ \begin{align*} \mbox{Var}(X) &= E\left[X^2\right] - E[X]^2\\ &= {2\over\lambda^2} - {1\over\lambda^2}\\ &= {1\over\lambda^2} \end{align*} $$
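
A quick Monte Carlo check of both the mean and the variance (a sketch assuming NumPy; the rate $\lambda = 2$ and sample size are arbitrary illustrative choices):

```python
import numpy as np

lam = 2.0  # illustrative rate
rng = np.random.default_rng(0)

# NumPy's exponential sampler takes the scale parameter 1/lambda.
samples = rng.exponential(scale=1/lam, size=1_000_000)

print(samples.mean(), 1 / lam)     # sample mean vs. 1/lambda
print(samples.var(), 1 / lam**2)   # sample variance vs. 1/lambda^2
```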

Examples

1. Let $X$ be exponentially distributed with intensity $\lambda$. Determine the expected value $\mu$, the standard deviation $\sigma$, and the probability $P\left(|X-\mu| \geq 2\sigma\right)$. Compare with Chebyshev's Inequality.

Solution:

$$\mu = {1\over\lambda},\ \sigma = {1\over\lambda}$$ Since $X \geq 0$ and $\mu - 2\sigma = -{1\over\lambda} < 0$, the event $|X - \mu| \geq 2\sigma$ can only occur in the upper tail, so the probability that $X$ takes a value more than two standard deviations from $\mu$ is $$ \begin{align*} P\left(|X - \mu| \geq 2\sigma\right) &= P\left(X \geq {3\over \lambda} \right)\\ &= 1-F\left({3\over\lambda}\right)\\ &= e^{-3}= 0.04978707 \end{align*} $$ Chebyshev's Inequality gives the much weaker bound $$P\left(|X - \mu| \geq 2\sigma\right) \leq {1\over4} = 0.25$$
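
The exact tail probability, the Chebyshev bound, and a simulated estimate can be compared directly; a sketch assuming NumPy, with $\lambda = 1$ (the probability $e^{-3}$ does not depend on $\lambda$):

```python
import math
import numpy as np

lam = 1.0                 # the answer e^{-3} is the same for every lambda
mu = sigma = 1 / lam

exact = math.exp(-3)      # P(|X - mu| >= 2*sigma) = P(X >= 3/lambda)
chebyshev = 1 / 4         # Chebyshev's upper bound 1/2^2

rng = np.random.default_rng(1)
x = rng.exponential(scale=1/lam, size=1_000_000)
estimate = np.mean(np.abs(x - mu) >= 2 * sigma)

print(exact, estimate, chebyshev)  # ~0.0498, ~0.0498, 0.25
```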

2. Suppose that the length of a phone call in minutes is an exponential random variable with parameter $\lambda = {1\over10}$. If someone arrives immediately ahead of you at a public telephone booth, find the probability that you will have to wait (a) more than 10 minutes; (b) between 10 and 20 minutes.

Solution:

Let $X$ be the length (in minutes) of the call made by the person in the booth, so that $$f(x) = {1\over10}e^{-{1\over10}x},\ F(x) = 1-e^{-{1\over10}x}$$ (a) $$ \begin{align*} P( X > 10) &= 1 - P(X \leq 10)\\ &= 1 - F(10)\\ &= e^{-1}= 0.3678794 \end{align*} $$ (b) $$ \begin{align*} P(10 < X < 20) &= P(X < 20) - P(X < 10)\\ &= F(20) - F(10)\\ &= (1-e^{-2}) - (1 - e^{-1})\\ &= e^{-1} - e^{-2} = 0.2325442 \end{align*} $$
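
Both answers follow directly from the CDF; a minimal sketch using only the Python standard library:

```python
import math

lam = 1 / 10  # rate parameter: mean call length of 10 minutes

def F(x):
    """Exponential CDF with rate lam."""
    return 1 - math.exp(-lam * x)

p_more_than_10 = 1 - F(10)           # e^{-1}
p_between_10_and_20 = F(20) - F(10)  # e^{-1} - e^{-2}

print(p_more_than_10)        # ~0.3678794
print(p_between_10_and_20)   # ~0.2325442
```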

Reference

  1. Ross, S. (2010). A First Course in Probability (8th Edition). Chapter 5. Pearson. ISBN: 978-0-13-603313-4.
  2. Brink, D. (2010). Essentials of Statistics: Exercises. Chapter 5. ISBN: 978-87-7681-409-0.