Moments and Moment Generating Functions

The $k$th moment of a random variable $X$ is given by $E[X^{k}]$. The $k$th central moment of $X$ is given by $E[(X-E[X])^{k}]$.

The moment generating function of X is given by:

\begin{displaymath}M(\theta) = E[e^{X\theta}] = \int_{-\infty}^{\infty} e^{x\theta} f(x) dx.
\end{displaymath} (9)

If X is non-negative, we can define its Laplace transform:

\begin{displaymath}L(s) = M(-s) = \int_{0}^{\infty} e^{-sx}f(x)dx.
\end{displaymath} (10)

Taking the power series expansion of $e^{X\theta}$ yields:

\begin{displaymath}e^{X \theta} = 1+X \theta + \frac{X^{2} \theta^{2}}{2!} + \cdots
\end{displaymath} (11)

Taking the expectation yields:

\begin{displaymath}E[e^{X \theta}] = 1+E[X] \theta + \frac{E[X^{2}] \theta^{2}}{2!} + \cdots
\end{displaymath} (12)

We can then find the $k$th moment of $X$ by taking the $k$th derivative of the moment generating function and evaluating it at $\theta=0$:

\begin{displaymath}E[X^{k}] = \frac{d^{k}M(\theta)}{d\theta^{k}}\left\vert _{\theta=0} \right.
\end{displaymath} (13)
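As a quick sanity check of (13), the sketch below (assuming the sympy library is available) differentiates a well-known MGF symbolically. The standard normal MGF $M(\theta)=e^{\theta^{2}/2}$ is used purely as an illustration; it does not appear elsewhere in these notes.

```python
import sympy as sp

theta = sp.symbols('theta')
# MGF of a standard normal random variable: M(theta) = exp(theta^2 / 2)
M = sp.exp(theta**2 / 2)

# kth moment = kth derivative of M, evaluated at theta = 0
moments = [sp.diff(M, theta, k).subs(theta, 0) for k in range(1, 5)]
print(moments)  # [0, 1, 0, 3]
```

The first four moments 0, 1, 0, 3 match the known moments of the standard normal distribution.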

For the Laplace transform, the moments can be found using:

\begin{displaymath}E[X^{k}] = (-1)^{k} \frac{d^{k}L(s)}{ds^{k}}\left\vert _{s=0} \right.
\end{displaymath} (14)
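Equation (14) can be checked the same way. A minimal sketch, again assuming sympy, using the Laplace transform of an exponential density, $L(s)=\lambda/(\lambda+s)$, whose moments are known to be $E[X^{k}]=k!/\lambda^{k}$:

```python
import sympy as sp

s, lam = sp.symbols('s lambda', positive=True)
# Laplace transform of an exponential(lambda) density: L(s) = lambda/(lambda + s)
L = lam / (lam + s)

# E[X^k] = (-1)^k d^k L / ds^k at s = 0, which should equal k!/lambda^k
for k in range(1, 4):
    moment = (-1)**k * sp.diff(L, s, k).subs(s, 0)
    assert sp.simplify(moment - sp.factorial(k) / lam**k) == 0
```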

Example:

\begin{displaymath}f(x)=\lambda e^{-\lambda x} \quad \mbox{for $x>0$}
\end{displaymath} (15)


\begin{eqnarray*}
M(\theta) & = & \int_{0}^{\infty} e^{x\theta} \lambda e^{-\lambda x} dx \qquad \mbox{(16)} \\
 & = & \lambda \int_{0}^{\infty} e^{(\theta-\lambda)x} dx \qquad \mbox{(17)} \\
 & = & \frac{\lambda}{\theta - \lambda} e^{(\theta-\lambda)x} \Big\vert _{0}^{\infty} \qquad \mbox{(18)} \\
 & = & \frac{\lambda}{\lambda - \theta} \quad \mbox{for $\theta < \lambda$} \qquad \mbox{(19)}
\end{eqnarray*}


\begin{displaymath}E[X] = \frac{dM(\theta)}{d\theta}\left\vert _{\theta=0} \right. = \frac{\lambda}{(\lambda-\theta)^{2}}\vert _{\theta=0} = \frac{1}{\lambda}
\end{displaymath} (20)


\begin{displaymath}E[X^{2}] = \frac{d^{2}M(\theta)}{d\theta^{2}}\left\vert _{\theta=0} \right. = \frac{2\lambda(\lambda-\theta)}{(\lambda-\theta)^{4}}\vert _{\theta=0}
= \frac{2 \lambda^{2}}{\lambda^{4}} = \frac{2}{\lambda^{2}}
\end{displaymath} (21)


\begin{displaymath}Var(X)= E[X^{2}] - (E[X])^{2} = \frac{2}{\lambda^{2}} - \frac{1}{\lambda^{2}} = \frac{1}{\lambda^{2}}
\end{displaymath} (22)
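The derivation in (16)-(22) can be reproduced symbolically. A minimal sketch, assuming sympy, starting from the closed form $M(\theta)=\lambda/(\lambda-\theta)$:

```python
import sympy as sp

theta, lam = sp.symbols('theta lambda', positive=True)
# MGF of an exponential(lambda) random variable, valid for theta < lambda
M = lam / (lam - theta)

EX = sp.diff(M, theta, 1).subs(theta, 0)    # first moment
EX2 = sp.diff(M, theta, 2).subs(theta, 0)   # second moment
var = sp.simplify(EX2 - EX**2)

assert sp.simplify(EX - 1 / lam) == 0       # E[X] = 1/lambda
assert sp.simplify(EX2 - 2 / lam**2) == 0   # E[X^2] = 2/lambda^2
assert sp.simplify(var - 1 / lam**2) == 0   # Var(X) = 1/lambda^2
```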

For X non-negative, integer-valued, and discrete, we can define the z-transform:

\begin{displaymath}G(z)=E[z^{X}] = \sum_{i=0}^{\infty} p(i) z^{i}
\end{displaymath} (23)

The first and second moments can be found as follows:

\begin{displaymath}E[X]=\frac{dG(z)}{dz}\vert _{z=1}
\end{displaymath} (24)


\begin{displaymath}E[X^{2}]=\frac{d^{2}G(z)}{dz^{2}}\vert _{z=1} + \frac{dG(z)}{dz}\vert _{z=1}
\end{displaymath} (25)
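Equations (24) and (25) can be illustrated with a transform whose moments are known. The sketch below, assuming sympy, uses the z-transform of a Poisson($\lambda$) random variable, $G(z)=e^{\lambda(z-1)}$, for which $E[X]=\lambda$ and $Var(X)=\lambda$; the Poisson example is an illustration only, not part of these notes.

```python
import sympy as sp

z, lam = sp.symbols('z lambda', positive=True)
# z-transform of a Poisson(lambda) random variable
G = sp.exp(lam * (z - 1))

EX = sp.diff(G, z, 1).subs(z, 1)         # (24): E[X]
EX2 = sp.diff(G, z, 2).subs(z, 1) + EX   # (25): E[X^2]
var = sp.simplify(EX2 - EX**2)

assert EX == lam                       # E[X] = lambda
assert sp.simplify(var - lam) == 0     # Var(X) = lambda
```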

A property of transforms, known as the convolution theorem, is stated as follows: Let $X_{1},X_{2}, \ldots, X_{n}$ be mutually independent random variables, and let $Y=\sum_{i=1}^{n} X_{i}$. If $M_{X_{i}}(\theta)$ exists for all $i$, then $M_{Y}(\theta)$ exists, and:

\begin{displaymath}M_{Y}(\theta)= \prod_{i=1}^{n} M_{X_{i}}(\theta).
\end{displaymath} (26)
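As an illustration of (26), assuming sympy, we can multiply the MGFs of two independent exponentials and read off $E[Y]$ from the product transform:

```python
import sympy as sp

theta = sp.symbols('theta')
l1, l2 = sp.symbols('lambda1 lambda2', positive=True)

# MGFs of two independent exponentials (each valid for theta < lambda_i)
M1 = l1 / (l1 - theta)
M2 = l2 / (l2 - theta)
MY = M1 * M2  # convolution theorem: MGF of Y = X1 + X2

# E[Y] = E[X1] + E[X2] = 1/lambda1 + 1/lambda2
EY = sp.diff(MY, theta).subs(theta, 0)
assert sp.simplify(EY - (1 / l1 + 1 / l2)) == 0
```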

Example: Let $X_{1}$ and $X_{2}$ be independent exponentially distributed random variables with parameters $\lambda_{1}$ and $\lambda_{2}$, respectively. Let $Y = X_{1}+X_{2}$. Find the distribution of $Y$.

The Laplace transforms of $X_{1}$ and $X_{2}$ are:

\begin{displaymath}L_{X_{1}}(s) = \frac{\lambda_{1}}{\lambda_{1}+s}
\end{displaymath} (27)


\begin{displaymath}L_{X_{2}}(s) = \frac{\lambda_{2}}{\lambda_{2}+s}
\end{displaymath} (28)

By the convolution theorem:

\begin{displaymath}L_{Y}(s) = \frac{\lambda_{1} \lambda_{2}}{(\lambda_{1}+s)(\lambda_{2}+s)}
\end{displaymath} (29)

Expanding this into partial fractions:

\begin{displaymath}L_{Y}(s) = \frac{a_{1}\lambda_{1}}{(\lambda_{1}+s)}
+ \frac{a_{2}\lambda_{2}}{(\lambda_{2}+s)}
\end{displaymath} (30)

where:

\begin{displaymath}a_{1} = \frac{\lambda_{2}}{\lambda_{2}-\lambda_{1}}
\end{displaymath} (31)


\begin{displaymath}a_{2} = \frac{\lambda_{1}}{\lambda_{1}-\lambda_{2}}
\end{displaymath} (32)

Taking the inverse Laplace transform yields:

\begin{displaymath}f(y) = a_{1}\lambda_{1}e^{-\lambda_{1}y} + a_{2}\lambda_{2}e^{-\lambda_{2}y}.
\end{displaymath} (33)
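The resulting density can be sanity-checked symbolically. A sketch assuming sympy, verifying that $f(y)$ integrates to one and that $E[Y]=1/\lambda_{1}+1/\lambda_{2}$, as the convolution theorem predicts:

```python
import sympy as sp

y = sp.symbols('y', positive=True)
l1, l2 = sp.symbols('lambda1 lambda2', positive=True)

a1 = l2 / (l2 - l1)
a2 = l1 / (l1 - l2)
# Density of Y = X1 + X2 from the partial-fraction expansion (lambda1 != lambda2)
f = a1 * l1 * sp.exp(-l1 * y) + a2 * l2 * sp.exp(-l2 * y)

total = sp.simplify(sp.integrate(f, (y, 0, sp.oo)))
mean = sp.simplify(sp.integrate(y * f, (y, 0, sp.oo)))

assert total == 1                                   # f is a valid density
assert sp.simplify(mean - (1 / l1 + 1 / l2)) == 0   # E[Y] = 1/lambda1 + 1/lambda2
```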



1999-08-31