Next: Normal Distribution Up: Class Notes Previous: Bernoulli and Related Distributions

Continuous Random Variables

Continuous random variables take values that can be any real number within some interval. A common example is time, for instance the time to failure of a system or the time to complete a task. Other examples include physical measurements such as length or diameter. As will be seen, continuous random variables can also be used to approximate discrete random variables.

To develop probability models for continuous r.v.'s, it is necessary to make one important restriction: we only consider events associated with these r.v.'s that are defined in terms of intervals of real numbers, including intersections and unions of intervals. Probability models are constructed by representing the probability that a r.v. is contained within an interval as the area under a curve over that interval. That curve is called the density function of the r.v. To satisfy the laws of probability, density functions must satisfy the following two conditions:


  1. \begin{displaymath}
f(t) \ge 0,\ \forall\ t,
\end{displaymath}


  2. \begin{displaymath}
\int_{-\infty}^\infty f(t)dt = 1.
\end{displaymath}

The second condition corresponds to the requirement that the probability of the entire sample space must be 1. Any function that satisfies these two conditions is the density function of some r.v.
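As an illustration (the following density is constructed here for the example and is not used elsewhere in these notes), consider

\begin{displaymath}
f(t) = \left\{ \begin{array}{ll}
2t, & 0 \le t \le 1, \\
0, & \mbox{otherwise.}
\end{array} \right.
\end{displaymath}

This function is nonnegative everywhere, and

\begin{displaymath}
\int_{-\infty}^\infty f(t)dt = \int_0^1 2t\,dt = t^2\Big|_0^1 = 1,
\end{displaymath}

so it is a valid density.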

The probability that the r.v. is contained within an interval is then

\begin{displaymath}
P(a < X \le b) = \int_a^b f(t)dt.
\end{displaymath}
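For example, if $X$ has the illustrative density $f(t) = 2t$ for $0 \le t \le 1$ (and 0 otherwise), then for $0 \le a < b \le 1$,

\begin{displaymath}
P(a < X \le b) = \int_a^b 2t\,dt = b^2 - a^2,
\end{displaymath}

so, for instance, $P(0.5 < X \le 1) = 1 - 0.25 = 0.75$.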

Note that in the case of continuous r.v.'s,

\begin{displaymath}
P(a < X \le b) = P(a < X < b) = P(a\le X < b) = P(a \le X \le b),
\end{displaymath}

since the area under a curve at a point is 0. The distribution function of a continuous r.v. is given by

\begin{displaymath}
F(x) = P(X\le x) = \int_{-\infty}^x f(t)dt.
\end{displaymath}

Note that the Fundamental Theorem of Calculus implies that

\begin{displaymath}
f(x) = \frac{d}{dx}F(x).
\end{displaymath}
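To illustrate both of these relationships with a concrete example (constructed here for illustration), take the density $f(t) = 2t$ for $0 \le t \le 1$ and 0 otherwise. Then

\begin{displaymath}
F(x) = \int_{-\infty}^x f(t)dt =
\left\{ \begin{array}{ll}
0, & x < 0, \\
x^2, & 0 \le x \le 1, \\
1, & x > 1,
\end{array} \right.
\end{displaymath}

and differentiating on $(0,1)$ recovers the density: $\frac{d}{dx}x^2 = 2x = f(x)$.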

Also note that the value of a density function is not a probability; nor is a density necessarily bounded by 1. It can be thought of as the concentration of likelihood at a point.

The expected value of a continuous r.v. is defined analogously to the expected value of a discrete r.v. with the p.m.f. replaced by the density function and the sum replaced by an integral:

\begin{displaymath}
E(X) = \int_{-\infty}^\infty xf(x)dx.
\end{displaymath}
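As a simple illustration (this density is an example, not part of the original notes), if $X$ has density $f(x) = 2x$ for $0 \le x \le 1$ and 0 otherwise, then

\begin{displaymath}
E(X) = \int_0^1 x\cdot 2x\,dx = \frac{2x^3}{3}\Big|_0^1 = \frac{2}{3}.
\end{displaymath}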

Also, the variance of a continuous r.v. is defined by

\begin{displaymath}
Var(X) = \int_{-\infty}^\infty (x - \mu)^2 f(x)dx,
\end{displaymath}

where $\mu = E(X)$. Note that expanding the square and using the linearity of the integral gives

\begin{eqnarray*}
Var(X) &=& \int_{-\infty}^\infty (x^2 - 2\mu x + \mu^2) f(x)dx \\
&=& \int_{-\infty}^\infty x^2 f(x)dx - 2\mu\int_{-\infty}^\infty x f(x)dx
    + \mu^2\int_{-\infty}^\infty f(x)dx \\
&=& E(X^2) - 2\mu^2 + \mu^2 \\
&=& E(X^2) - \mu^2.
\end{eqnarray*}
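As a simple illustration of this shortcut formula (using an example density constructed here), suppose $X$ has density $f(x) = 2x$ for $0 \le x \le 1$ and 0 otherwise. Then

\begin{displaymath}
E(X^2) = \int_0^1 x^2\cdot 2x\,dx = \frac{1}{2}, \qquad
\mu = E(X) = \int_0^1 x\cdot 2x\,dx = \frac{2}{3},
\end{displaymath}

so $Var(X) = \frac{1}{2} - \left(\frac{2}{3}\right)^2 = \frac{1}{2} - \frac{4}{9} = \frac{1}{18}$.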

To construct a probability model for a continuous r.v., it is only necessary to find a density function that appropriately models the concentration of likelihood.



Larry Ammann
2013-12-17