As the time index approaches infinity, a Markov chain
may settle down and exhibit steady-state behavior.
Looking at the state probability as n approaches infinity, we see that
steady-state behavior occurs if the following limit exists:

    \pi_j = \lim_{n \to \infty} P[X_n = j]        (1)

When this limit exists, \pi_j is called the limiting (steady-state)
probability of state j.
When the limiting probabilities exist, then \pi_j can be found using the
balance equations

    \pi_j = \sum_i \pi_i p_{ij}

together with the normalization condition \sum_j \pi_j = 1.
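As a minimal sketch of how these equations can be solved numerically, the following Python snippet finds the limiting probabilities of a hypothetical two-state chain (the transition matrix P below is an illustrative example, not from the text). It replaces one redundant balance equation with the normalization condition so the linear system has a unique solution.

```python
import numpy as np

def steady_state(P):
    """Solve pi = pi P together with sum(pi) = 1.

    The balance equations alone are linearly dependent, so one row of
    the system is overwritten with the normalization condition.
    """
    n = P.shape[0]
    # Balance equations in the form (P^T - I) pi = 0.
    A = P.T - np.eye(n)
    # Replace the last equation with: pi_0 + pi_1 + ... + pi_{n-1} = 1.
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Hypothetical two-state chain: state 0 self-loops with probability 0.9,
# state 1 with probability 0.8.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = steady_state(P)
print(pi)      # limiting probabilities
print(pi @ P)  # equals pi again, confirming the balance equations hold
```

Multiplying the resulting vector by P returns the same vector, which is exactly the statement \pi_j = \sum_i \pi_i p_{ij}.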