Convergence

Definition 1.

A sequence of random variables $(X_n)_{n \ge 1}$ converges almost surely to a random variable $X$ if
$$\mathbb{P}\left( \lim_{n \to \infty} X_n = X \right) = 1.$$

Denoted by $X_n \xrightarrow{\text{a.s.}} X$.
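
For example, let $U \sim \mathrm{Unif}[0,1]$ and set $X_n = \mathbb{1}\{U \le 1/n\}$. For every outcome with $U > 0$ we have $X_n = 0$ once $n > 1/U$, and $\mathbb{P}(U > 0) = 1$, so $X_n \xrightarrow{\text{a.s.}} 0$.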

Definition 2.

A sequence of random variables $(X_n)_{n \ge 1}$ converges in probability to a random variable $X$ if for all $\varepsilon > 0$,
$$\lim_{n \to \infty} \mathbb{P}\left( |X_n - X| > \varepsilon \right) = 0.$$

Denoted by $X_n \xrightarrow{\mathbb{P}} X$.
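
For example, if $X_n \sim N(0, 1/n)$, then Chebyshev's inequality gives $\mathbb{P}(|X_n| > \varepsilon) \le \frac{1}{n \varepsilon^2} \to 0$ for every $\varepsilon > 0$, so $X_n \xrightarrow{\mathbb{P}} 0$.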

Definition 3.

A sequence of random variables $(X_n)_{n \ge 1}$ with distribution functions $(F_n)_{n \ge 1}$ converges in distribution to a random variable $X$ with distribution function $F$ if for all $x \in \mathbb{R}$ at which $F$ is continuous,
$$\lim_{n \to \infty} F_n(x) = F(x).$$

Denoted by $X_n \xrightarrow{d} X$.
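
For example, if $X_n$ is uniform on $\{1/n, 2/n, \dots, 1\}$, then $F_n(x) = \lfloor nx \rfloor / n \to x$ for every $x \in [0,1]$, so $X_n \xrightarrow{d} U$ with $U \sim \mathrm{Unif}[0,1]$. The restriction to continuity points of $F$ matters: for the deterministic sequence $X_n = 1/n$ we have $F_n(0) = 0$ for all $n$, while the limit $X = 0$ has $F(0) = 1$; since $0$ is a discontinuity point of $F$, this does not prevent $X_n \xrightarrow{d} 0$.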

The limit of a sequence of random variables is almost surely (that is, bar a set of measure zero) unique for almost sure convergence and for convergence in probability. In other words, if we have $X_n \xrightarrow{\text{a.s.}} X$ or $X_n \xrightarrow{\mathbb{P}} X$, then $X$ is uniquely determined for all but a set of measure zero. This is not the case for convergence in distribution; take an iid sequence $(X_n)_{n \ge 1}$ for example: $X_n \xrightarrow{d} X_1$ and $X_n \xrightarrow{d} X_2$, yet $X_1$ and $X_2$ need not agree outside a null set.

Theorem 4.

Almost sure convergence implies convergence in probability.
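
The converse fails. For example, let $(X_n)_{n \ge 1}$ be independent with $\mathbb{P}(X_n = 1) = 1/n$ and $\mathbb{P}(X_n = 0) = 1 - 1/n$. Then $\mathbb{P}(|X_n| > \varepsilon) = 1/n \to 0$ for every $\varepsilon \in (0,1)$, so $X_n \xrightarrow{\mathbb{P}} 0$; but since $\sum_n 1/n = \infty$, the second Borel-Cantelli lemma gives $X_n = 1$ infinitely often with probability one, so $X_n$ does not converge almost surely to $0$.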

Theorem 5.

Convergence in probability implies convergence in distribution.
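
The converse fails in general. For example, let $X \sim N(0,1)$ and set $X_n = -X$ for all $n$. Then $X_n$ has the same distribution as $X$, so $X_n \xrightarrow{d} X$ trivially, but $\mathbb{P}(|X_n - X| > \varepsilon) = \mathbb{P}(2|X| > \varepsilon)$ does not tend to $0$, so $X_n$ does not converge to $X$ in probability.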

Note that $F_n \to F$ pointwise, where $F_n$ and $F$ are distribution functions with densities $f_n$ and $f$, does not imply $f_n \to f$ pointwise. The converse is true, however.
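
For example, the densities $f_n(x) = 1 + \cos(2\pi n x)$ on $[0,1]$ have distribution functions $F_n(x) = x + \frac{\sin(2\pi n x)}{2\pi n} \to x$, the $\mathrm{Unif}[0,1]$ distribution function, yet $f_n$ does not converge pointwise to the limiting density $f \equiv 1$ (e.g., $f_n(1/2) = 1 + \cos(\pi n)$ alternates between $0$ and $2$).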

Theorem 6.

Let $F_n$ and $F$ be distribution functions with densities $f_n$ and $f$. If $f_n \to f$ pointwise, then $F_n \to F$ pointwise.
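
For example, if $X_n \sim \mathrm{Unif}[0, 1 + 1/n]$ with density $f_n = \frac{n}{n+1}\mathbb{1}_{[0,\,1+1/n]}$, then $f_n \to \mathbb{1}_{[0,1]}$ pointwise, so the theorem gives $F_n \to F$ pointwise, i.e. $X_n \xrightarrow{d} \mathrm{Unif}[0,1]$.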

Theorem 7 (Lemma).

If , , and

then .

Theorem 8.

Let $(X_n)_{n \ge 1}$ be a sequence of random variables with characteristic functions $(\varphi_n)_{n \ge 1}$, and let $X$ be a random variable with characteristic function $\varphi$. Then $X_n \xrightarrow{d} X$ iff $\varphi_n \to \varphi$ pointwise.
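
For example, if $X_n \sim \mathrm{Bin}(n, \lambda/n)$, then $\varphi_n(t) = \left(1 + \frac{\lambda}{n}\left(e^{it} - 1\right)\right)^n \to \exp\!\left(\lambda\left(e^{it} - 1\right)\right)$ for every $t$, which is the characteristic function of the $\mathrm{Poisson}(\lambda)$ distribution, so $X_n \xrightarrow{d} \mathrm{Poisson}(\lambda)$.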