Convergence

Definition 239.1.

A sequence of random variables $(X_n)_{n \ge 1}$ converges almost surely to a random variable $X$ if
$$\mathbb{P}\Big(\lim_{n \to \infty} X_n = X\Big) = 1.$$
Denoted by $X_n \xrightarrow{\text{a.s.}} X$.
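
A standard example: let $U \sim \mathrm{Unif}(0, 1)$ and $X_n = U^n$. Then $X_n \xrightarrow{\text{a.s.}} 0$, since $U^n \to 0$ for every outcome with $U < 1$, an event of probability one.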

Definition 239.2.

A sequence of random variables $(X_n)_{n \ge 1}$ converges in probability to a random variable $X$ if for all $\varepsilon > 0$,
$$\lim_{n \to \infty} \mathbb{P}\big(|X_n - X| > \varepsilon\big) = 0.$$
Denoted by $X_n \xrightarrow{\mathbb{P}} X$.
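
A standard sufficient condition: if $\mathbb{E}\big[(X_n - X)^2\big] \to 0$, then by Markov's inequality applied to $(X_n - X)^2$, for every $\varepsilon > 0$,
$$\mathbb{P}\big(|X_n - X| > \varepsilon\big) \le \frac{\mathbb{E}\big[(X_n - X)^2\big]}{\varepsilon^2} \longrightarrow 0,$$
so convergence in mean square implies convergence in probability.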

Definition 239.3.

A sequence of random variables $(X_n)_{n \ge 1}$ with distribution functions $(F_n)_{n \ge 1}$ converges in distribution to a random variable $X$ with distribution function $F$ if for all $x \in \mathbb{R}$ at which $F$ is continuous,
$$\lim_{n \to \infty} F_n(x) = F(x).$$
Denoted by $X_n \xrightarrow{d} X$.
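
The restriction to continuity points of $F$ matters. A standard example: let $X_n = 1/n$ deterministically, so that $X_n \to 0$ in every sense. Here $F_n(x) = \mathbf{1}\{x \ge 1/n\}$ converges to $\mathbf{1}\{x > 0\}$, which disagrees with $F(x) = \mathbf{1}\{x \ge 0\}$ at $x = 0$; but $x = 0$ is precisely the point where $F$ is discontinuous, so $X_n \xrightarrow{d} 0$ holds as the definition intends.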

The limit of a sequence of random variables is almost surely (that is, bar a set of measure zero) unique for almost sure convergence and for convergence in probability. In other words, if $X_n \xrightarrow{\text{a.s.}} X$ and $X_n \xrightarrow{\text{a.s.}} Y$, or $X_n \xrightarrow{\mathbb{P}} X$ and $X_n \xrightarrow{\mathbb{P}} Y$, then $X = Y$ for all but a set of measure zero. This is not the case for convergence in distribution; take an iid sequence $(X_n)_{n \ge 1}$ for example: since all the $X_n$ share one distribution function, $X_n \xrightarrow{d} X_1$ and $X_n \xrightarrow{d} X_2$, yet in general $X_1 \neq X_2$ as random variables.

Theorem 239.4.

Almost sure convergence implies convergence in probability.
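
The converse fails. A standard counterexample: let $(X_n)_{n \ge 1}$ be independent with $\mathbb{P}(X_n = 1) = 1/n$ and $\mathbb{P}(X_n = 0) = 1 - 1/n$. Then $\mathbb{P}(|X_n| > \varepsilon) \le 1/n \to 0$, so $X_n \xrightarrow{\mathbb{P}} 0$; but $\sum_n 1/n = \infty$, so by the second Borel–Cantelli lemma $X_n = 1$ infinitely often almost surely, and hence $X_n \not\xrightarrow{\text{a.s.}} 0$.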

Theorem 239.5.

Convergence in probability implies convergence in distribution.
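
The converse fails here too. A standard example: let $X \sim N(0, 1)$ and set $X_n = -X$ for all $n$. By symmetry $X_n$ has the same distribution function as $X$, so $X_n \xrightarrow{d} X$ trivially, yet $\mathbb{P}(|X_n - X| > \varepsilon) = \mathbb{P}(2|X| > \varepsilon)$ does not tend to $0$. One standard partial converse: if the limit is an almost surely constant random variable, convergence in distribution does imply convergence in probability.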

Note that $F_n \to F$ pointwise, where $F_n$ and $F$ are distribution functions with densities $f_n$ and $f$, does not imply $f_n \to f$ pointwise, as the example below shows. The converse, however, is true (Theorem 239.6).
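
A standard counterexample: take the densities $f_n(x) = 1 - \cos(2\pi n x)$ on $[0, 1]$. Then
$$F_n(x) = x - \frac{\sin(2\pi n x)}{2\pi n} \longrightarrow x \quad \text{for all } x \in [0, 1],$$
so $F_n$ converges (even uniformly) to the $\mathrm{Unif}(0, 1)$ distribution function, while $f_n$ fails to converge pointwise; for instance $f_n(1/2) = 1 - (-1)^n$ alternates between $0$ and $2$.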

Theorem 239.6.

Let $F_n$ and $F$ be distribution functions with densities $f_n$ and $f$. If $f_n \to f$ pointwise, then $F_n \to F$ pointwise.
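
A standard route to this is Scheffé's lemma: pointwise convergence of the densities forces $\int |f_n - f| \, dx \to 0$, and consequently
$$|F_n(x) - F(x)| \le \int |f_n(t) - f(t)| \, dt \longrightarrow 0$$
uniformly in $x$, which is stronger than pointwise convergence.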

Theorem 239.7 (Lemma).

If $X_n \xrightarrow{d} X$, $(Y_n)_{n \ge 1}$ is a sequence of random variables, and
$$|X_n - Y_n| \xrightarrow{\mathbb{P}} 0,$$
then $Y_n \xrightarrow{d} X$.
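
A typical use: if $X_n \xrightarrow{d} X$ and $\varepsilon_n \xrightarrow{\mathbb{P}} 0$, then taking $Y_n = X_n + \varepsilon_n$ gives $|X_n - Y_n| = |\varepsilon_n| \xrightarrow{\mathbb{P}} 0$, so $X_n + \varepsilon_n \xrightarrow{d} X$; vanishing perturbations do not affect distributional limits.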

Theorem 239.8.

Let $(X_n)_{n \ge 1}$ be a sequence of random variables with characteristic functions $(\varphi_n)_{n \ge 1}$, and let $X$ be a random variable with characteristic function $\varphi$. Then $X_n \xrightarrow{d} X$ iff $\varphi_n \to \varphi$ pointwise.
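
A standard application: let $X_n \sim \mathrm{Bin}(n, \lambda/n)$ for fixed $\lambda > 0$. Then
$$\varphi_n(t) = \Big(1 + \frac{\lambda}{n}\big(e^{it} - 1\big)\Big)^n \longrightarrow e^{\lambda(e^{it} - 1)},$$
which is the characteristic function of a $\mathrm{Poisson}(\lambda)$ random variable, so $X_n \xrightarrow{d} \mathrm{Poisson}(\lambda)$.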