Expectation of continuous random variables

Definition 1.

Let $X$ be a continuous random variable with density $f$. $X$ is said to have finite expectation if

$$\int_{-\infty}^{\infty} |x|\,f(x)\,dx < \infty,$$

in which case $E[X]$ is defined to be

$$E[X] = \int_{-\infty}^{\infty} x\,f(x)\,dx.$$

Theorem 2 (Proposition).

Let $X$ be a positive continuous random variable with distribution $F$ and density $f$. Then $X$ has finite expectation iff $\int_0^\infty (1 - F(x))\,dx < \infty$, in which case,

$$E[X] = \int_0^\infty \bigl(1 - F(x)\bigr)\,dx.$$

More generally, if $X$ is not positive,

$$E[X] = \int_0^\infty \bigl(1 - F(x)\bigr)\,dx - \int_{-\infty}^{0} F(x)\,dx.$$

(Yes, the sign ahead of the second integral is negative, not positive. Check your integration limits.)
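As a sanity check (this derivation is not spelled out in the notes), the positive case follows by writing $x = \int_0^x dt$ and swapping the order of integration:

$$E[X] = \int_0^\infty x\,f(x)\,dx = \int_0^\infty \int_0^\infty \mathbf{1}\{t < x\}\,dt\,f(x)\,dx = \int_0^\infty \int_t^\infty f(x)\,dx\,dt = \int_0^\infty \bigl(1 - F(t)\bigr)\,dt.$$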

Theorem 3 (LOTUS).

Let $X$ be a continuous random variable with density $f$. Let $g : \mathbb{R} \to \mathbb{R}$ be such that $g(X)$ is a continuous random variable. Then,

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx.$$

Vasanth's proof is incorrect!

Vasanth stated the theorem for a general function $g$, and supplied the following proof:

The boxed steps are incorrect; the limits of the first integral in both cases must be $0$ and $\infty$. The way to remedy this is to prove the theorem for a positive function $g$, and then to use the result to prove the theorem for a general $g$.

Note that Vasanth erred here on another point; if $g$ is from $\mathbb{R}$ to $[0, \infty)$, the integration bounds still remain $-\infty$ and $\infty$.
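For the record, here is a sketch of the corrected argument for a positive $g$ (my reconstruction, since the proof is not reproduced above): apply the tail formula of Theorem 2 to the positive random variable $g(X)$ and swap the order of integration,

$$E[g(X)] = \int_0^\infty P\bigl(g(X) > t\bigr)\,dt = \int_0^\infty \int_{\{x \,:\, g(x) > t\}} f(x)\,dx\,dt = \int_{-\infty}^{\infty} \int_0^{g(x)} dt\; f(x)\,dx = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx.$$

Note that the integral over $x$ runs over all of $\mathbb{R}$ even though $g \ge 0$, consistent with the remark above.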

Example 4.

Let $Z \sim N(0, 1)$. Clearly,

$$\int_{-\infty}^{\infty} |x|\,\frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = 2\int_0^{\infty} \frac{x}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = \sqrt{\frac{2}{\pi}} < \infty.$$

Thus,

$$E[Z] = \int_{-\infty}^{\infty} \frac{x}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = 0,$$

since the integrand is an odd function.

Variance

We can now define variance to be $\operatorname{Var}(X) = E\bigl[(X - E[X])^2\bigr]$, which simplifies to $E[X^2] - (E[X])^2$ (assume linearity of expectation for the moment).
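To see the simplification (a step the notes skip), expand the square and apply the assumed linearity:

$$E\bigl[(X - E[X])^2\bigr] = E\bigl[X^2 - 2\,E[X]\,X + (E[X])^2\bigr] = E[X^2] - 2\,(E[X])^2 + (E[X])^2 = E[X^2] - (E[X])^2.$$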

Example 5.

Let $Z \sim N(0, 1)$. $E[Z] = 0$. Note that

$$\operatorname{Var}(Z) = E[Z^2] - (E[Z])^2 = E[Z^2].$$

Using the LOTUS property with $g(x) = x^2$ yields

$$\operatorname{Var}(Z) = \int_{-\infty}^{\infty} \frac{x^2}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = 1.$$
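For completeness (the notes do not show the evaluation), integrate by parts with $u = x$ and $dv = x\,e^{-x^2/2}\,dx$:

$$\int_{-\infty}^{\infty} \frac{x^2}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}}\Bigl[-x\,e^{-x^2/2}\Bigr]_{-\infty}^{\infty} + \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = 0 + 1 = 1,$$

since the boundary term vanishes and the remaining integrand is the standard normal density.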


Moments of continuous random variables

Defined analogously to moments of discrete random variables: the $n$th moment of a continuous random variable $X$ with density $f$ is $E[X^n]$, which by LOTUS equals $\int_{-\infty}^{\infty} x^n f(x)\,dx$ when it exists.

Example 6 (Examples).

If ,

If , , so

If $Z \sim N(0, 1)$, $Z^2 \sim \operatorname{Gamma}(1/2, 1/2)$. Thus, all even moments of $Z$ exist, and we can compute them via the gamma density. In a bit, we will show that if a continuous random variable has a moment of order $n$, then it has a moment of order $k$ for all $k \le n$, as we did for discrete random variables. It’ll then follow that all odd moments of $Z$ also exist, and subsequently must be zero due to an odd integrand:

$$E[Z^{2n+1}] = \int_{-\infty}^{\infty} \frac{x^{2n+1}}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = 0.$$
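Carrying out the gamma-density computation for the even moments (a sketch, assuming $Z^2 \sim \operatorname{Gamma}(1/2, 1/2)$ with rate parametrization, as above):

$$E[Z^{2n}] = E\bigl[(Z^2)^n\bigr] = \int_0^\infty x^n\,\frac{(1/2)^{1/2}}{\Gamma(1/2)}\,x^{-1/2}\,e^{-x/2}\,dx = \frac{2^n\,\Gamma(n + 1/2)}{\Gamma(1/2)} = \frac{(2n)!}{2^n\,n!} = (2n - 1)!!.$$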


Joint distributions

We say the random vector $(X, Y)$ has density $f$ if

$$F(x, y) = P(X \le x,\, Y \le y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(u, v)\,dv\,du,$$

and

$$f(u, v) \ge 0, \qquad \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(u, v)\,du\,dv = 1.$$

Also, if $f$ is continuous at $(x, y)$, then $F$ is differentiable at $(x, y)$, and

$$\frac{\partial^2 F}{\partial x\,\partial y}(x, y) = f(x, y).$$

Also note that

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy,$$

and

$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx.$$

Example 7.

Here’s how you compute normalization factors:
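For instance (an illustrative made-up density, not the one from lecture): suppose $f(x, y) = c\,xy$ on $[0, 1]^2$ and $0$ elsewhere. Since a density must integrate to $1$,

$$1 = \int_0^1 \int_0^1 c\,xy\,dx\,dy = c \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{c}{4}, \qquad \text{so } c = 4.$$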

Theorem 8.

Let $X, Y$ be continuous random variables with joint density $f$. Let $g : \mathbb{R}^2 \to \mathbb{R}$ be such that $g(X, Y)$ is a continuous random variable. Then,

$$E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\,f(x, y)\,dx\,dy.$$

We will now prove linearity of expectation for a specific case. The result will be used to prove linearity of expectation generally.

Theorem 9.

Let $X$ be a continuous random variable with finite expectation. Define $X^+ = \max(X, 0)$ and $X^- = \max(-X, 0)$ (Clearly, $X^+, X^- \ge 0$ and $X = X^+ - X^-$). Then, $E[X] = E[X^+] - E[X^-]$.

So, if $g$ is arbitrary, we can write $g = g^+ - g^-$, where $g^+$ and $g^-$ are both nonnegative functions, and apply the previous theorem to obtain the general LOTUS property:

$$E[g(X)] = E[g^+(X)] - E[g^-(X)] = \int_{-\infty}^{\infty} g^+(x)\,f(x)\,dx - \int_{-\infty}^{\infty} g^-(x)\,f(x)\,dx = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx.$$

This allows us to prove the linearity of expectation.

Theorem 10 (Linearity of expectation).

Let $X$ and $Y$ be continuous random variables with finite expectation. Then, $E[X + Y] = E[X] + E[Y]$.
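A sketch of the argument (assuming $(X, Y)$ has a joint density $f$, as in Theorem 8): take $g(x, y) = x + y$ and split the integral,

$$E[X + Y] = \iint (x + y)\,f(x, y)\,dx\,dy = \int x \left( \int f(x, y)\,dy \right) dx + \int y \left( \int f(x, y)\,dx \right) dy = \int x\,f_X(x)\,dx + \int y\,f_Y(y)\,dy = E[X] + E[Y],$$

where the inner integrals are the marginal densities from the previous section.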


Independent continuous random variables

Definition 11.

Random variables $X$ and $Y$ are independent if the events $\{X \le x\}$ and $\{Y \le y\}$ are independent for all $x$ and $y$, that is, $F(x, y) = F_X(x)\,F_Y(y)$.

It follows from the definition that $X$ and $Y$ are independent iff $f(x, y) = f_X(x)\,f_Y(y)$.

If $X$ and $Y$ are independent, it follows that

$$P(a \le X \le b,\; c \le Y \le d) = P(a \le X \le b)\,P(c \le Y \le d).$$

This implies that for any two “reasonable” (I suppose measurable) sets $A$ and $B$,

$$P(X \in A,\, Y \in B) = P(X \in A)\,P(Y \in B).$$

Example 12.

Let the joint distribution of $(X, Y)$ be given by

We can compute $f_X$ and $f_Y$:

Notice that $f(x, y) = f_X(x)\,f_Y(y)$. Thus, $X$ and $Y$ are independent.
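To make the factorization check concrete (an illustrative made-up density, not necessarily the one above): if $f(x, y) = e^{-x-y}$ for $x, y > 0$, then $f_X(x) = e^{-x}$, $f_Y(y) = e^{-y}$, and indeed $f(x, y) = f_X(x)\,f_Y(y)$ everywhere, so $X$ and $Y$ are independent.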

Example 13.

Let the joint distribution of $(X, Y)$ be given by

Verify that this is indeed a density. Again, we can compute the marginals:

Notice that $f(x, y) \neq f_X(x)\,f_Y(y)$. $X$ and $Y$ are not independent.

Example 14 ($2$-dimensional standard Gaussian).

The two-dimensional standard Gaussian (which just means normal, btw), which is denoted by $N(0, I_2)$, is defined like so:

$$f(x, y) = \frac{1}{2\pi}\,e^{-(x^2 + y^2)/2}.$$

Its marginal densities are

$$f_X(x) = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2} \qquad \text{and} \qquad f_Y(y) = \frac{1}{\sqrt{2\pi}}\,e^{-y^2/2}.$$

Note that $f(x, y) = f_X(x)\,f_Y(y)$, so if $(X, Y) \sim N(0, I_2)$, $X$ and $Y$ are independent.
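Spelling out the marginal computation (a short derivation; the exponent factorizes and does all the work):

$$f_X(x) = \int_{-\infty}^{\infty} \frac{1}{2\pi}\,e^{-(x^2 + y^2)/2}\,dy = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-y^2/2}\,dy = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2},$$

since the leftover integral is of the standard normal density and equals $1$. The computation for $f_Y$ is symmetric.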

Example 15.

Let , , and are independent. Find .

If $X$ and $Y$ are independent and $g$ and $h$ are functions from $\mathbb{R}$ to $\mathbb{R}$, then $g(X)$ and $h(Y)$ are independent.
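A quick sketch of why this holds (not proved in the notes): for reasonable sets $A$ and $B$,

$$P\bigl(g(X) \in A,\; h(Y) \in B\bigr) = P\bigl(X \in g^{-1}(A),\; Y \in h^{-1}(B)\bigr) = P\bigl(X \in g^{-1}(A)\bigr)\,P\bigl(Y \in h^{-1}(B)\bigr) = P\bigl(g(X) \in A\bigr)\,P\bigl(h(Y) \in B\bigr),$$

using the independence of $X$ and $Y$ in the middle step.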

Expectation of independent random variables

Theorem 16.

$X$ and $Y$ are independent iff $E[g(X)\,h(Y)] = E[g(X)]\,E[h(Y)]$ for all functions $g$ and $h$ for which these expectations exist.
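One direction is quick (a sketch, assuming the product form of the statement above): if $X$ and $Y$ are independent, the joint density factors, so by Theorem 8,

$$E\bigl[g(X)\,h(Y)\bigr] = \iint g(x)\,h(y)\,f_X(x)\,f_Y(y)\,dx\,dy = \left( \int g(x)\,f_X(x)\,dx \right) \left( \int h(y)\,f_Y(y)\,dy \right) = E[g(X)]\,E[h(Y)].$$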