Conditional expectation
The conditional expectation (or conditional expected value, or conditional mean) is the expected value of a random variable, computed with respect to a conditional probability distribution.
Table of contents

- A pragmatic approach
- Definition
- Conditional expectation of a discrete random variable
- Conditional expectation of a continuous random variable
- Conditional expectation in general
- Properties of conditional expectation
- Law of iterated expectations
- Solved exercises
  - Exercise 1
  - Exercise 2
  - Exercise 3
A pragmatic approach

As in the case of the expected value, a completely rigorous definition of the conditional expectation requires a complicated mathematical apparatus.

To make things simpler, we do not give a completely rigorous definition in this lecture. Instead, we give an informal definition and we show how the conditional expectation can be computed.

In particular, we discuss how to calculate the conditional expected value of a random variable X when we observe the realization of another random variable Y, that is, when we receive the information that Y = y.
Definition

The following informal definition is very similar to our previous definition of the expected value.

Definition Let X and Y be two random variables. The conditional expectation of X given Y = y is the weighted average of the values that X can take on, where each possible value is weighted by its respective conditional probability (conditional on the information that Y = y).

The expectation of a random variable X conditional on Y = y is denoted by

E[X | Y = y]
Conditional expectation of a discrete random variable

We start with the case in which X and Y are two discrete random variables and, considered together, they form a discrete random vector.

The formula for the conditional mean of X given Y = y is a straightforward implementation of the above informal definition: the weights of the average are given by the conditional probability mass function of X.

Definition Let X and Y be two discrete random variables. Let R_X be the support of X and let p_{X|Y=y}(x) be the conditional probability mass function of X given Y = y. The conditional expectation of X given Y = y is

E[X | Y = y] = Σ_{x ∈ R_X} x p_{X|Y=y}(x)

provided that

Σ_{x ∈ R_X} |x| p_{X|Y=y}(x) < ∞

If you do not understand the symbol Σ or the finiteness condition above (absolute summability), go back to the lecture on the Expected value, where they are explained.
Example Let the support of the discrete random vector [X Y] be

R_XY = {(1, 1), (2, 1), (3, 2)}

and its joint probability mass function be

p_XY(1, 1) = 1/4, p_XY(2, 1) = 1/4, p_XY(3, 2) = 1/2

Let us compute the conditional probability mass function of X given Y = 1. The marginal probability mass function of Y evaluated at y = 1 is

p_Y(1) = p_XY(1, 1) + p_XY(2, 1) = 1/4 + 1/4 = 1/2

The support of X is

R_X = {1, 2, 3}

Thus, the conditional probability mass function of X given Y = 1 is

p_{X|Y=1}(1) = (1/4)/(1/2) = 1/2
p_{X|Y=1}(2) = (1/4)/(1/2) = 1/2
p_{X|Y=1}(3) = 0/(1/2) = 0

The conditional expectation of X given Y = 1 is

E[X | Y = 1] = 1 · (1/2) + 2 · (1/2) + 3 · 0 = 3/2
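The discrete recipe — weight each value of X by its conditional probability — is easy to check numerically. Below is a minimal Python sketch; the joint pmf and the helper name conditional_expectation are our own illustrative choices, not part of the lecture:

```python
# Conditional expectation of X given Y = y for a discrete joint pmf,
# stored as a dict mapping (x, y) pairs to probabilities.

def conditional_expectation(joint_pmf, y):
    """E[X | Y = y] = sum over x of x * p_{X|Y=y}(x)."""
    # Marginal probability p_Y(y), obtained by summing over x.
    p_y = sum(p for (x, yy), p in joint_pmf.items() if yy == y)
    if p_y == 0:
        raise ValueError("P(Y = y) is zero; the conditional pmf is undefined")
    # Weighted average of the x values, with conditional weights p(x, y) / p_Y(y).
    return sum(x * p / p_y for (x, yy), p in joint_pmf.items() if yy == y)

# An illustrative pmf: E[X | Y = 1] = 1 * (1/2) + 2 * (1/2) = 1.5
joint_pmf = {(1, 1): 1/4, (2, 1): 1/4, (3, 2): 1/2}
print(conditional_expectation(joint_pmf, 1))  # 1.5
```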
Conditional expectation of a continuous random variable

Let us now tackle the case in which X and Y are continuous random variables, forming a continuous random vector.

The formula for the conditional mean of X given Y = y involves an integral, which can be thought of as the limiting case of the summation found in the discrete case above.

Definition Let X and Y be two continuous random variables. Let R_X be the support of X and let f_{X|Y=y}(x) be the conditional probability density function of X given Y = y. The conditional expectation of X given Y = y is

E[X | Y = y] = ∫_{-∞}^{+∞} x f_{X|Y=y}(x) dx

provided that

∫_{-∞}^{+∞} |x| f_{X|Y=y}(x) dx < ∞
If you do not understand why an integration is required and why the finiteness condition above (absolute integrability) is imposed, you can find an explanation in the lecture entitled Expected value.
Example Let the support of the continuous random vector [X Y] be

R_XY = {(x, y) : 0 ≤ y ≤ x ≤ 1}

and its joint probability density function be

f_XY(x, y) = 2 if (x, y) ∈ R_XY, and f_XY(x, y) = 0 otherwise

Let us compute the conditional probability density function of X given Y = 1/2. When y ∈ [0, 1], the marginal probability density function of Y is

f_Y(y) = ∫_y^1 2 dx = 2(1 − y)

when y ∉ [0, 1], the marginal probability density function is f_Y(y) = 0. Thus, the marginal probability density function of Y is

f_Y(y) = 2(1 − y) if y ∈ [0, 1], and f_Y(y) = 0 otherwise

When evaluated at y = 1/2, it is f_Y(1/2) = 1. The support of X is R_X = [0, 1]. Thus, the conditional probability density function of X given Y = 1/2 is

f_{X|Y=1/2}(x) = f_XY(x, 1/2)/f_Y(1/2) = 2 if x ∈ [1/2, 1], and 0 otherwise

The conditional expected value of X given Y = 1/2 is

E[X | Y = 1/2] = ∫_{1/2}^1 2x dx = 1 − 1/4 = 3/4
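In the continuous case, the same two-step computation (marginalize, then divide and integrate) can be approximated with a midpoint Riemann sum. The density used below, f(x, y) = 2 on the triangle 0 ≤ y ≤ x ≤ 1, is our own illustrative choice:

```python
# Numerical sketch of E[X | Y = y] for a continuous pair via midpoint sums.
# Given Y = 1/2, X is uniform on [1/2, 1], so E[X | Y = 1/2] = 3/4.

def f(x, y):
    """Joint density: 2 on the triangle 0 <= y <= x <= 1, 0 elsewhere."""
    return 2.0 if 0.0 <= y <= x <= 1.0 else 0.0

def conditional_mean(f, y, lo=0.0, hi=1.0, n=50_000):
    h = (hi - lo) / n
    xs = [lo + (i + 0.5) * h for i in range(n)]      # midpoints of the grid
    f_y = sum(f(x, y) for x in xs) * h               # marginal density f_Y(y)
    num = sum(x * f(x, y) for x in xs) * h           # integral of x * f(x, y) dx
    return num / f_y

print(conditional_mean(f, 0.5))  # ≈ 0.75
```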
Conditional expectation in general

The general formula for the conditional expectation of X given Y = y does not require that the two variables form a discrete or a continuous random vector, but is applicable to any random vector.

Definition Let F_{X|Y=y}(x) be the conditional distribution function of X given Y = y. The conditional expectation of X given Y = y is

E[X | Y = y] = ∫ x dF_{X|Y=y}(x)

where the integral is a Riemann-Stieltjes integral and the expected value exists and is well-defined only as long as the integral is well-defined.

The above formula follows the same logic as the formula for the expected value, with the only difference that the unconditional distribution function F_X(x) has now been replaced with the conditional distribution function F_{X|Y=y}(x).
If you are puzzled by these formulae, you can go back to the lecture on the Expected value, which provides an intuitive introduction to the Riemann-Stieltjes integral.
Properties of conditional expectation

From the above sections, it should be clear that the conditional expectation is computed exactly as the expected value, with the only difference that probabilities and probability densities are replaced by conditional probabilities and conditional probability densities.

Therefore, the properties enjoyed by the expected value, such as linearity, are also enjoyed by the conditional expectation.
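Linearity, for instance, says that E[aX + b | Y = y] = a E[X | Y = y] + b. A quick discrete check in Python, with an illustrative pmf and coefficients of our own choosing:

```python
# Checking linearity of the conditional expectation on a small discrete pmf.
joint_pmf = {(1, 0): 0.25, (2, 0): 0.25, (5, 1): 0.5}
y, a, b = 0, 3.0, 1.0

p_y = sum(p for (_, yy), p in joint_pmf.items() if yy == y)          # p_Y(y)
e_x = sum(x * p for (x, yy), p in joint_pmf.items() if yy == y) / p_y        # E[X | Y = y]
e_ax_b = sum((a * x + b) * p for (x, yy), p in joint_pmf.items() if yy == y) / p_y  # E[aX + b | Y = y]

print(e_ax_b == a * e_x + b)  # True
```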
Law of iterated expectations

Before knowing the realization of Y, the conditional expectation of X given Y is unknown and can itself be regarded as a random variable. We denote it by E[X | Y].

In other words, E[X | Y] is a random variable whose realization equals E[X | Y = y] when y is the realization of Y.

This random variable satisfies a very important property, known as the law of iterated expectations (or tower property):

E[E[X | Y]] = E[X]
Proof
For discrete random variables this is proved as follows:

E[E[X | Y]] = Σ_{y ∈ R_Y} E[X | Y = y] p_Y(y)
            = Σ_{y ∈ R_Y} ( Σ_{x ∈ R_X} x p_{X|Y=y}(x) ) p_Y(y)
            = Σ_{x ∈ R_X} x Σ_{y ∈ R_Y} p_{X|Y=y}(x) p_Y(y)
            = Σ_{x ∈ R_X} x Σ_{y ∈ R_Y} p_XY(x, y)
            = Σ_{x ∈ R_X} x p_X(x)
            = E[X]

For continuous random variables the proof is analogous:

E[E[X | Y]] = ∫ E[X | Y = y] f_Y(y) dy
            = ∫ ( ∫ x f_{X|Y=y}(x) dx ) f_Y(y) dy
            = ∫ x ( ∫ f_XY(x, y) dy ) dx
            = ∫ x f_X(x) dx
            = E[X]
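The law of iterated expectations can also be verified numerically. The sketch below averages E[X | Y = y] over the marginal distribution of Y and compares the result with E[X]; the joint pmf is our own illustrative choice:

```python
# Checking E[E[X | Y]] = E[X] on a small discrete joint pmf.
joint_pmf = {(0, 0): 0.2, (1, 0): 0.3, (0, 1): 0.1, (2, 1): 0.4}

# Marginal pmf of Y, obtained by summing the joint pmf over x.
p_Y = {}
for (x, y), p in joint_pmf.items():
    p_Y[y] = p_Y.get(y, 0.0) + p

def cond_exp_X(y):
    """E[X | Y = y] from the joint pmf."""
    return sum(x * p for (x, yy), p in joint_pmf.items() if yy == y) / p_Y[y]

lhs = sum(cond_exp_X(y) * p for y, p in p_Y.items())   # E[E[X | Y]]
rhs = sum(x * p for (x, _), p in joint_pmf.items())    # E[X]
print(lhs, rhs)  # 1.1 1.1
```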
Solved exercises

Below you can find some exercises with explained solutions.
Exercise 1
Let [X Y] be a discrete random vector with support

R_XY = {(0, 0), (1, 0), (1, 1)}

and joint probability mass function

p_XY(0, 0) = 1/2, p_XY(1, 0) = 1/4, p_XY(1, 1) = 1/4

What is the conditional expectation of X given Y = 0?

Solution

Let us compute the conditional probability mass function of X given Y = 0. The marginal probability mass function of Y evaluated at y = 0 is

p_Y(0) = p_XY(0, 0) + p_XY(1, 0) = 1/2 + 1/4 = 3/4

The support of X is R_X = {0, 1}. Thus, the conditional probability mass function of X given Y = 0 is

p_{X|Y=0}(0) = p_XY(0, 0)/p_Y(0) = (1/2)/(3/4) = 2/3
p_{X|Y=0}(1) = p_XY(1, 0)/p_Y(0) = (1/4)/(3/4) = 1/3

The conditional expectation of X given Y = 0 is

E[X | Y = 0] = 0 · (2/3) + 1 · (1/3) = 1/3
Exercise 2
Suppose that [X Y] is a continuous random vector with support

R_XY = [0, 1] × [0, 1]

and joint probability density function

f_XY(x, y) = x + y if (x, y) ∈ R_XY, and f_XY(x, y) = 0 otherwise

Compute the expected value of X conditional on Y = 1/2.

Solution

We first need to compute the conditional probability density function of X given Y = 1/2, by using the formula

f_{X|Y=y}(x) = f_XY(x, y)/f_Y(y)

Note that, by using indicator functions, we can write

f_XY(x, y) = (x + y) 1_{[0,1]}(x) 1_{[0,1]}(y)

The marginal probability density function f_Y(y) is obtained by marginalizing the joint density:

f_Y(y) = ( ∫_0^1 (x + y) dx ) 1_{[0,1]}(y) = (1/2 + y) 1_{[0,1]}(y)

When evaluated at y = 1/2, it is f_Y(1/2) = 1. Furthermore,

f_XY(x, 1/2) = (x + 1/2) 1_{[0,1]}(x)

Thus, the conditional probability density function of X given Y = 1/2 is

f_{X|Y=1/2}(x) = (x + 1/2) 1_{[0,1]}(x)

The conditional expectation of X given Y = 1/2 is

E[X | Y = 1/2] = ∫_0^1 x (x + 1/2) dx = 1/3 + 1/4 = 7/12
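A conditional mean of this kind is easy to sanity-check with a midpoint Riemann sum. The density below, f(x, y) = x + y on the unit square, is our own illustrative choice; for it, E[X | Y = 1/2] = 7/12:

```python
# Midpoint-rule check of a conditional mean for a continuous random vector.

def f(x, y):
    """Joint density: x + y on [0, 1] x [0, 1], 0 elsewhere."""
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

n = 100_000
h = 1.0 / n
xs = [(i + 0.5) * h for i in range(n)]   # midpoints on [0, 1]
y = 0.5
f_y = sum(f(x, y) for x in xs) * h              # marginal f_Y(1/2) = 1
mean = sum(x * f(x, y) for x in xs) * h / f_y   # E[X | Y = 1/2]
print(round(mean, 4))  # 0.5833  (= 7/12)
```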
Exercise 3
Let X and Y be two random variables.

Remember that the variance of X can be computed as

Var[X] = E[X²] − (E[X])²

In a similar manner, the conditional variance of X, given Y, can be defined as

Var[X | Y] = E[X² | Y] − (E[X | Y])²

Use the law of iterated expectations to prove that

Var[X] = E[Var[X | Y]] + Var[E[X | Y]]

Solution

By the law of iterated expectations, E[X²] = E[E[X² | Y]] and E[X] = E[E[X | Y]]. Therefore,

E[Var[X | Y]] = E[E[X² | Y]] − E[(E[X | Y])²] = E[X²] − E[(E[X | Y])²]

and

Var[E[X | Y]] = E[(E[X | Y])²] − (E[E[X | Y]])² = E[(E[X | Y])²] − (E[X])²

Summing the two equations, we obtain

E[Var[X | Y]] + Var[E[X | Y]] = E[X²] − (E[X])² = Var[X]
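The variance decomposition Var[X] = E[Var[X | Y]] + Var[E[X | Y]] can also be verified numerically on a discrete example. The joint pmf below is our own illustrative choice:

```python
# Numerical check of Var[X] = E[Var[X | Y]] + Var[E[X | Y]].
joint_pmf = {(0, 0): 0.2, (1, 0): 0.3, (0, 1): 0.1, (2, 1): 0.4}

# Marginal pmf of Y.
p_Y = {}
for (x, y), p in joint_pmf.items():
    p_Y[y] = p_Y.get(y, 0.0) + p

def cond_moment(y, k):
    """E[X^k | Y = y] from the joint pmf."""
    return sum((x ** k) * p for (x, yy), p in joint_pmf.items() if yy == y) / p_Y[y]

# Unconditional variance: Var[X] = E[X^2] - (E[X])^2.
e_x = sum(x * p for (x, _), p in joint_pmf.items())
e_x2 = sum(x * x * p for (x, _), p in joint_pmf.items())
var_x = e_x2 - e_x ** 2

# E[Var[X | Y]]: average of the conditional variances over p_Y.
exp_cond_var = sum((cond_moment(y, 2) - cond_moment(y, 1) ** 2) * p
                   for y, p in p_Y.items())

# Var[E[X | Y]]: variance of the conditional means over p_Y.
e_cexp = sum(cond_moment(y, 1) * p for y, p in p_Y.items())
var_cexp = sum((cond_moment(y, 1) - e_cexp) ** 2 * p for y, p in p_Y.items())

print(abs(var_x - (exp_cond_var + var_cexp)) < 1e-12)  # True
```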
Please cite as:
Taboga, Marco (2021). "Conditional expectation", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/conditional-expectation.
Source: https://www.statlect.com/fundamentals-of-probability/conditional-expectation