Random Variables and Signals
Topic 14: Joint Expectation
Joint Expectation
Given random variables X and Y, let Z = g(X,Y) for some $g:\mathbb{R}^2\rightarrow\mathbb{R}$. Then E[Z] can be computed using $f_Z(z)$ or $p_Z(z)$ in the original definition of $E[\cdot]$. Or, we can use

$$E[Z] = E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\,f_{XY}(x,y)\,dx\,dy$$

in the continuous case, or

$$E[Z] = E[g(X,Y)] = \sum_{x}\sum_{y} g(x,y)\,p_{XY}(x,y)$$

in the discrete case.
The proof is in Papoulis.
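As a quick illustration (not part of the original notes), the sketch below evaluates E[g(X,Y)] for a small made-up discrete joint pmf in both ways: via the pmf of Z = g(X,Y) and via the double sum over $p_{XY}$. The pmf values and the choice of g are assumptions made purely for this example.

```python
import numpy as np

# Hypothetical joint pmf p_XY on x in {0,1,2}, y in {0,1} (rows: x, columns: y).
p_XY = np.array([[0.10, 0.20],
                 [0.25, 0.15],
                 [0.20, 0.10]])
x_vals = [0, 1, 2]
y_vals = [0, 1]

g = lambda x, y: (x - y) ** 2        # example g: R^2 -> R

# Method 1: double sum  E[g(X,Y)] = sum_x sum_y g(x,y) p_XY(x,y)
E_double_sum = sum(g(x, y) * p_XY[i, j]
                   for i, x in enumerate(x_vals)
                   for j, y in enumerate(y_vals))

# Method 2: first build the pmf of Z = g(X,Y), then use the 1-D definition of E[Z]
p_Z = {}
for i, x in enumerate(x_vals):
    for j, y in enumerate(y_vals):
        z = g(x, y)
        p_Z[z] = p_Z.get(z, 0.0) + p_XY[i, j]
E_from_pZ = sum(z * p for z, p in p_Z.items())

print(E_double_sum, E_from_pZ)       # both print 1.35
```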
We will use joint expectation to define some important moments that help characterize the joint behavior of X and Y.
Note that joint expectation is a linear operator, so if $g_1,\ldots,g_n$ are n functions from $\mathbb{R}^2$ to $\mathbb{R}$ and $a_1,\ldots,a_n$ are constants, then

$$E\!\left[\sum_{i=1}^{n} a_i\,g_i(X,Y)\right] = \sum_{i=1}^{n} a_i\,E[g_i(X,Y)].$$
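A minimal Monte Carlo check of this linearity property, using an arbitrary made-up joint distribution and two made-up functions $g_1$, $g_2$ (all assumptions of this sketch, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=100_000)
Y = 0.5 * X + rng.normal(size=100_000)   # some dependence on X, just for variety

g1 = lambda x, y: x * y
g2 = lambda x, y: np.cos(x + y)
a1, a2 = 2.0, -3.0

# Sample means estimate the expectations; linearity holds term by term.
lhs = np.mean(a1 * g1(X, Y) + a2 * g2(X, Y))
rhs = a1 * np.mean(g1(X, Y)) + a2 * np.mean(g2(X, Y))
print(lhs, rhs)                          # agree up to floating-point round-off
```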
Important Moments for X,Y
We still need $\mu_X$, $\mu_Y$, $\sigma_X^2$, and $\sigma_Y^2$, the means and variances of X and Y. Other moments that are of great interest are:
- The correlation between X and Y:
  $$R_{XY} = E[XY]$$
- The covariance of X and Y:
  $$\mathrm{Cov}(X,Y) = E[(X-\mu_X)(Y-\mu_Y)] = E[XY] - \mu_X\mu_Y$$
- The correlation coefficient of X and Y:
  $$r_{XY} = \frac{\mathrm{Cov}(X,Y)}{\sigma_X\sigma_Y}$$
- Note:
  - $|r_{XY}| \leq 1$ (this follows from the Cauchy-Schwarz inequality below).
  - If X and Y are independent, then $r_{XY} = 0$. The converse is not true in general (see the numerical sketch after this list).
- If $r_{XY} = 0$, then X and Y are said to be uncorrelated. It can be shown that
  - X and Y are uncorrelated iff $\mathrm{Cov}(X,Y) = 0$.
  - X and Y are uncorrelated iff $E[XY] = \mu_X\mu_Y$.
- X and Y are orthogonal if $E[XY] = 0$.
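The following numerical sketch (an illustration added here, not part of the original notes) estimates the correlation, covariance, and correlation coefficient from samples of a made-up pair (X, Y), and also shows the classic example of random variables that are uncorrelated yet clearly not independent (X symmetric about 0 and Y = X²).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sample-based estimates of the moments above for one made-up pair (X, Y).
X = rng.normal(loc=1.0, scale=2.0, size=n)
Y = 3.0 * X + rng.normal(size=n)                    # Y depends on X

R_XY = np.mean(X * Y)                               # correlation  E[XY]
cov  = np.mean((X - X.mean()) * (Y - Y.mean()))     # Cov(X,Y)
r_XY = cov / (X.std() * Y.std())                    # correlation coefficient
print(R_XY, cov, r_XY)                              # note |r_XY| <= 1

# Uncorrelated but not independent: X2 symmetric about 0, Y2 = X2^2.
X2 = rng.normal(size=n)
Y2 = X2 ** 2                                        # fully determined by X2
print(np.mean(X2 * Y2) - X2.mean() * Y2.mean())     # Cov estimate is near 0
```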
The Cauchy-Schwarz Inequality
For random variables X and Y,

$$|E[XY]| \leq \sqrt{E[X^2]\,E[Y^2]},$$

with equality iff $Y = a_0X$ with probability 1, where $a_0$ is a constant. Note that "equality with probability 1" will be defined later.
Proof: We start by considering

$$E[(aX - Y)^2] = a^2E[X^2] - 2aE[XY] + E[Y^2] \;\geq\; 0,$$

which is a quadratic function of $a \in \mathbb{R}$. Consider two cases:
- (i) $E[(aX-Y)^2] > 0$ for all $a \in \mathbb{R}$
- (ii) $E[(aX-Y)^2] = 0$ for some $a \in \mathbb{R}$
Case (i):
Since this quadratic is strictly greater than 0 for all a, it has no real roots; its two roots must be complex. From the quadratic formula, this means the discriminant is negative:

$$(2E[XY])^2 - 4E[X^2]E[Y^2] < 0 \quad\Rightarrow\quad |E[XY]| < \sqrt{E[X^2]\,E[Y^2]}.$$
Case (ii):
In this case the quadratic attains the value 0 for some a, so it has a real root; since $E[(aX-Y)^2] \geq 0$ for every a, that root is repeated and the discriminant is zero:

$$(2E[XY])^2 - 4E[X^2]E[Y^2] = 0 \quad\Rightarrow\quad |E[XY]| = \sqrt{E[X^2]\,E[Y^2]}.$$

This also means that $\exists\, a_0 \in \mathbb{R}$ such that

$$E[(a_0X - Y)^2] = 0.$$
It can be shown that if a random variable W has $E[W^2] = 0$, then $W = 0$ except possibly on a set of probability 0. Note that previously, we defined equality between random variables X and Y to mean

$$X(\omega) = Y(\omega) \quad \forall\,\omega \in \mathcal{S}.$$

Using the weaker notion of equality except on a set of probability 0 (equality "with probability 1"), we then have that

$$Y = a_0X \quad \text{with probability 1},$$

which is the equality condition stated above. ∎
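As a sanity check (an added illustration with made-up distributions), the sketch below estimates both sides of the Cauchy-Schwarz inequality: a generic pair (X, Y) gives a strict inequality as in case (i), while Y = a₀X gives equality as in case (ii).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Generic case: |E[XY]| is strictly less than sqrt(E[X^2] E[Y^2])  (case (i)).
X = rng.uniform(-1.0, 1.0, size=n)
Y = X + rng.normal(size=n)                   # not a deterministic multiple of X
lhs = abs(np.mean(X * Y))
rhs = np.sqrt(np.mean(X ** 2) * np.mean(Y ** 2))
print(lhs, rhs, lhs < rhs)

# Equality case: Y = a0 * X with probability 1  (case (ii)).
a0 = -2.5
Y_eq = a0 * X
lhs_eq = abs(np.mean(X * Y_eq))
rhs_eq = np.sqrt(np.mean(X ** 2) * np.mean(Y_eq ** 2))
print(lhs_eq, rhs_eq)                        # equal up to floating-point error
```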