Revision as of 04:18, 11 June 2013


Theorem

Expectation is a linear operator. That is,
$ E[aX + bY] = aE[X] + bE[Y] $
where $ E $ denotes the expectation operator, $ X $ and $ Y $ are random variables with probability density (or mass) functions $ f_X(x) $ and $ f_Y(y) $ respectively, and $ a $ and $ b $ are arbitrary real constants.

Note that the following proof makes no assumption about the independence of $ X $ and $ Y $, so the statement above holds whether or not $ X $ and $ Y $ are independent.
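The theorem (and the independence remark above) can be checked numerically. Below is a minimal Monte Carlo sketch in Python; the constants $ a $, $ b $ and the pair of dependent Gaussian variables are illustrative choices, not part of the proof:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Construct dependent variables: Y is built from X, so X and Y are
# deliberately NOT independent (illustrative choice).
x = rng.normal(loc=2.0, scale=1.0, size=n)
y = 0.5 * x + rng.normal(loc=-1.0, scale=0.5, size=n)

a, b = 3.0, -2.0  # arbitrary real constants

lhs = np.mean(a * x + b * y)           # estimate of E[aX + bY]
rhs = a * np.mean(x) + b * np.mean(y)  # aE[X] + bE[Y]

# Here E[X] = 2 and E[Y] = 0.5*2 - 1 = 0, so both sides should be near 6.
print(lhs, rhs)
```

The two estimates agree (up to floating-point summation order), and both converge to the analytic value $ aE[X] + bE[Y] $ as the sample size grows, even though $ X $ and $ Y $ are dependent.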



Proof

If $ X $ and $ Y $ are two continuous random variables with joint probability density function $ f_{X,Y}(x,y) $, then by the definition of the expected value,
$ \begin{align} E[aX + bY] &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}(ax + by)f_{X,Y}(x,y)dxdy \\ &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}[axf_{X,Y}(x,y) + byf_{X,Y}(x,y)] dxdy \\ &= a\int_{-\infty}^{\infty}x[\int_{-\infty}^{\infty}f_{X,Y}(x,y)dy]dx + b\int_{-\infty}^{\infty}y[\int_{-\infty}^{\infty}f_{X,Y}(x,y)dx]dy \end{align} $
where the last step interchanges the order of integration in the second term, which is justified by Fubini's theorem.

By the definition of the marginal probability density function, we know that
$ \int_{-\infty}^{\infty}f_{X,Y}(x,y)dy = f_X(x) $
and similarly,
$ \int_{-\infty}^{\infty}f_{X,Y}(x,y)dx = f_Y(y) $

Therefore,
$ \begin{align} E[aX + bY] &= a\int_{-\infty}^{\infty}x[\int_{-\infty}^{\infty}f_{X,Y}(x,y)dy]dx + b\int_{-\infty}^{\infty}y[\int_{-\infty}^{\infty}f_{X,Y}(x,y)dx]dy \\ &= a\int_{-\infty}^{\infty}xf_X(x)dx + b\int_{-\infty}^{\infty}yf_Y(y)dy \\ &= aE[X]+bE[Y] \quad \blacksquare \end{align} $
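The continuous-case computation above can be mirrored numerically. The sketch below uses a hypothetical joint density $ f(x,y) = x + y $ on the unit square (chosen only for illustration; it integrates to 1, and $ X $ and $ Y $ are dependent since it does not factor into a product of marginals), with plain midpoint Riemann sums standing in for the integrals:

```python
N = 400            # grid resolution for the midpoint Riemann sums
h = 1.0 / N
a, b = 3.0, -2.0   # arbitrary real constants

f = lambda x, y: x + y   # hypothetical joint density on [0,1]^2
fX = lambda x: x + 0.5   # marginal: integral of f over y in [0, 1]
fY = lambda y: y + 0.5   # marginal: integral of f over x in [0, 1]

xs = [(i + 0.5) * h for i in range(N)]  # midpoints of each cell

# E[aX + bY] via the double integral from the proof.
lhs = sum((a * x + b * y) * f(x, y) * h * h for x in xs for y in xs)

# aE[X] + bE[Y] via the marginal densities.
EX = sum(x * fX(x) * h for x in xs)
EY = sum(y * fY(y) * h for y in xs)
rhs = a * EX + b * EY

# For this density E[X] = E[Y] = 7/12, so both sides should be near
# (a + b) * 7/12 = 7/12.
print(round(lhs, 6), round(rhs, 6))
```

Both sums agree with each other and, up to discretization error of order $ h^2 $, with the analytic value $ aE[X] + bE[Y] $.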


For the case where $ X $ and $ Y $ are two discrete random variables with joint probability mass function $ f_{X,Y}(x,y) $, the proof is analogous. By the definition of the expected value, we have that
$ \begin{align} E[aX + bY] &= \sum_x \sum_y(ax + by)f_{X,Y}(x,y) \\ &= \sum_x \sum_y [axf_{X,Y}(x,y) + byf_{X,Y}(x,y)] \\ &= a\sum_x x\sum_y f_{X,Y}(x,y) + b\sum_y y \sum_x f_{X,Y}(x,y) \end{align} $

By the definition of the marginal probability mass function, we know that
$ \sum_y f_{X,Y}(x,y) = f_X(x) \ $
and similarly,
$ \sum_x f_{X,Y}(x,y) = f_Y(y) \ $

Therefore,
$ \begin{align} E[aX + bY] &= a\sum_x x\sum_y f_{X,Y}(x,y) + b\sum_y y \sum_x f_{X,Y}(x,y) \\ &= a\sum_x xf_X(x) + b\sum_y yf_Y(y) \\ &= aE[X]+bE[Y] \quad \blacksquare \end{align} $
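The discrete case can be verified exactly with a small, hypothetical joint pmf (the 2x2 table below is an illustrative choice in which $ X $ and $ Y $ are dependent, since $ P(X=0, Y=0) = 1/8 \neq P(X=0)P(Y=0) = 1/4 $):

```python
from fractions import Fraction as F

a, b = F(3), F(-2)  # arbitrary constants, kept exact with Fraction

# Hypothetical joint pmf of (X, Y) on a 2x2 support.
pmf = {
    (0, 0): F(1, 8), (0, 1): F(3, 8),
    (1, 0): F(3, 8), (1, 1): F(1, 8),
}
assert sum(pmf.values()) == 1  # valid pmf

# E[aX + bY] directly from the joint pmf (the double sum in the proof).
lhs = sum((a * x + b * y) * p for (x, y), p in pmf.items())

# Marginals by summing out the other variable, then aE[X] + bE[Y].
fX = {x: sum(p for (xx, y), p in pmf.items() if xx == x) for x in (0, 1)}
fY = {y: sum(p for (x, yy), p in pmf.items() if yy == y) for y in (0, 1)}
EX = sum(x * p for x, p in fX.items())
EY = sum(y * p for y, p in fY.items())
rhs = a * EX + b * EY

print(lhs, rhs)  # both print 1/2
```

Because the arithmetic is done with exact fractions, the two sides match exactly, not merely to floating-point tolerance.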


