Back to all ECE 600 notes
Previous Topic: Functions of a Random Variable
Next Topic: Characteristic Functions
The Comer Lectures on Random Variables and Signals
Topic 9: Expectation
Thus far, we have learned how to represent the probabilistic behavior of a random variable X using the density function $ f_X $ or the mass function $ p_X $.
Sometimes, we want to describe X probabilistically using only a small number of parameters. The expectation is often used to do this.
Definition $ \qquad $ the expected value of continuous random variable X is defined as

$ E[X] = \int_{-\infty}^{\infty}xf_X(x)dx $
Definition $ \qquad $ the expected value of discrete random variable X is defined as

$ E[X] = \sum_{x\in\mathcal R_X}xp_X(x) $

where $ \mathcal R_X $ is the range space of X.
Note:
- E[X] is also known as the mean of X. Other notations for E[X] include:

$ EX,\;\overline{X},\;m_X,\;\mu_X $
- The equation defining E[X] for discrete X could have been derived from that for continuous X, using a density function $ f_X $ containing $ \delta $-functions.
Example $ \qquad $ X is an exponential random variable. Find E[X].

$ E[X] = \int_0^{\infty}x\lambda e^{-\lambda x}dx = \frac{1}{\lambda} $

Let $ \mu = 1/\lambda $. We often write

$ E[X] = \mu $
Example $ \qquad $ X is a uniform discrete random variable with $ \mathcal R_X $ = {1,...,n}. Then,

$ \begin{align}E[X]&=\sum_{k=1}^n k\frac{1}{n} \\ &=\frac{1}{n}\sum_{k=1}^n k \\ &=\frac{1}{n}\left(\frac{1}{2}\right)(n)(n+1) \\ &=\frac{n+1}{2}\end{align} $
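As a quick sanity check, the sum above can be evaluated numerically. A minimal sketch (the choice n = 10 is arbitrary, not from the notes):

```python
# Numerical check of the uniform discrete example: E[X] = (n+1)/2.
n = 10
p = 1.0 / n                          # p_X(k) = 1/n for k = 1, ..., n
EX = sum(k * p for k in range(1, n + 1))
print(EX, (n + 1) / 2)               # both print 5.5
```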
Having defined E[X], we will now consider the more general E[g(X)] for a function g:R → R.
Let Y = g(X). What is E[Y]? From the previous definitions:

$ E[Y]=\int_{-\infty}^{\infty}yf_Y(y)dy $

or

$ E[Y] = \sum_{y\in\mathcal R_Y}yp_Y(y) $
We can find the expectation of Y by first finding $ f_Y $ or $ p_Y $ in terms of g and $ f_X $ or $ p_X $. Alternatively, it can be shown that

$ E[Y]=E[g(X)]=\int_{-\infty}^{\infty}g(x)f_X(x)dx $

or

$ E[Y] = E[g(X)]=\sum_{x\in\mathcal R_X}g(x)p_X(x) $
See Papoulis for a proof of the above.
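The shortcut can be checked numerically in a discrete case. The sketch below uses a made-up pmf and g(x) = x² (not an example from the notes), computing E[Y] both by deriving $ p_Y $ and by summing g(x)$ p_X $(x) directly:

```python
from collections import Counter

# Two ways to compute E[g(X)] for g(x) = x^2, with X uniform on {-2,...,2}.
R_X = [-2, -1, 0, 1, 2]
p_X = {x: 1 / len(R_X) for x in R_X}
g = lambda x: x ** 2

# Method 1: derive the pmf of Y = g(X), then sum y * p_Y(y) over R_Y.
p_Y = Counter()
for x in R_X:
    p_Y[g(x)] += p_X[x]
EY_direct = sum(y * q for y, q in p_Y.items())

# Method 2: the shortcut E[g(X)] = sum of g(x) * p_X(x) over R_X.
EY_shortcut = sum(g(x) * p_X(x) for x in R_X)

print(EY_direct, EY_shortcut)    # both equal 2.0
```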
Two important cases of functions g:
- g(x) = x. Then E[g(X)] = E[X]
- g(x) = (x - $ \mu_X)^2 $. Then E[g(X)] = E[(X - $ \mu_X)^2 $]

$ E[g(X)] = \int_{-\infty}^{\infty}(x-\mu_X)^2f_X(x)dx $

or

$ E[g(X)] = \sum_{x\in\mathcal R_X}(x-\mu_X)^2p_X(x) $
Note: $ \qquad $ E[(X - $ \mu_X)^2 $] is called the variance of X and is often denoted $ \sigma_X^2 $. The positive square root, denoted $ \sigma_X $, is called the standard deviation of X.
Important property of E[]:
Let $ g_1 $:R → R; $ g_2 $:R → R; $ \alpha,\beta $ ∈ R. Then

$ E[\alpha g_1(X) +\beta g_2(X)] = \alpha E[g_1(X)]+\beta E[g_2(X)] $
So E[] is a linear operator. The proof follows from the linearity of integration.
Important property of Var():

$ Var(X) = E[X^2]-\mu_X^2 $
Proof:

$ \begin{align}E[(X-\mu_X)^2]&=E[X^2-2X\mu_X+\mu_X^2] \\ &=E[X^2]-2\mu_XE[X]+E[\mu_X^2] \\ &=E[X^2]-2\mu_X^2+\mu_X^2 \\ &=E[X^2]-\mu_X^2\end{align} $
Example $ \qquad $ X is Gaussian N($ \mu,\sigma^2 $). Find E[X] and Var(X).

$ E[X] = \int_{-\infty}^{\infty}\frac{x}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^2}{2\sigma^2}}dx $
Let r = x - $ \mu $. Then

$ E[X] = \int_{-\infty}^{\infty}\frac{r}{\sqrt{2\pi}\sigma}e^{-\frac{r^2}{2\sigma^2}}dr+\mu\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{r^2}{2\sigma^2}}dr $
First term: Integrating an odd function over (-∞,∞) ⇒ first term is 0.
Second term: Integrating a Gaussian pdf over (-∞,∞) gives one ⇒ second term is $ \mu $.
So E[X] = $ \mu $
To find Var(X), first compute

$ E[X^2] = \int_{-\infty}^{\infty}\frac{x^2}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^2}{2\sigma^2}}dx $

Using integration by parts, we see that this integral evaluates to $ \sigma^2+\mu^2 $. So,

$ Var(X) = \sigma^2+\mu^2-\mu^2 = \sigma^2 $
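Both results are easy to confirm by simulation. A minimal sketch (the values of $ \mu $, $ \sigma $, and the sample size are arbitrary choices):

```python
import numpy as np

# Monte Carlo check that E[X] = mu and Var(X) = sigma^2 for X ~ N(mu, sigma^2).
rng = np.random.default_rng(0)
mu, sigma = 3.0, 2.0
x = rng.normal(mu, sigma, size=1_000_000)
print(x.mean())        # close to mu = 3.0
print(x.var())         # close to sigma^2 = 4.0
```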
Example $ \qquad $ X is Poisson with parameter $ \lambda $. Find E[X] and Var(X).

$ \begin{align}E[X] &= \sum_{k=0}^{\infty}k\frac{e^{-\lambda}\lambda^k}{k!} \\ &= \sum_{k=1}^{\infty}\frac{e^{-\lambda}\lambda^k}{(k-1)!} \\ &= \lambda\sum_{k=0}^{\infty}e^{-\lambda}\frac{\lambda^k}{k!} \\ &= \lambda\end{align} $
A similar manipulation, writing $ k^2 = k(k-1)+k $, gives

$ E[X^2] = \lambda^2 +\lambda $

$ \Rightarrow Var(X) = \lambda^2 +\lambda - \lambda^2 = \lambda $
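Both results can also be checked by summing the Poisson pmf series directly. A sketch (truncated at k = 100, where the terms are negligible; the value of $ \lambda $ is arbitrary):

```python
import math

# Check E[X] = lambda and Var(X) = lambda for X ~ Poisson(lam) by summing
# the pmf series directly.
lam = 4.0
pk = math.exp(-lam)        # p_X(0) = e^(-lam)
EX = EX2 = 0.0
for k in range(1, 101):
    pk *= lam / k          # recursion: p_X(k) = p_X(k-1) * lam / k
    EX += k * pk
    EX2 += k * k * pk
print(EX, EX2 - EX ** 2)   # both very close to lam = 4.0
```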
Moments
Moments generalize mean and variance to nth order expectations.
Definition $ \qquad $ the nth moment of random variable X is

$ \mu_n\equiv E[X^n]=\int_{-\infty}^{\infty}x^nf_X(x)dx\quad n=1,2,... $

and the nth central moment of X is

$ v_n\equiv E[(X-\mu_X)^n]=\int_{-\infty}^{\infty}(x-\mu_X)^nf_X(x)dx\quad n=1,2,... $
So
- $ \mu_1 $ = E[X] mean
- $ \mu_2 $ = E[$ X^2 $] mean-square
- $ v_2 $ = Var(X) variance
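The defining integral can be evaluated numerically for any given density. A sketch for the exponential density with mean $ \mu $ = 2 (hypothetical values, not from the notes; for this density the nth moment works out to $ n!\mu^n $):

```python
from math import exp, factorial
from scipy.integrate import quad

# Evaluate the defining integral mu_n = E[X^n] for an exponential density
# with mean mu, and compare against the closed form n! * mu^n.
mu = 2.0
f_X = lambda x: exp(-x / mu) / mu
for n in range(1, 5):
    moment, _ = quad(lambda x, n=n: x ** n * f_X(x), 0, float("inf"))
    print(n, moment, factorial(n) * mu ** n)
```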
Conditional Expectation
For an event M ∈ $ \mathcal F $ with P(M) > 0, the conditional expectation of X given M is

$ E[X|M]=\int_{-\infty}^{\infty}xf_X(x|M)dx $

or

$ E[X|M]=\sum_{x\in\mathcal R_X}xp_X(x|M) $
Example $ \qquad $ X is an exponential random variable. Let M = {X > $ \mu $}. Find E[X|M]. Note that P(M) = P(X > $ \mu $), and since $ \mu $ > 0,

$ P(M)=\int_{\mu}^{\infty}\frac{1}{\mu}e^{-\frac{x}{\mu}}dx=e^{-1}>0 $
It can be shown that

$ f_X(x|X>\mu)=\frac{1}{\mu}e^{-\frac{x-\mu}{\mu}}\quad x>\mu $
Then,

$ \begin{align}E[X|X>\mu] &=\int_{\mu}^{\infty}\frac{x}{\mu}e^{-\frac{x-\mu}{\mu}}dx \\ &=2\mu\end{align} $
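This conditional mean is easy to check by simulation: draw exponential samples and keep only those exceeding $ \mu $. A minimal sketch (the value of $ \mu $ and the sample size are arbitrary):

```python
import numpy as np

# Monte Carlo check of E[X | X > mu] = 2*mu for an exponential random variable:
# condition on {X > mu} by discarding samples that fall at or below mu.
rng = np.random.default_rng(0)
mu = 1.5
x = rng.exponential(mu, size=1_000_000)
print(x[x > mu].mean())    # close to 2 * mu = 3.0
```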
References
- M. Comer. ECE 600. Class Lecture. Random Variables and Signals. Faculty of Electrical Engineering, Purdue University. Fall 2013.
Questions and comments
If you have any questions, comments, etc. please post them on this page
Back to all ECE 600 notes
Previous Topic: Functions of a Random Variable
Next Topic: Characteristic Functions