Random Variables and Signals
Topic 9: Expectation
Thus far, we have learned how to represent the probabilistic behavior of a random variable X using the density function f$ _X $ or the mass function p$ _X $.
Sometimes, we want to describe X probabilistically using only a small number of parameters. The expectation is often used to do this.
Definition $ \qquad $ the expected value of a continuous random variable X is defined as
$ E[X] = \int_{-\infty}^{\infty}xf_X(x)dx $
Definition $ \qquad $ the expected value of a discrete random variable X is defined as
$ E[X] = \sum_{x\in\mathcal R_X}xp_X(x) $
where $ R_X $ is the range space of X.
Note:
- E[X] is also known as the mean of X. Other notation for E[X] includes $ \mu_X $.
- The equation defining E[X] for discrete X could have been derived from the continuous case, using a density function f$ _X $ containing $ \delta $-functions.
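A short numerical illustration of both definitions (a minimal Python sketch; the Uniform(0,2) density and fair-die mass function are assumed examples, not from the lecture):

import numpy as np

# Continuous case: E[X] = integral of x f_X(x) dx, approximated by a Riemann sum.
# Assumed example density: Uniform(0, 2), so f_X(x) = 1/2 on [0, 2].
dx = 1e-5
x = np.arange(0.0, 2.0, dx)
f_X = np.full_like(x, 0.5)
print(np.sum(x * f_X) * dx)    # ~1.0, the mean of Uniform(0, 2)

# Discrete case: E[X] = sum of x p_X(x) over the range space R_X.
# Assumed example mass function: a fair six-sided die.
x_vals = np.arange(1, 7)
p_X = np.full(6, 1.0 / 6.0)
print(np.sum(x_vals * p_X))    # 3.5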
Example $ \qquad $ X is an exponential random variable with parameter $ \lambda $. Find E[X].
$ E[X] = \int_{0}^{\infty}x\lambda e^{-\lambda x}dx = \frac{1}{\lambda} $
Let $ \mu = 1/\lambda $. We often write
$ f_X(x) = \frac{1}{\mu}e^{-\frac{x}{\mu}}u(x) $
so that E[X] = $ \mu $.
Example $ \qquad $ X is a uniform discrete random variable with $ R_X $ = {1,...,n}. Then,
$ E[X] = \sum_{k=1}^{n}k\frac{1}{n} = \frac{1}{n}\cdot\frac{n(n+1)}{2} = \frac{n+1}{2} $
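Both examples can be checked by simulation (a minimal sketch; the sample sizes, seed, and parameter values are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)

# Exponential with rate lambda: the sample mean should approach 1/lambda = mu.
lam = 2.0
print(rng.exponential(scale=1.0 / lam, size=1_000_000).mean())   # ~0.5

# Discrete uniform on {1,...,n}: the sample mean should approach (n+1)/2.
n = 10
print(rng.integers(1, n + 1, size=1_000_000).mean())             # ~5.5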
Having defined E[X], we will now consider the more general quantity E[g(X)] for a function g:R → R.
Let Y = g(X). What is E[Y]? From the previous definitions,
$ E[Y] = \int_{-\infty}^{\infty}yf_Y(y)dy $
or
$ E[Y] = \sum_{y\in\mathcal R_Y}yp_Y(y) $
We can find this by first finding f$ _Y $ or p$ _Y $ in terms of g and f$ _X $ or p$ _X $. Alternatively, it can be shown that
$ E[g(X)] = \int_{-\infty}^{\infty}g(x)f_X(x)dx $
or
$ E[g(X)] = \sum_{x\in\mathcal R_X}g(x)p_X(x) $
See Papoulis for the proof of the above.
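The point of this result is that E[g(X)] can be computed from f$ _X $ directly, without ever deriving f$ _Y $. A small Python check (the choice X ~ N(0,1) with g(x) = x$ ^2 $ is an assumed example):

import numpy as np

rng = np.random.default_rng(1)

# Assumed example: X ~ N(0, 1), g(x) = x^2, so E[g(X)] = E[X^2] = 1.
samples = rng.standard_normal(2_000_000)

# Route 1: average samples of Y = g(X) directly.
e_y = np.mean(samples**2)

# Route 2: integrate g(x) f_X(x) dx by a Riemann sum, never touching f_Y.
dx = 1e-4
x = np.arange(-10.0, 10.0, dx)
f_X = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
e_g = np.sum(x**2 * f_X) * dx

print(e_y, e_g)   # both ~1.0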
Two important cases of functions g:
- g(x) = x. Then E[g(X)] = E[X].
- g(x) = $ (x-\mu_X)^2 $. Then E[g(X)] = E[$ (X-\mu_X)^2 $], where
$ E[(X-\mu_X)^2] = \int_{-\infty}^{\infty}(x-\mu_X)^2f_X(x)dx $
or
$ E[(X-\mu_X)^2] = \sum_{x\in\mathcal R_X}(x-\mu_X)^2p_X(x) $
Note: $ \qquad $ E[$ (X-\mu_X)^2 $] is called the variance of X and is often denoted $ \sigma_X^2 $. $ \sigma_X $ is called the standard deviation of X.
Important property of E[]:
Let g$ _1 $:R → R; g$ _2 $:R → R; and $ \alpha,\beta $ ∈ R. Then
$ E[\alpha g_1(X)+\beta g_2(X)] = \alpha E[g_1(X)]+\beta E[g_2(X)] $
So E[] is a linear operator. The proof follows from the linearity of integration.
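Linearity is easy to see on samples as well (a sketch; g$ _1 $, g$ _2 $, $ \alpha $, $ \beta $ below are arbitrary choices):

import numpy as np

rng = np.random.default_rng(2)

# Sample averages inherit the linearity of E[]: the two sides agree
# exactly (up to floating-point roundoff) on any fixed set of samples.
x = rng.exponential(scale=1.0, size=1_000_000)
a, b = 3.0, -2.0
g1, g2 = np.sin(x), x**2

print(np.mean(a * g1 + b * g2))             # LHS
print(a * np.mean(g1) + b * np.mean(g2))    # RHS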
Important property of Var():
$ Var(X) = E[X^2]-E[X]^2 $
Proof:
$ Var(X) = E[(X-\mu_X)^2] = E[X^2-2\mu_XX+\mu_X^2] = E[X^2]-2\mu_XE[X]+\mu_X^2 = E[X^2]-\mu_X^2 $
using the linearity of E[] and the fact that E[X] = $ \mu_X $.
Example $ \qquad $ X is Gaussian N($ \mu,\sigma^2 $). Find E[X] and Var(X).
$ E[X] = \int_{-\infty}^{\infty}\frac{x}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^2}{2\sigma^2}}dx $
Let r = x - $ \mu $. Then
$ E[X] = \int_{-\infty}^{\infty}\frac{r+\mu}{\sqrt{2\pi}\sigma}e^{-\frac{r^2}{2\sigma^2}}dr = \int_{-\infty}^{\infty}\frac{r}{\sqrt{2\pi}\sigma}e^{-\frac{r^2}{2\sigma^2}}dr+\mu\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{r^2}{2\sigma^2}}dr $
First term: Integrating an odd function over (-∞,∞) ⇒ first term is 0.
Second term: Integrating a Gaussian pdf over (-∞,∞) gives one ⇒ second term is $ \mu $.
So E[X] = $ \mu $
Next,
$ E[X^2] = \int_{-\infty}^{\infty}\frac{x^2}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^2}{2\sigma^2}}dx $
Using integration by parts, we see that this integral evaluates to $ \sigma^2+\mu^2 $. So,
$ Var(X) = E[X^2]-E[X]^2 = \sigma^2+\mu^2-\mu^2 = \sigma^2 $
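A simulation check of the Gaussian result (a minimal sketch; $ \mu $ = 1.5 and $ \sigma $ = 2 are arbitrary):

import numpy as np

rng = np.random.default_rng(3)

# Sample mean and sample variance of N(mu, sigma^2) draws.
mu, sigma = 1.5, 2.0
samples = rng.normal(loc=mu, scale=sigma, size=1_000_000)
print(samples.mean())   # ~1.5 = mu
print(samples.var())    # ~4.0 = sigma^2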
Example $ \qquad $ X is Poisson with parameter $ \lambda $. Find E[X] and Var(X).
$ E[X] = \sum_{k=0}^{\infty}k\frac{\lambda^ke^{-\lambda}}{k!} = \lambda e^{-\lambda}\sum_{k=1}^{\infty}\frac{\lambda^{k-1}}{(k-1)!} = \lambda e^{-\lambda}e^{\lambda} = \lambda $
So, writing $ k^2 = k(k-1)+k $ and summing each piece,
$ E[X^2] = \lambda^2+\lambda $
$ \Rightarrow Var(X) = E[X^2]-E[X]^2 = \lambda^2+\lambda-\lambda^2 = \lambda $
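And a check of the Poisson result (same kind of sketch; $ \lambda $ = 3 is arbitrary):

import numpy as np

rng = np.random.default_rng(4)

# For Poisson(lambda), both the sample mean and the sample variance
# should approach lambda.
lam = 3.0
samples = rng.poisson(lam=lam, size=1_000_000)
print(samples.mean())   # ~3.0
print(samples.var())    # ~3.0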
Moments
Moments generalize mean and variance to nth order expectations.
Definition $ \qquad $ the nth order moment of random variable X is
$ \mu_n\equiv E[X^n]=\int_{-\infty}^{\infty}x^nf_X(x)dx\qquad n=1,2,... $
and the nth central moment of X is
$ v_n\equiv E[(X-\mu_X)^n] = \int_{-\infty}^{\infty}(x-\mu_X)^nf_X(x)dx\qquad n = 2,3,... $
So
- $ \mu_1 $ = E[X], the mean
- $ \mu_2 $ = E[X$ ^2 $], the mean-square
- $ v_2 $ = Var(X), the variance
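Empirical moments follow the same pattern, replacing expectations with sample averages (a sketch; the N(2,1) example is assumed):

import numpy as np

rng = np.random.default_rng(5)

samples = rng.normal(loc=2.0, scale=1.0, size=1_000_000)

mu_1 = np.mean(samples)              # 1st moment: mean, ~2.0
mu_2 = np.mean(samples**2)           # 2nd moment: mean-square, ~5.0
v_2 = np.mean((samples - mu_1)**2)   # 2nd central moment: variance, ~1.0
v_3 = np.mean((samples - mu_1)**3)   # 3rd central moment, ~0 by symmetry
print(mu_1, mu_2, v_2, v_3)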
Conditional Expectation
For an event M ∈ F with P(M) > 0,
$ E[g(X)|M] = \int_{-\infty}^{\infty}g(x)f_X(x|M)dx $
or
$ E[g(X)|M] = \sum_{x\in\mathcal R_X}g(x)p_X(x|M) $
Example $ \qquad $ X is an exponential random variable. Let M = {X > $ \mu $}. Find E[X|M]. Note that since $ \mu $ > 0,
$ P(M) = P(X>\mu) = \int_{\mu}^{\infty}\frac{1}{\mu}e^{-\frac{x}{\mu}}dx > 0 $
It can be shown that
$ f_X(x|X>\mu) = \frac{1}{\mu}e^{-\frac{x-\mu}{\mu}}u(x-\mu) $
Then,
$ E[X|X>\mu] = \int_{\mu}^{\infty}\frac{x}{\mu}e^{-\frac{x-\mu}{\mu}}dx = 2\mu $
Fig 1: Conditional Expectation
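A Monte Carlo check of this example (a sketch; $ \mu $ = 1 and the sample size are arbitrary). Conditioning on M = {X > $ \mu $} amounts to keeping only the samples that land in M:

import numpy as np

rng = np.random.default_rng(6)

# X ~ exponential with mean mu; estimate E[X | X > mu] by averaging
# only the samples that satisfy the conditioning event.
mu = 1.0
samples = rng.exponential(scale=mu, size=2_000_000)
print(samples[samples > mu].mean())   # ~2.0 = 2*mu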
References
- M. Comer. ECE 600. Class Lecture. Random Variables and Signals. Faculty of Electrical Engineering, Purdue University. Fall 2013.