ECE302 Cheat Sheet number 2
Cumulative Distribution Function (CDF)
- $ F_X(x) = P[X \leq x] = \int_{-\infty}^{x} f_X(t)\,dt $
- $ 1 - F_X(x) = P[X > x]\! $
$ \lim_{x\rightarrow-\infty}F_X(x) = 0 $
$ \lim_{x\rightarrow\infty}F_X(x) = 1 $
If X is discrete: $ P_X(k) = P(X \leq k) - P(X \leq k-1) = F_X(k) - F_X(k-1) $
Converting from CDF to PDF: $ f_X(x) = \frac{d}{dx}F_X(x) $, i.e., take the derivative of the CDF to get the PDF.
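As a quick check of this (an illustrative CDF, not from the original notes): if $ F_X(x) = x^2 $ for $ 0 \leq x \leq 1 $ (0 below, 1 above), then

$ f_X(x) = \frac{d}{dx}x^2 = 2x $ for $ 0 \leq x \leq 1 $

which is nonnegative and integrates to 1.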
Exponential RV
- Models the time until the first occurrence of a very rare event, when trials happen very fast.
PDF: $ f_X(x) = \lambda e^{-\lambda x} $ for $ x \geq 0 $; $ f_X(x) = 0 $ otherwise

CDF: $ F_X(x) = 1 - e^{-\lambda x} $ for $ x \geq 0 $; $ F_X(x) = 0 $ otherwise
- $ E[X] = 1/\lambda $, $ Var(X) = 1/\lambda^2 $
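For example (arbitrary illustrative value of the parameter): with $ \lambda = 2 $,

$ P(X > 1) = 1 - F_X(1) = e^{-2} \approx 0.135 $, $ E[X] = 0.5 $, $ Var(X) = 0.25 $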
Gaussian RV
- The sum of many small, independent things
- Parameters:
$ E[X]=\mu\! $
$ Var[X]=\sigma^2\! $
$ f_X(x)=\frac{1}{\sqrt{2\pi\sigma^2}}e^{\frac{-(x-\mu)^2}{2\sigma^2}} $
For 2 independent Gaussians, the sum is again Gaussian:
$ Z=X+Y\! $
$ E[Z]=\mu_X +\mu_Y\! $
$ Var(Z)=\sigma^2_X+\sigma^2_Y $
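As an illustration (numbers chosen arbitrarily): if X has mean 1 and variance 4, Y has mean 2 and variance 9, and the two are independent, then Z = X + Y is Gaussian with mean 3 and variance 13.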
PDF Properties
- $ f_X(x)\geq 0 $ for all x
- $ \int\limits_{-\infty}^{\infty}f_X(x)dx = 1 $
- If $ \delta $ is very small, then
$ P(X \in [x,x+\delta]) \approx f_X(x)\cdot\delta $
- For any subset B of the real line,
$ P(X\in B) = \int\limits_Bf_X(x)dx $
- For a continuous random variable:
$ P(X > x) = \int\limits_{x}^{\infty}f_X(t)\,dt $
$ P(X \leq x) = \int\limits_{-\infty}^{x}f_X(t)\,dt = F_X(x) $
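A small worked case (an illustrative density, not from the original notes): if X is uniform on $ [0,2] $, so $ f_X(x) = 1/2 $ there, then

$ P(X \in [0.5,1]) = \int_{0.5}^{1}\tfrac{1}{2}\,dx = 0.25 $ and $ P(X > 1.5) = \int_{1.5}^{2}\tfrac{1}{2}\,dx = 0.25 $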
Theorem of Total Probability for Continuous Random Variables
- $ f_Y(y) = f_{Y|A}(y)P(A) + f_{Y|B}(y)P(B)\, $
- $ f_X(x) = \int^\infty_{-\infty}f_{XY}(x,y)dy = \int^\infty_{-\infty}f_{X|Y}(x|y)f_Y(y)dy \, $
- $ f_Y(y) = f_{Y|A_1}(y)P(A_1) + f_{Y|A_2}(y)P(A_2) + \cdots + f_{Y|A_n}(y)P(A_n)\, $ if $ A_1, A_2, \ldots, A_n $ are disjoint and together cover the sample space
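For example (a hypothetical two-event mixture): if $ P(A) = 0.3 $, $ P(B) = 0.7 $, $ f_{Y|A}(y) = e^{-y} $, and $ f_{Y|B}(y) = 2e^{-2y} $ for $ y \geq 0 $, then

$ f_Y(y) = 0.3e^{-y} + 1.4e^{-2y} $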
Conditioning a Random Variable on an Event
$ P_{X|A}(x) = P(X=x|A) = \frac{P(\{X=x\}\cap A)}{P(A)} $

The events $ \{X=x\}\cap A $ are disjoint for different values of x and their union is A; therefore,

$ P(A) = \sum_x P(\{X=x\}\cap A) $

$ \sum_x P_{X|A}(x) = 1 $
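A quick example (a standard fair-die setup, added for illustration): let X be a fair die roll and A = {X is even}, so $ P(A) = 1/2 $. Then

$ P_{X|A}(2) = \frac{1/6}{1/2} = \frac{1}{3} $

and the same holds for 4 and 6, so the conditional PMF sums to 1.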
Conditioning a Random Variable on Another Random Variable
$ f_{X|Y}(x|y)=\dfrac {f_{XY}(x,y)}{f_{Y}(y)} $
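As a sketch of how this is used (an illustrative joint density, not from the original notes): if $ f_{XY}(x,y) = x + y $ on the unit square $ 0 \leq x, y \leq 1 $, then $ f_Y(y) = \int_0^1 (x+y)\,dx = y + \tfrac{1}{2} $, so

$ f_{X|Y}(x|y) = \dfrac{x+y}{y+\tfrac{1}{2}} $ for $ 0 \leq x \leq 1 $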
Shifting and Scaling of Random Variables
Let $ Y=aX+b \, $
$ f_Y(y)=\dfrac{1}{|a|}f_X\!\left(\dfrac{y-b}{a}\right) $
- $ E[Y] = aE[X]+b \, $
- $ Var(Y) = a^2 Var(X) \, $
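For example (illustrative constants): if $ X \sim N(0,1) $ and $ Y = 2X + 3 $, then

$ f_Y(y) = \tfrac{1}{2} f_X\!\left(\tfrac{y-3}{2}\right) $, $ E[Y] = 3 $, $ Var(Y) = 4 $

i.e. Y is Gaussian with mean 3 and variance 4.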
Addition of Continuous Random Variables
If X and Y are continuous and independent random variables and Z = X + Y then
- $ f_Z(z) = \int^\infty_{-\infty} f_X(x)f_Y(z-x) dx $
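A worked case (both summands taken to be exponential, purely as an illustration): if X and Y are independent Exp($ \lambda $), then for $ z \geq 0 $

$ f_Z(z) = \int_0^z \lambda e^{-\lambda x}\,\lambda e^{-\lambda (z-x)}\,dx = \lambda^2 z e^{-\lambda z} $

which is the Erlang-2 density.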
Addition of Discrete Random Variables
If X and Y are discrete and independent random variables and Z = X + Y then
$ P_Z(z) = \sum_{x} P_X(x)P_Y(z-x) $
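For example (two fair dice, added for illustration): if X and Y are independent fair six-sided dice and Z = X + Y, then

$ P_Z(7) = \sum_{x=1}^{6} P_X(x)P_Y(7-x) = 6 \cdot \tfrac{1}{36} = \tfrac{1}{6} $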
Continuous Bayes' Rule
$ f_{X|Y}(x|y) = \dfrac{f_X(x)\,f_{Y|X}(y|x)}{f_Y(y)} $
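A short worked sketch (prior and likelihood chosen arbitrarily for illustration): take prior $ f_X(x) = e^{-x} $ for $ x \geq 0 $ and likelihood $ f_{Y|X}(y|x) = x e^{-xy} $ for $ y \geq 0 $. Then $ f_Y(y) = \int_0^\infty x e^{-x(1+y)}\,dx = \frac{1}{(1+y)^2} $, so

$ f_{X|Y}(x|y) = (1+y)^2\, x\, e^{-x(1+y)} $ for $ x \geq 0 $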
Other Useful Things
If X and Y are independent of each other, then
- $ E[XY] = E[X]E[Y]\! $
- $ E[X] = \int^\infty_{-\infty}x\,f_X(x)\,dx $
- $ Var(X) = E[X^2] - (E[X])^2\! $
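For instance (the standard uniform, included for illustration): if X is uniform on $ [0,1] $, then $ E[X] = \tfrac{1}{2} $, $ E[X^2] = \int_0^1 x^2\,dx = \tfrac{1}{3} $, so

$ Var(X) = \tfrac{1}{3} - \tfrac{1}{4} = \tfrac{1}{12} $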
Marginal Probability Density Functions:
- $ f_X(x) = \int^\infty_{-\infty} f_{XY}(x,y) dy $
- $ f_Y(y) = \int^\infty_{-\infty} f_{XY}(x,y) dx $
- $ E[g(X)] = \int^\infty_{-\infty} g(x)\,f_X(x)\,dx $
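A minimal Monte Carlo sketch in Python (numpy assumed to be available; the distributions, sample size, and seed are arbitrary choices and not part of the original cheat sheet) that numerically checks $ E[XY] = E[X]E[Y] $ for independent X and Y, $ Var(X) = E[X^2] - (E[X])^2 $, and the exponential mean/variance formulas above:

    import numpy as np

    # Monte Carlo sanity checks for a few identities on this cheat sheet.
    # Distributions and sample size are arbitrary illustrative choices.
    rng = np.random.default_rng(0)
    n = 1_000_000

    lam = 2.0
    x = rng.exponential(scale=1 / lam, size=n)  # X ~ Exp(lambda), so E[X] = 1/lam
    y = rng.normal(loc=3.0, scale=2.0, size=n)  # Y ~ N(3, 4), independent of X

    # E[XY] = E[X]E[Y] for independent X, Y (both should be ~ 1.5)
    print(np.mean(x * y), np.mean(x) * np.mean(y))

    # Var(X) = E[X^2] - (E[X])^2 (both should be ~ 0.25)
    print(np.mean(x ** 2) - np.mean(x) ** 2, np.var(x))

    # Exponential RV: E[X] = 1/lambda, Var(X) = 1/lambda^2
    print(np.mean(x), 1 / lam, np.var(x), 1 / lam ** 2)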