You can get/put ideas for what should be on the cheat sheet here. DO NOT SIGN YOUR NAME
Sample Space, Axioms of probability (finite spaces, infinite spaces)
$ P(A) \geq 0 $ for all events A
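The remaining two axioms, in their standard form:

$ P(\Omega) = 1 \! $

If $ A_1, A_2, \ldots $ are disjoint, then $ P(A_1 \cup A_2 \cup \cdots) = P(A_1) + P(A_2) + \cdots $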
Properties of Probability laws
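For example:

$ P(A^c) = 1 - P(A) \! $

$ P(A \cup B) = P(A) + P(B) - P(A \cap B) $

If $ A \subset B $, then $ P(A) \leq P(B) $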
Definition of conditional probability, and properties thereof
$ P(A|B) = \frac{P(A \cap B)}{P(B)} $
Properties:
1) $ P(A|B) \ge 0 $
2) $ P( \Omega |B) = 1\! $
3) If $ A_1 $ and $ A_2 $ are disjoint, $ P(A_1 \cup A_2|B) = P(A_1|B) + P(A_2|B) $
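The multiplication rule follows directly from the definition:

$ P(A \cap B) = P(B)P(A|B) $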
Bayes rule and total probability
$ P(A|B) = \frac{P(B|A)P(A)}{P(B)} $

Total probability: if $ A_1, \ldots, A_n $ partition the sample space,

$ P(B) = \sum_{i=1}^{n} P(B|A_i)P(A_i) $
Definitions of Independence and Conditional independence
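$ A $ and $ B $ are independent if $ P(A \cap B) = P(A)P(B) $

$ A $ and $ B $ are conditionally independent given $ C $ if $ P(A \cap B|C) = P(A|C)P(B|C) $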
Definition and basic concepts of random variables, PMFs
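The PMF of a discrete random variable $ X $ is $ p_X(x) = P(X = x) $, with $ \sum_x p_X(x) = 1 $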
The common random variables: Bernoulli, binomial, geometric, and how they come about in problems. Also their PMFs.
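Bernoulli R.V.

$ P(X=1) = p, \quad P(X=0) = 1-p $

$ E[X] = p, \quad Var(X) = p(1-p) $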
Geometric R.V.

$ P(X=k) = (1-p)^{k-1} p $ for $ k \geq 1 $
$ E[X] = 1/p \! $
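$ Var(X) = \frac{1-p}{p^2} \! $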
Binomial R.V.
$ P(X=k) = \binom{n}{k} p^k (1-p)^{n-k} $ for $ k = 0, 1, 2, \ldots, n $
$ E[X] = np, \quad Var(X) = np(1-p) $
Definition of expectation and variance and their properties
$ Var(X) = E[X^2] - (E[X])^2 \! $
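For a discrete random variable:

$ E[X] = \sum_x x \, p_X(x) $

$ E[aX+b] = aE[X] + b, \quad Var(aX+b) = a^2 Var(X) $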
Joint PMFs of more than one random variable
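$ p_{X,Y}(x,y) = P(X=x, Y=y) $, with marginals $ p_X(x) = \sum_y p_{X,Y}(x,y) $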