Revision as of 09:46, 26 September 2008

You can get/put ideas for what should be on the cheat sheet here. DO NOT SIGN YOUR NAME

Sample Space, Axioms of probability (finite spaces, infinite spaces)

$ P(A) \geq 0 $ for all events A

$ P(\Omega) = 1 \! $

$ P(A_1 \cup A_2 \cup \cdots) = P(A_1) + P(A_2) + \cdots $ for disjoint events $ A_1, A_2, \ldots $

Properties of Probability laws


Definition of conditional probability, and properties thereof

$ P(A|B) = \frac{P(A \cap B)}{P(B)} $

Properties:

1) $ P(A|B) \ge 0 $

2) $ P( \Omega |B) = 1\! $

3) if $ A_1 $ and $ A_2 $ are disjoint, $ P(A_1 \cup A_2|B) = P(A_1|B) + P(A_2|B) $
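The definition of conditional probability can be checked by brute-force enumeration. A minimal Python sketch on the two-fair-dice sample space; the events A and B here are illustrative choices, not from the notes:

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered outcomes of two fair dice, each equally likely.
omega = list(product(range(1, 7), repeat=2))
P = Fraction(1, len(omega))  # probability of each single outcome

# Illustrative events: A = "sum is 8", B = "first die is even".
A = {w for w in omega if w[0] + w[1] == 8}
B = {w for w in omega if w[0] % 2 == 0}

# P(A|B) = P(A ∩ B) / P(B)
P_B = len(B) * P                # 1/2
P_A_and_B = len(A & B) * P      # 1/12  (outcomes (2,6), (4,4), (6,2))
P_A_given_B = P_A_and_B / P_B   # 1/6
```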

Bayes rule and total probability

$ P(A|B) = \frac{P(B|A)P(A)}{P(B)} $

Total probability: if $ A_1, \ldots, A_n $ partition the sample space, then $ P(B) = \sum_{i=1}^{n} P(B|A_i)P(A_i) $
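Bayes' rule and total probability can be verified exactly on a small example. The two-coin setup below (a fair coin and a biased coin, each picked with probability 1/2) is a made-up illustration, not from the notes:

```python
from fractions import Fraction

# Priors P(A_i): which coin was picked.
p_A = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}
# Likelihoods P(B|A_i), where B = "the flip comes up heads".
p_B_given_A = {"fair": Fraction(1, 2), "biased": Fraction(3, 4)}

# Total probability: P(B) = sum_i P(B|A_i) P(A_i)
p_B = sum(p_B_given_A[a] * p_A[a] for a in p_A)  # 5/8

# Bayes' rule: P(A_i|B) = P(B|A_i) P(A_i) / P(B)
posterior = {a: p_B_given_A[a] * p_A[a] / p_B for a in p_A}
# posterior["biased"] = 3/5, posterior["fair"] = 2/5
```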

Definitions of Independence and Conditional independence

A and B are independent if $ P(A \cap B) = P(A)P(B) \! $

A and B are conditionally independent given C if $ P(A \cap B|C) = P(A|C)P(B|C) \! $


Definition and basic concepts of random variables, PMFs


The common random variables: Bernoulli, binomial, geometric, and how they come about in problems. Also their PMFs.

Geometric R.V.

$ P(X=k) = (1-p)^{k-1}p $ for $ k \ge 1 $

$ E[X] = 1/p \! $

$ Var(X) = (1-p)/p^2 \! $
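A quick numerical sanity check of the geometric PMF and its mean. The value p = 0.3 is an arbitrary illustrative choice, and the infinite sum is truncated at k = 2000, where the remaining tail is negligible:

```python
# Geometric PMF: P(X = k) = (1-p)^(k-1) * p for k >= 1.
p = 0.3
ks = range(1, 2001)  # truncation of the infinite support
pmf = {k: (1 - p) ** (k - 1) * p for k in ks}

total = sum(pmf.values())                     # ≈ 1
mean = sum(k * pk for k, pk in pmf.items())   # ≈ 1/p
```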


Binomial R.V.

$ P(X=k) = \binom{n}{k} p^k (1-p)^{n-k} $ for $ k = 0, 1, \ldots, n $

$ E[X] = np \! $ and $ Var(X) = np(1-p) \! $ (the formula $ (1-p)/p^2 $ is the variance of the geometric R.V., not the binomial)
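The binomial PMF, mean, and variance can be checked by direct summation over the finite support. The values n = 10, p = 0.4 are illustrative:

```python
from math import comb

# Binomial PMF: P(X = k) = C(n, k) p^k (1-p)^(n-k), k = 0..n.
n, p = 10, 0.4
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

mean = sum(k * pk for k, pk in pmf.items())             # np = 4.0
second_moment = sum(k**2 * pk for k, pk in pmf.items())
var = second_moment - mean**2                           # np(1-p) = 2.4
```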


Definition of expectation and variance and their properties

$ Var(X) = E[X^2] - (E[X])^2 \! $
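The identity above can be worked exactly on a simple PMF; a fair six-sided die is used here as an illustrative example:

```python
from fractions import Fraction

# Fair die: uniform PMF on {1, ..., 6}.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

E_X = sum(x * px for x, px in pmf.items())      # 7/2
E_X2 = sum(x**2 * px for x, px in pmf.items())  # 91/6

# Var(X) = E[X^2] - (E[X])^2 = 91/6 - 49/4 = 35/12
var = E_X2 - E_X**2
```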


Joint PMFs of more than one random variable

$ p_X(x) = \sum_y p_{XY}(x,y) \! $

$ p_Y(y) = \sum_x p_{XY}(x,y) \! $
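Marginalizing a joint PMF is just summing out the other variable. A minimal sketch on a small made-up joint PMF table (the numbers are illustrative assumptions):

```python
from fractions import Fraction

# Hypothetical joint PMF p_XY(x, y) on {0,1} x {0,1}; entries sum to 1.
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4),
}

xs = {x for x, _ in joint}
ys = {y for _, y in joint}

# p_X(x) = sum over y of p_XY(x, y); p_Y(y) = sum over x of p_XY(x, y)
p_X = {x: sum(joint[(x, y)] for y in ys) for x in xs}
p_Y = {y: sum(joint[(x, y)] for x in xs) for y in ys}
```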
