Revision as of 07:25, 31 August 2008
It's really tough to choose one out of so many theorems. However, Bayes' theorem, which I learned in my probability class, is one of those that dazzles me. I especially like its alternative form:
$ P(F|E) = \frac{P(E|F)\, P(F)}{P(E|F)\, P(F) + P(E|F^C)\, P(F^C)} $
Here, E and F are events from a sample space S with P(E) ≠ 0 and P(F) ≠ 0. P(F|E) is the conditional probability of F given E; P(E) and P(F) are the marginal probabilities of E and F, respectively; and F^C is the complement of F.
This theorem helped me a lot in programming competitions like TopCoder, and I once solved a problem from past Amazon interviews by applying it. See http://en.wikipedia.org/wiki/Bayes%27_theorem for more details.