Revision as of 22:37, 21 May 2014

Questions and Comments for: Derivation of Bayes Rule

A slecture by Anonymous7


Please leave me a comment below if you have any questions, if you notice any errors, or if you would like to discuss a topic further.


Questions and Comments

  • MH review: This slecture covers the derivation of Bayes' theorem for random variables, which is then illustrated using an example. I think the slecture is useful if the reader would like to brush up on Bayes' theorem and conditional distributions, as it is concise and to the point, but a reader interested in a more rigorous derivation based on the foundations of probability theory should probably look elsewhere.
  • MH comment: In the "Bayes Rule Statement" section, I take it that you are defining Bayes' theorem for discrete random variables, but I think it's good practice to explicitly state what your random variables are; for example, you could say "for discrete random variables X and Y..." I also think it is good practice to formally define your function P. Generally we say X has probability mass function $ P_X(x) $. The subscript is important because it differentiates it from the mass function of Y, $ P_Y(y) $. The way you have presented it, it appears that $ x $ and $ y $ are two different arguments to the same function, which may not be the case in general.
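To illustrate the subscript convention suggested above, here is a sketch of how the discrete statement might read with fully subscripted pmfs (generic X and Y, not necessarily the author's exact notation):

```latex
% Bayes' theorem for discrete random variables X and Y, with each
% pmf carrying a subscript naming the random variable it belongs to
P_{X \mid Y}(x \mid y) = \frac{P_{Y \mid X}(y \mid x)\, P_X(x)}{P_Y(y)},
\qquad
P_Y(y) = \sum_{x'} P_{Y \mid X}(y \mid x')\, P_X(x')
```

Written this way, there is no ambiguity about which function each argument belongs to.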
  • MH comment: In subsequent sections, the equations might be clearer if you insert some white space between the end of an equation and the equation number. Consider using the \quad command in math mode.
  • MH comment: It seems that you have forgotten to place $ P(x \cap y) $ in math mode when you say, "Now because the intersection is commutative, we can write the P(x\cap y) as." Also, in the heading for the next section, did you mean to say continuous rather than continues? This typo appears again in the sentence preceding equation 11.
  • MH comment: Regarding equation 6, you say that "we know this," and we often take equation 6 for granted, but where this assumption comes from is not so trivial. Formally, Bayes' theorem is defined using the probability of events. But in the continuous case, $ f_X(x) $ is not really the probability that $ X = x $, which is zero. I think it's important to be aware of these nuances if you are doing a formal derivation. There is a lot of good information on this in Professor Mary Comer's ECE 600 notes, and I am leaving the link here in case you are interested.
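A sketch of the nuance raised here: in the continuous case the density only assigns probability to intervals, so $ f_X(x) $ enters Bayes' theorem through a limiting argument rather than as a point probability:

```latex
% The density gives probabilities over intervals, not at points:
P(x \le X \le x + \Delta) = \int_x^{x+\Delta} f_X(t)\,dt
\approx f_X(x)\,\Delta \quad (\text{small } \Delta)
% Letting \Delta \to 0 gives P(X = x) = 0, which is why the density form
% of Bayes' theorem must be justified by taking ratios of such interval
% probabilities and passing to the limit, not by substituting densities
% directly into the event form of the theorem.
```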
  • MH comment: There is a small typo where you say, "but because $ f_{Y,X}(y,x) $ is the ... as follows." It seems there is a space missing where you say "$ f_{X,Y}(y,x) $,i.e." And after equation 8, I suppose you meant to say by instead of be in "Now, be arranging..." Also, in the example, when you say "Let suppose," did you mean "Let's" or "Let us"?
  • MH comment: What does the first box in figure 1 represent? Is it the entire event space? What is the significance of the observation that P(A|B) > P(A)? Although the example you have provided illustrates Bayes' theorem nicely, since you start out in the statement section by defining the theorem for random variables, I was expecting an example related to random variables: continuous, discrete, or a mixture of both. The example might be more relevant if you explicitly define events A and B as Bernoulli random variables.
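As a sketch of the suggestion above, here is what the example might look like with A and B recast as Bernoulli random variables; the joint pmf values below are made up purely for illustration:

```python
# Hypothetical numbers: the events A and B recast as Bernoulli random
# variables taking values in {0, 1}. p[(a, b)] = P(X_A = a, X_B = b).
p = {(0, 0): 0.30, (0, 1): 0.10,
     (1, 0): 0.20, (1, 1): 0.40}

def marginal_A(a):
    # P(X_A = a), summing the joint pmf over the values of X_B
    return sum(p[(a, b)] for b in (0, 1))

def marginal_B(b):
    # P(X_B = b), summing the joint pmf over the values of X_A
    return sum(p[(a, b)] for a in (0, 1))

# Bayes' theorem: P(A=1 | B=1) = P(B=1 | A=1) P(A=1) / P(B=1)
p_B_given_A = p[(1, 1)] / marginal_A(1)
p_A_given_B_bayes = p_B_given_A * marginal_A(1) / marginal_B(1)

# Direct computation from the joint pmf, for comparison
p_A_given_B_direct = p[(1, 1)] / marginal_B(1)

print(round(p_A_given_B_bayes, 6))   # 0.8
print(round(marginal_A(1), 6))       # 0.6
```

For these particular numbers P(A|B) > P(A), but that ordering depends entirely on the joint pmf chosen; it is not guaranteed in general, which is the point of the question above.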
  • MH comment: Although in general people use Bayes' theorem and Bayes' rule interchangeably, Professor Boutin refers to Bayes' decision rule (for classifiers) as Bayes' rule, and Bayes' theorem is the equation for reversing priors. Although calling it Bayes' rule is not wrong, in the context of this class (CS/ECE 662), it is not the terminology we are using. Finally, I wanted to point out that the term is formally written as Bayes' rather than Bayes or Baye's.




  • Reviewed by Yanzhe Cui: This slecture is about the concept of Bayes' rule. The author introduced the derivation of Bayes' rule in the discrete and continuous cases, then gave an example of a student attending a seminar and used it to show the advantage of applying Bayes' rule. The comments are: (1) please add some spaces between each equation and its equation number; they were confusing when I first looked at them; (2) what is the black box in the decision tree figure (the left-most one)? If it is hard to show the conditional probability in tree format, you could choose other ways, such as plain text; (3) what do you want to show in the Venn diagram? Please label the two circles A and B; (4) the author claims that
    we can see now that $ \textbf{P}(A|B) > \textbf{P}(A) $,
    but what is the point? Do you mean that $ \textbf{P}(A|B) > \textbf{P}(A) $ holds in every case? It would be better to explain this thoroughly; (5) if the author could also provide a continuous example, in my opinion, it would be better and more complete. Overall, the author did some research about Bayes' rule and tried their best to make it easy to follow.
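Along the lines of comment (5), here is a minimal sketch of what a continuous example could look like. The prior and likelihood are made up for illustration (uniform prior, Gaussian likelihood), and a simple Riemann sum stands in for the normalizing integral:

```python
import math

# Made-up continuous example: prior X ~ Uniform(0, 1), and given X = x,
# the observation Y ~ Normal(x, sigma^2). Bayes' theorem for densities:
#   f_{X|Y}(x | y) = f_{Y|X}(y | x) f_X(x) / f_Y(y)
sigma = 0.1
y_obs = 0.3

def prior(x):
    # f_X(x): uniform density on [0, 1]
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def likelihood(y, x):
    # f_{Y|X}(y | x): Gaussian density with mean x
    return math.exp(-(y - x) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# f_Y(y) via a Riemann sum over the prior's support
n = 10000
dx = 1.0 / n
xs = [(i + 0.5) * dx for i in range(n)]
f_y = sum(likelihood(y_obs, x) * prior(x) * dx for x in xs)

def posterior(x):
    # f_{X|Y}(x | y_obs) by Bayes' theorem
    return likelihood(y_obs, x) * prior(x) / f_y

# The posterior integrates to 1 and is concentrated near the observation
total = sum(posterior(x) * dx for x in xs)
print(round(total, 6))   # 1.0
```

The posterior here peaks near $ x = 0.3 $, showing how the observation reweights the flat prior; the numbers are illustrative only, not taken from the slecture.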
  • Additional Questions / Comments



Back to Derivation of Bayes Rule
