Random Variables and Signals
Topic 6: Random Variables: Distributions
How do we find, compute, and model $ P(X\in A) $ for a random variable X, for all A ∈ B(R)? We use three different functions:
- the cumulative distribution function (cdf)
- the probability density function (pdf)
- the probability mass function (pmf)
We will discuss them in this order, although we could approach the topic in a different order and arrive at the same place.
Definition $ \quad $ The cumulative distribution function (cdf) of X is defined as
$ F_X(x) \equiv P_X((-\infty,x])\;\;\forall x\in\mathbb R $
Notation $ \quad $ Normally, we write this as
$ F_X(x) = P(X\leq x) $
So $ F_X(x) $ tells us $ P_X(A) $ if A = (-∞,x] for some real x.
What about other A ∈ B(R)? It can be shown that any A ∈ B(R) can be written as a countable sequence of set operations (unions, intersections, complements) on intervals of the form $ (-\infty,x_n] $, so we can use the probability axioms to find $ P_X(A) $ from $ F_X $ for any A ∈ B(R). This is not normally how we do things in practice; it will be discussed more later.
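As a quick numerical illustration of the definition (a minimal sketch in Python using NumPy and SciPy; these tools and the standard normal example are assumptions of the sketch, not part of the notes), $ F_X(x)=P(X\leq x) $ can be estimated as the fraction of sampled outcomes with $ X(\omega)\leq x $ and compared against a known closed-form cdf:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)   # X ~ N(0,1), an illustrative choice

x = 0.5
F_hat = np.mean(samples <= x)            # empirical estimate of P(X <= x)
F_true = stats.norm.cdf(x)               # closed-form cdf for comparison
print(F_hat, F_true)                     # both close to 0.6915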
Can an arbitrary function $ F_X $ be a valid cdf? No, it cannot.
Properties of a valid cdf:
1.
$ F_X(-\infty) = 0 \;\;\mbox{ and }\;\; F_X(\infty) = 1 $
This is because
$ F_X(-\infty) = P(X\in\emptyset) = 0 $
and
$ F_X(\infty) = P(X\in\mathbb R) = 1 $
2. For any $ x_1,x_2 $ ∈ R such that $ x_1<x_2 $,
$ F_X(x_1)\leq F_X(x_2) $
i.e. $ F_X(x) $ is a nondecreasing function.
3. $ F_X $ is continuous from the right, i.e.
$ F_X(x^+) \equiv \lim_{\epsilon\rightarrow 0,\ \epsilon>0} F_X(x+\epsilon) = F_X(x)\;\;\forall x\in\mathbb R $
Proof:
First, we need some results from analysis and measure theory:
(i) For a sequence of sets $ A_1, A_2,... $, if $ A_1 $ ⊃ $ A_2 $ ⊃ ..., then
$ \lim_{n\rightarrow\infty}A_n = \bigcap_{n=1}^{\infty}A_n $
(ii) If $ A_1 $ ⊃ $ A_2 $ ⊃ ..., then
$ P\left(\lim_{n\rightarrow\infty}A_n\right) = \lim_{n\rightarrow\infty}P(A_n) $
(iii) We can write $ F_X(x^+) $ as
$ F_X(x^+) = \lim_{n\rightarrow\infty}F_X\left(x+\frac{1}{n}\right) $
Now let
$ A_n = \{X\leq x+\tfrac{1}{n}\},\;\; n=1,2,... $
so that $ A_1 $ ⊃ $ A_2 $ ⊃ ... and $ \bigcap_{n=1}^{\infty}A_n = \{X\leq x\} $.
Then
$ \begin{align}F_X(x^+) &= \lim_{n\rightarrow\infty}F_X\left(x+\frac{1}{n}\right)\\ &= \lim_{n\rightarrow\infty}P(A_n)\\ &= P\left(\lim_{n\rightarrow\infty}A_n\right)\\ &= P\left(\bigcap_{n=1}^{\infty}A_n\right)\\ &= P(X\leq x)\\ &=F_X(x)\end{align} $
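Right-continuity is easy to observe numerically at a jump point of a cdf. The following is a minimal sketch (assuming Python with SciPy; the Bernoulli example is an illustrative assumption): $ F_X(0+\tfrac{1}{n}) $ stays at $ F_X(0) $ as n grows, while $ F_X(0-\tfrac{1}{n}) $ does not approach $ F_X(0) $.

from scipy import stats

p = 0.3
X = stats.bernoulli(p)      # cdf jumps at x = 0 by P(X = 0) = 1 - p = 0.7

for n in (10, 100, 1000, 10000):
    # right-hand value F_X(0 + 1/n) vs. left-hand value F_X(0 - 1/n)
    print(n, X.cdf(1.0 / n), X.cdf(-1.0 / n))
# The middle column stays at 0.7 = F_X(0) (right-continuity),
# while the last column stays at 0.0, the left limit F_X(0^-).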
4. $ P(X>x) = 1-F_X(x) $ for all x ∈ R
5. If $ x_1 < x_2 $, then
$ P(x_1<X\leq x_2) = F_X(x_2) - F_X(x_1)\;\;\forall x_1,x_2\in\mathbb R $
6. $ P(\{X=x\})= F_X(x) - F_X(x^-) $, where
$ F_X(x^-) = \lim_{\epsilon\rightarrow 0,\ \epsilon>0} F_X(x-\epsilon) $
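Properties 4–6 can be checked numerically as well. Here is a short sketch (assuming Python with SciPy; the binomial distribution and the chosen points are illustrative assumptions), approximating the left limit $ F_X(x^-) $ by evaluating the cdf just below x:

from scipy import stats

X = stats.binom(n=10, p=0.4)    # an illustrative discrete random variable
x, x1, x2 = 4, 2, 6
eps = 1e-9                      # to approximate the left limit F_X(x^-)

# Property 4: P(X > x) = 1 - F_X(x)
print(X.sf(x), 1 - X.cdf(x))

# Property 5: P(x1 < X <= x2) = F_X(x2) - F_X(x1)
print(sum(X.pmf(k) for k in range(x1 + 1, x2 + 1)), X.cdf(x2) - X.cdf(x1))

# Property 6: P(X = x) = F_X(x) - F_X(x^-)
print(X.pmf(x), X.cdf(x) - X.cdf(x - eps))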
The Probability Density Function
Definition $ \quad $ The probability density function (pdf) of a random variable X is the derivative of the cdf of X,
$ f_X(x) = \frac{dF_X(x)}{dx} $
at points where $ F_X $ is differentiable.
From the Fundamental Theorem of Calculus, we then have that
$ F_X(x)=\int_{-\infty}^xf_X(r)\,dr\;\;\forall x\in\mathbb R $
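A minimal numerical sketch of this relationship (assuming Python with SciPy; the standard normal is only an illustrative choice): integrating the pdf up to x recovers the cdf at x.

import numpy as np
from scipy import stats
from scipy.integrate import quad

x = 1.2
# F_X(x) recovered by integrating the pdf from -infinity up to x
F_from_pdf, _ = quad(stats.norm.pdf, -np.inf, x)
print(F_from_pdf, stats.norm.cdf(x))   # both approximately 0.8849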
Important note: the cdf $ F_X $ might not be differentiable everywhere. At points where $ F_X $ is not differentiable, we can use the Dirac delta function to define $ f_X $.
Definition $ \quad $ The Dirac Delta Function $ \delta(x) $ is the function satisfying the properties:
1.
$ \delta(x) = 0 \;\;\forall x\neq0 $
2.
$ \int_{-\infty}^{\infty} \delta(x)\,dx = \int_{-\epsilon}^{\epsilon}\delta(x)\,dx = 1\;\;\forall\epsilon>0 $
If $ F_X $ is not differentiable at a point, use $ \delta(x) $ at that point to represent $ f_X $.
Why do we do this? Consider the step function $ u(x) $, which is discontinuous and thus not differentiable at $ x=0 $. This is a common type of discontinuity we see in cdfs. The derivative of $ u(x) $ is defined as
$ \frac{du(x)}{dx}=\lim_{h\rightarrow 0}\frac{u(x+h)-u(x)}{h} $
This limit does not exist at $ x=0 $.
Let's look at the function
$ g(x) = \frac{u(x+h)-u(x)}{h} $
For h > 0, it is a rectangular pulse of height 1/h and width h (Fig 1: g(x) for h>0).
For any x ≠ 0, we have that
$ \frac{u(x+h)-u(x)}{h}=0 $
for small enough h.
Also, ∀ $ \epsilon $>0,
$ \int_{-\epsilon}^{\epsilon}g(x)\,dx= 1\;\;\forall\, 0<h<\epsilon $
So, in the limit, the function g(x) has the properties of the $ \delta $-function as h tends to 0. A similar argument can be made for h<0.
So this is why it is sometimes written that
$ \frac{du(x)}{dx} = \delta(x) $
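A numerical illustration of this argument (a sketch assuming Python with NumPy; not part of the original notes): for shrinking h > 0, the pulse $ g(x) $ always integrates to 1, and it "sifts out" the value of a smooth test function at 0, just as $ \delta $ does.

import numpy as np

def u(x):
    # unit step function: 1 for x >= 0, 0 otherwise
    return (x >= 0).astype(float)

def g(x, h):
    # difference quotient of the step: a pulse of height 1/h and width h
    return (u(x + h) - u(x)) / h

x = np.linspace(-1.0, 1.0, 200_001)    # grid spacing 1e-5, fine enough for small h
dx = x[1] - x[0]
phi = np.cos(x)                        # smooth test function with phi(0) = 1

for h in (0.1, 0.01, 0.001):
    area = np.sum(g(x, h)) * dx        # Riemann-sum integral of g: ~1 for every h
    sift = np.sum(g(x, h) * phi) * dx  # approaches phi(0) = 1 as h -> 0
    print(h, area, sift)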
Since the only non-differentiable cdfs we will work with have step discontinuities, we write
$ f_X(x) = \frac{dF_X(x)}{dx} $
with the understanding that $ d/dx $ is not necessarily the traditional definition of the derivative.
Properties of the pdf:
1. (proof)
$ f_X(x)\geq 0\;\;\forall x\in \mathbb R $
2. (proof)
$ \int_{-\infty}^{\infty}f_X(x)\,dx = 1 $
3. (proof) if $ x_1<x_2 $, then
$ P(x_1<X\leq x_2) = \int_{x_1}^{x_2}f_X(x)\,dx $
Some notes:
- We introduced the concept of a pdf in our discussion of probability spaces. We could have defined the pdf of a random variable X as a function $ f_X $ satisfying properties 1 and 2 above, and then defined $ F_X $ in terms of $ f_X $.
- $ f_X(x) $ is not a probability for a fixed x; instead, it gives the "probability density," so it must be integrated to obtain a probability.
- In practice, to compute probabilities of a random variable X, we normally use
$ P(X\in A) = \int_{A}f_X(x)\,dx $
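For instance (a minimal sketch assuming Python with SciPy; the exponential distribution and the particular set A are illustrative assumptions), when A is a union of disjoint intervals, the integral over A is just the sum of the integrals over each piece, and it agrees with the cdf differences from property 5:

from scipy import stats
from scipy.integrate import quad

X = stats.expon()                      # f_X(x) = e^{-x} for x >= 0 (illustrative)
A = [(0.5, 1.0), (2.0, 3.0)]           # A = (0.5,1] U (2,3], a union of intervals

p_from_pdf = sum(quad(X.pdf, a, b)[0] for a, b in A)
p_from_cdf = sum(X.cdf(b) - X.cdf(a) for a, b in A)   # property 5 on each piece
print(p_from_pdf, p_from_cdf)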
References
- M. Comer. ECE 600. Class Lecture. Random Variables and Signals. Faculty of Electrical Engineering, Purdue University. Fall 2013.