Instructions
Homework 10 can be downloaded here on the ECE 302 course website.
Problem 1: Random Point, Revisited
In the following parts, the random point $ (X,Y) $ is uniformly distributed on the shaded region shown. (A Monte Carlo sketch for checking your answers appears after part (e).)
@@@insert figure@@@
- (a) Find the marginal pdf $ f_X(x) $ of the random variable $ X $. Find $ E[X] $ and $ Var(X) $.
- (b) Using your answer from part (a), find the marginal pdf $ f_Y(y) $ of the random variable $ Y $, and its mean and variance, $ E[Y] $ and $ Var(Y) $.
- (c) Find $ f_{Y|X}(y|\alpha) $, the conditional pdf of $ Y $ given that $ X = \alpha $, where $ 0 < \alpha < 1/2 $. Then find the conditional mean and conditional variance of $ Y $ given that $ X = \alpha $.
- (d) What is the MMSE estimator, $ \hat{y}_{\rm MMSE}(x) $?
- (e) What is the Linear MMSE estimator, $ \hat{y}_{\rm LMMSE}(x) $?
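The following sketch shows one way to check your answers numerically. Since the figure is not reproduced here, the indicator function in_region below is only a placeholder (an example triangle is used for illustration); replace it with the actual shaded region, and adjust the bounding box accordingly. All names and parameter values are illustrative, not part of the problem statement.

import numpy as np

rng = np.random.default_rng(0)

def in_region(x, y):
    # Placeholder: replace with the indicator of the shaded region in the figure.
    # (Illustration only -- a triangle with vertices (0,0), (1,0), (0,1).)
    return x + y <= 1.0

# Rejection sampling: draw uniform points in a bounding box and keep those that
# fall inside the region; the kept points are uniform on the region.
N = 200_000
x = rng.uniform(0.0, 1.0, N)
y = rng.uniform(0.0, 1.0, N)
keep = in_region(x, y)
X, Y = x[keep], y[keep]

# Empirical moments to compare against parts (a) and (b).
print("E[X] ~", X.mean(), " Var(X) ~", X.var())
print("E[Y] ~", Y.mean(), " Var(Y) ~", Y.var())

# Conditional mean of Y given X near alpha (part (c)): average Y over a thin slice.
alpha, eps = 0.25, 0.01
slice_Y = Y[np.abs(X - alpha) < eps]
print("E[Y | X ~", alpha, "] ~", slice_Y.mean())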
Problem 2: Variable Dependency
Suppose that $ X $ and $ Y $ are zero-mean jointly Gaussian random variables with variances $ \sigma_X^2 $ and $ \sigma_Y^2 $, respectively, and correlation coefficient $ \rho $. (A simulation sketch for checking your answers appears after part (c).)
- (a) Find the means and variances of the random variables $ Z = X\cos\theta + Y\sin\theta $ and $ W = Y\cos\theta - X\sin\theta $.
- (b) What is $ Cov(Z,W) $?
- (c) Find an angle $ \theta $ such that $ Z $ and $ W $ are independent Gaussian random variables. You may express your answer as a trigonometric function involving $ \sigma_X^2 $, $ \sigma_Y^2 $, and $ \rho $. In particular, what is the value of $ \theta $ if $ \sigma_X = \sigma_Y $?
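Here is a minimal Monte Carlo sketch for comparing the sample variances and covariance of $ Z $ and $ W $ against your symbolic answers. The values of $ \sigma_X $, $ \sigma_Y $, $ \rho $, and the grid of angles are assumptions chosen only for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (not part of the problem statement).
sigma_X, sigma_Y, rho = 2.0, 1.0, 0.5
cov = [[sigma_X**2, rho * sigma_X * sigma_Y],
       [rho * sigma_X * sigma_Y, sigma_Y**2]]

# Draw zero-mean jointly Gaussian samples of (X, Y).
X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=500_000).T

# For a few angles, rotate and inspect the sample moments of (Z, W);
# compare these numbers with your answers to parts (a) and (b).
for theta in np.linspace(0.0, np.pi / 2, 7):
    Z = X * np.cos(theta) + Y * np.sin(theta)
    W = Y * np.cos(theta) - X * np.sin(theta)
    print(f"theta = {theta:.3f}:  Var(Z) ~ {Z.var():.3f},  "
          f"Var(W) ~ {W.var():.3f},  Cov(Z,W) ~ {np.cov(Z, W)[0, 1]:.3f}")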
Problem 3: Noisy Measurement
Let $ X = Y+N $, where $ Y $ is exponentially distributed with parameter $ \lambda $ and $ N $ is Gaussian with mean 0 and variance $ \sigma^2 $. The variables $ Y $ and $ N $ are independent, and the parameters $ \lambda $ and $ \sigma^2 $ are strictly positive. (Recall that $ E[Y] = \frac1\lambda $ and $ Var(Y) = \frac{1}{\lambda^2} $.)
Find $ \hat{Y}_{\rm LMMSE}(X) $, the linear minimum mean square error estimator of $ Y $ from $ X $.
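A sketch for checking your closed-form estimator: it builds the linear estimator from sample moments using the standard LMMSE form $ \hat{Y} = aX + b $ with $ a = Cov(X,Y)/Var(X) $ and $ b = E[Y] - aE[X] $, so its coefficients should approach your symbolic answer. The values of $ \lambda $ and $ \sigma $ below are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (not specified by the problem).
lam, sigma = 2.0, 0.5
N = 500_000

Y = rng.exponential(scale=1.0 / lam, size=N)      # E[Y] = 1/lambda, Var(Y) = 1/lambda^2
noise = rng.normal(loc=0.0, scale=sigma, size=N)  # zero-mean Gaussian noise
X = Y + noise

# Sample-based LMMSE coefficients: Yhat = a*X + b.
a = np.cov(X, Y)[0, 1] / X.var()
b = Y.mean() - a * X.mean()
print("a ~", a, " b ~", b)

# Mean-square error achieved by the resulting linear estimator.
print("MSE ~", np.mean((Y - (a * X + b)) ** 2))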
Problem 4: Digital Loss
Let $ X $ be a continuous uniform random variable on the interval $ [0,1] $. It is quantized using an $ n $-level quantizer defined as follows: Given an input $ x $, the quantizer outputs a value $ q(x) = \frac1n\lfloor nx\rfloor $; that is, it rounds $ x $ down to the nearest multiple of $ 1/n $. (Note that for any real number $ a $, $ \lfloor a\rfloor $ is the largest integer less than or equal to $ a $.) Thus the output of the quantizer is always a value from the set $ \{0,\frac1n,\frac2n,\ldots,\frac{n-1}n\} $. Find the mean-square error $ E[(X-q(X))^2] $.
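A quick numerical check of the quantizer from the problem statement: sample uniform inputs, quantize by rounding down to the nearest multiple of $ 1/n $, and estimate the mean-square error empirically. The value of $ n $ and the sample size are illustrative assumptions; compare the printed number with your closed-form answer.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative number of quantizer levels (not specified by the problem).
n = 8
X = rng.uniform(0.0, 1.0, 1_000_000)

# The n-level quantizer from the problem: q(x) = floor(n*x)/n.
qX = np.floor(n * X) / n

# Empirical mean-square error E[(X - q(X))^2].
print("E[(X - q(X))^2] ~", np.mean((X - qX) ** 2))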