

Instructions

Homework 10 can be downloaded here on the ECE 302 course website.

Problem 1: Random Point, Revisited

In the following problems, the random point (X , Y) is uniformly distributed on the shaded region shown.

[Figure: UniformArea ECE302Fall2008sanghavi.jpg, showing the shaded region on which the random point $ (X , Y) $ is uniformly distributed]

  • (a) Find the marginal pdf $ f_X(x) $ of the random variable $ X $. Find $ E[X] $ and $ Var(X) $.
  • (b) Using your answer from part (a), find the marginal pdf $ f_Y(y) $ of the random variable $ Y $, and its mean and variance, $ E[Y] $ and $ Var(Y) $.
  • (c) Find $ f_{Y|X}(y|\alpha) $, the conditional pdf of $ Y $ given that $ X = \alpha $, where $ 0 < \alpha < 1/2 $. Then find the conditional mean and conditional variance of $ Y $ given that $ X = \alpha $.
  • (d) What is the MMSE estimator, $ \hat{y}_{\rm MMSE}(x) $?
  • (e) What is the Linear MMSE estimator, $ \hat{y}_{\rm LMMSE}(x) $?
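
As a numerical sanity check for parts (a), (b), and (e), the marginal moments and the LMMSE coefficients can be estimated by Monte Carlo. The sketch below (Python with NumPy) uses a purely hypothetical triangular region as a placeholder; the indicator function in_region() and the bounding box are assumptions and should be replaced with the shaded region from the figure above.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical region used only as a placeholder: replace in_region() (and the
# bounding box below) with the indicator of the shaded region in the figure.
def in_region(x, y):
    return y <= 1.0 - x   # e.g. the triangle with vertices (0,0), (1,0), (0,1)

# Rejection sampling: draw points uniformly in a bounding box and keep those
# inside the region; the kept points are uniform on the region.
pts = rng.uniform(0.0, 1.0, size=(1_000_000, 2))
keep = in_region(pts[:, 0], pts[:, 1])
x, y = pts[keep, 0], pts[keep, 1]

# Empirical marginal moments for parts (a) and (b).
print("E[X], Var(X) :", x.mean(), x.var())
print("E[Y], Var(Y) :", y.mean(), y.var())

# Empirical LMMSE coefficients for part (e): yhat(x) = a*x + b.
a = np.cov(x, y)[0, 1] / x.var()
b = y.mean() - a * x.mean()
print("LMMSE: yhat(x) = %.4f * x + %.4f" % (a, b))

The printed slope and intercept can be compared with the analytical LMMSE estimator once the correct region is plugged in.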

10.1 joon young kim_ECE302Fall2008sanghavi

10.1 Jayanth Athreya_ECE302Fall2008sanghavi

10.1 (a)&(b) Suan-Aik Yeo_ECE302Fall2008sanghavi (Question by Jonathan, comment by Gregory Pajot)

10.1 (c) Arie Lyles_ECE302Fall2008sanghavi (additional commentary by Brian Thomas)

10.1 (d) Kristin Wing_ECE302Fall2008sanghavi (edited by Nicholas Browdues)

10.1 (a), a much easier way to find $ f_X(x) $, Chris Rush_ECE302Fall2008sanghavi

10.1 (e) Spencer Mitchell_ECE302Fall2008sanghavi (question from Nicholas; response by Brian Thomas)


10.1 (a) Christopher Wacnik_ECE302Fall2008sanghavi

Problem 2: Variable Dependency

Suppose that $ X $ and $ Y $ are zero-mean jointly Gaussian random variables with variances $ \sigma_X^2 $ and $ \sigma_Y^2 $, respectively, and correlation coefficient $ \rho $.

  • (a) Find the means and variances of the random variables $ Z = X\cos\theta + Y\sin\theta $ and $ W = Y\cos\theta - X\sin\theta $.
  • (b) What is $ Cov(Z,W) $?
  • (c) Find an angle $ \theta $ such that $ Z $ and $ W $ are independent Gaussian random variables. You may express your answer as a trigonometric function involving $ \sigma_X^2 $, $ \sigma_Y^2 $, and $ \rho $. In particular, what is the value of $ \theta $ if $ \sigma_X = \sigma_Y $?
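
As a quick check of parts (a) and (b), the Monte Carlo sketch below (Python with NumPy; the values of $ \sigma_X $, $ \sigma_Y $, $ \rho $, and $ \theta $ are assumed examples, not from the problem statement) draws zero-mean jointly Gaussian samples, forms $ Z $ and $ W $, and prints their empirical means, variances, and covariance for comparison with the analytical expressions.

import numpy as np

rng = np.random.default_rng(0)

# Assumed example parameters (not from the problem statement).
sigma_X, sigma_Y, rho, theta = 1.0, 2.0, 0.5, np.pi / 6
n = 1_000_000

# Zero-mean jointly Gaussian (X, Y) with the given variances and correlation.
cov = np.array([[sigma_X**2,              rho * sigma_X * sigma_Y],
                [rho * sigma_X * sigma_Y, sigma_Y**2]])
X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Z = X cos(theta) + Y sin(theta),  W = Y cos(theta) - X sin(theta).
Z = X * np.cos(theta) + Y * np.sin(theta)
W = Y * np.cos(theta) - X * np.sin(theta)

print("E[Z], E[W]     :", Z.mean(), W.mean())
print("Var(Z), Var(W) :", Z.var(), W.var())
print("Cov(Z, W)      :", np.cov(Z, W)[0, 1])   # compare with part (b)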

10.2 (a&b) Beau Morrison_ECE302Fall2008sanghavi

10.2 Tiffany Sukwanto_ECE302Fall2008sanghavi

10.2 Josh Long_ECE302Fall2008sanghavi

10.2 Monsu Mathew_ECE302Fall2008sanghavi

Problem 3: Noisy Measurement

Let $ X = Y+N $, where $ Y $ is exponentially distributed with parameter $ \lambda $ and $ N $ is Gaussian with mean 0 and variance $ \sigma^2 $. The variables $ Y $ and $ N $ are independent, and the parameters $ \lambda $ and $ \sigma^2 $ are strictly positive. (Recall that $ E[Y] = \frac1\lambda $ and $ Var(Y) = \frac{1}{\lambda^2} $.)

Find $ \hat{Y}_{\rm LMMSE}(X) $, the linear minimum mean square error estimator of $ Y $ from $ X $.
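
One way to sanity-check an answer is to simulate the model. The sketch below (Python with NumPy; the values of $ \lambda $ and $ \sigma $ are assumed examples) generates $ X = Y + N $ and estimates the LMMSE coefficients directly from the samples, so the printed slope and intercept can be compared with the analytical estimator.

import numpy as np

rng = np.random.default_rng(0)

# Assumed example parameters (not from the problem statement).
lam, sigma = 2.0, 0.5
n = 1_000_000

# X = Y + N with Y ~ Exponential(lam) and N ~ N(0, sigma^2), independent.
Y = rng.exponential(scale=1.0 / lam, size=n)
N = rng.normal(loc=0.0, scale=sigma, size=n)
X = Y + N

# Empirical LMMSE coefficients: Yhat = a*X + b with
# a = Cov(X, Y) / Var(X) and b = E[Y] - a * E[X].
a = np.cov(X, Y)[0, 1] / X.var()
b = Y.mean() - a * X.mean()
Yhat = a * X + b

print("empirical slope a     :", a)   # compare with the analytical Cov(X,Y)/Var(X)
print("empirical intercept b :", b)
print("mean-square error     :", np.mean((Y - Yhat) ** 2))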

Hamad Al Shehhi 10.3_ECE302Fall2008sanghavi

Joe Gutierrez 10.3_ECE302Fall2008sanghavi

Jaewoo Choi 10.3_ECE302Fall2008sanghavi

Problem 4: Digital Loss

Let $ X $ be a continuous uniform random variable on the interval $ [0,1] $. It is quantized using an $ n $-level quantizer defined as follows: given an input $ x $, the quantizer outputs the value $ q(x) = \frac1n\lfloor nx\rfloor $; that is, it rounds $ x $ down to the nearest multiple of $ 1/n $. (Note that for any real number $ a $, $ \lfloor a\rfloor $ is the largest integer less than or equal to $ a $.) Thus, for $ x \in [0,1) $, the output of the quantizer is a value from the set $ \{0,\frac1n,\frac2n,\ldots,\frac{n-1}n\} $. Find the mean-square error $ E[(X-q(X))^2] $.
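
A quick numerical check: the sketch below (Python with NumPy; $ n = 8 $ is an assumed example value) applies $ q(x) = \frac1n\lfloor nx\rfloor $ to uniform samples and prints the empirical mean-square error, which can be compared with the closed-form answer derived analytically.

import numpy as np

rng = np.random.default_rng(0)

# Assumed example quantizer resolution (not from the problem statement).
n = 8
X = rng.uniform(0.0, 1.0, size=1_000_000)

# q(x) = (1/n) * floor(n*x): round x down to the nearest multiple of 1/n.
qX = np.floor(n * X) / n

# Empirical E[(X - q(X))^2]; compare with the analytical mean-square error.
print("empirical MSE :", np.mean((X - qX) ** 2))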

Anand Gautam 10.4_ECE302Fall2008sanghavi

Shao-Fu Shih 10.4_ECE302Fall2008sanghavi

Justin Mauck 10.4_ECE302Fall2008sanghavi


Back to ECE302 Fall 2008 Prof. Sanghavi
