==7.13 QE 2007 August==
  
'''1. (25 Points)'''

Let <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> be two independent, identically distributed random variables taking values in <math>\mathbf{N}</math> (the natural numbers) with <math>P\left(\left\{ \mathbf{X}=i\right\} \right)=P\left(\left\{ \mathbf{Y}=i\right\} \right)=\frac{1}{2^{i}}\;,\qquad i=1,2,3,\cdots.</math>

'''(a)'''

Find <math>P\left(\left\{ \min\left(\mathbf{X},\mathbf{Y}\right)=k\right\} \right)</math> for <math>k\in\mathbf{N}</math>.

'''Note'''

This asks for <math>P\left(\left\{ \min\left(\mathbf{X},\mathbf{Y}\right)=k\right\} \right)</math>, which is different from <math>P\left(\left\{ \min\left(\mathbf{X},\mathbf{Y}\right)>k\right\} \right)</math>; the tail probability <math>P\left(\left\{ \mathbf{Y}>k\right\} \right)</math> is, however, a useful intermediate step:

<math>P\left(\left\{ \mathbf{Y}>k\right\} \right)=1-P\left(\left\{ \mathbf{Y}\leq k\right\} \right)=1-\sum_{i=1}^{k}\frac{1}{2^{i}}=1-\frac{\frac{1}{2}\left(1-\left(\frac{1}{2}\right)^{k}\right)}{1-\frac{1}{2}}=1-\left(1-\left(\frac{1}{2}\right)^{k}\right)=\left(\frac{1}{2}\right)^{k}.</math>

Partitioning the event <math>\left\{ \min\left(\mathbf{X},\mathbf{Y}\right)=k\right\}</math> according to which variable attains the minimum, and using independence,

<math>P\left(\left\{ \min\left(\mathbf{X},\mathbf{Y}\right)=k\right\} \right)=P\left(\left\{ \mathbf{X}=k\right\} \cap\left\{ \mathbf{Y}>k\right\} \right)+P\left(\left\{ \mathbf{X}>k\right\} \cap\left\{ \mathbf{Y}=k\right\} \right)+P\left(\left\{ \mathbf{X}=k\right\} \cap\left\{ \mathbf{Y}=k\right\} \right)</math><math>=2\cdot P\left(\left\{ \mathbf{X}=k\right\} \right)\cdot P\left(\left\{ \mathbf{Y}>k\right\} \right)+P\left(\left\{ \mathbf{X}=k\right\} \right)\cdot P\left(\left\{ \mathbf{Y}=k\right\} \right)</math><math>=2\cdot\left(\frac{1}{2}\right)^{k}\cdot\left(\frac{1}{2}\right)^{k}+\left(\frac{1}{2}\right)^{k}\cdot\left(\frac{1}{2}\right)^{k}=3\cdot\left(\frac{1}{2}\right)^{2k}=\frac{3}{4^{k}}.</math>
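
As a quick numerical sanity check (not part of the original exam solution), one can simulate the two geometric random variables and compare the empirical frequencies of <math>\min\left(\mathbf{X},\mathbf{Y}\right)</math> with <math>\frac{3}{4^{k}}</math>. The following Python sketch uses only the standard library:

<pre>
import random

def sample_geometric():
    """Sample from P(X = i) = (1/2)^i, i = 1, 2, 3, ...
    (the number of fair-coin flips up to and including the first head)."""
    i = 1
    while random.random() >= 0.5:  # tail with probability 1/2: keep flipping
        i += 1
    return i

trials = 200000
counts = {}
for _ in range(trials):
    m = min(sample_geometric(), sample_geometric())
    counts[m] = counts.get(m, 0) + 1

for k in range(1, 6):
    empirical = counts.get(k, 0) / trials
    exact = 3 / 4 ** k
    print("k = %d: empirical %.4f vs exact %.4f" % (k, empirical, exact))
</pre>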

'''(b)'''

Find <math>P\left(\left\{ \mathbf{X}=\mathbf{Y}\right\} \right)</math>.

<math>P\left(\left\{ \mathbf{X}=\mathbf{Y}\right\} \right)=\sum_{k=1}^{\infty}P\left(\left\{ \mathbf{X}=k\right\} \cap\left\{ \mathbf{Y}=k\right\} \right)=\sum_{k=1}^{\infty}P\left(\left\{ \mathbf{X}=k\right\} \right)\cdot P\left(\left\{ \mathbf{Y}=k\right\} \right)</math><math>=\sum_{k=1}^{\infty}\left(\frac{1}{2}\right)^{k}\left(\frac{1}{2}\right)^{k}=\sum_{k=1}^{\infty}\left(\frac{1}{4}\right)^{k}=\frac{\frac{1}{4}}{1-\frac{1}{4}}=\frac{1}{3}.</math>

'''(c)'''

Find <math>P\left(\left\{ \mathbf{Y}>\mathbf{X}\right\} \right)</math>.

<math>P\left(\left\{ \mathbf{Y}>\mathbf{X}\right\} \right)=\sum_{k=1}^{\infty}P\left(\left\{ \mathbf{Y}>k\right\} \cap\left\{ \mathbf{X}=k\right\} \right)=\sum_{k=1}^{\infty}P\left(\left\{ \mathbf{Y}>k\right\} \right)\cdot P\left(\left\{ \mathbf{X}=k\right\} \right)</math><math>=\sum_{k=1}^{\infty}\left(\frac{1}{2}\right)^{k}\left(\frac{1}{2}\right)^{k}=\sum_{k=1}^{\infty}\left(\frac{1}{4}\right)^{k}=\frac{\frac{1}{4}}{1-\frac{1}{4}}=\frac{1}{3}.</math>

'''(d)'''

Find <math>P\left(\left\{ \mathbf{Y}=k\mathbf{X}\right\} \right)</math> for a given natural number <math>k</math>.

Partitioning over the value of <math>\mathbf{X}</math> (on <math>\left\{ \mathbf{X}=i\right\}</math>, the event requires <math>\mathbf{Y}=ki</math>) and using independence,

<math>P\left(\left\{ \mathbf{Y}=k\mathbf{X}\right\} \right)=\sum_{i=1}^{\infty}P\left(\left\{ \mathbf{X}=i\right\} \cap\left\{ \mathbf{Y}=ki\right\} \right)=\sum_{i=1}^{\infty}P\left(\left\{ \mathbf{X}=i\right\} \right)\cdot P\left(\left\{ \mathbf{Y}=ki\right\} \right)</math><math>=\sum_{i=1}^{\infty}\frac{1}{2^{i}}\cdot\frac{1}{2^{ki}}=\sum_{i=1}^{\infty}\left(\frac{1}{2^{k+1}}\right)^{i}=\frac{\frac{1}{2^{k+1}}}{1-\frac{1}{2^{k+1}}}=\frac{1}{2^{k+1}-1}.</math>
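
The same simulation approach (again, a hedged check rather than part of the exam solution) can be used to verify parts (b), (c), and (d) at once:

<pre>
import random

def sample_geometric():
    """Sample from P(X = i) = (1/2)^i, i = 1, 2, 3, ..."""
    i = 1
    while random.random() >= 0.5:
        i += 1
    return i

trials = 500000
k = 3  # the natural number in part (d); any k >= 1 works
eq = gt = eq_k = 0
for _ in range(trials):
    x, y = sample_geometric(), sample_geometric()
    eq += (x == y)        # part (b): P(X = Y) = 1/3
    gt += (y > x)         # part (c): P(Y > X) = 1/3
    eq_k += (y == k * x)  # part (d): P(Y = kX) = 1/(2^(k+1) - 1)

print("P(X = Y)  ~ %.4f, exact %.4f" % (eq / trials, 1 / 3))
print("P(Y > X)  ~ %.4f, exact %.4f" % (gt / trials, 1 / 3))
print("P(Y = kX) ~ %.4f, exact %.4f" % (eq_k / trials, 1 / (2 ** (k + 1) - 1)))
</pre>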

'''2. (25 Points)'''

Let <math>\left\{ \mathbf{X}_{n}\right\} _{n\geq1}</math> be a sequence of binomially distributed random variables, with the <math>n</math>-th random variable <math>\mathbf{X}_{n}</math> having pmf <math>p_{\mathbf{X}_{n}}\left(k\right)=P\left(\left\{ \mathbf{X}_{n}=k\right\} \right)=\left(\begin{array}{c}n\\k\end{array}\right)p_{n}^{k}\left(1-p_{n}\right)^{n-k}\;,\qquad k=0,\cdots,n,\quad p_{n}\in\left(0,1\right).</math>

Show that, if the <math>p_{n}</math> have the property that <math>np_{n}\rightarrow\lambda</math> as <math>n\rightarrow\infty</math>, where <math>\lambda</math> is a positive constant, then the sequence <math>\left\{ \mathbf{X}_{n}\right\} _{n\geq1}</math> converges in distribution to a Poisson random variable <math>\mathbf{X}</math> with mean <math>\lambda</math>.

'''Hint:'''

You may find the following fact useful:

<math>\lim_{n\rightarrow\infty}\left(1+\frac{x}{n}\right)^{n}=e^{x}.</math>

'''Answer'''

Please see the example [CS1SequenceOfBinomiallyDistributedRV], which is identical to this problem. A sketch of the argument follows.
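
For completeness, here is the standard computation (the same one carried out in the referenced example). For fixed <math>k</math>, expand the binomial coefficient and regroup:

<math>p_{\mathbf{X}_{n}}\left(k\right)=\frac{n\left(n-1\right)\cdots\left(n-k+1\right)}{k!}p_{n}^{k}\left(1-p_{n}\right)^{n-k}=\frac{\left(np_{n}\right)^{k}}{k!}\cdot\frac{n\left(n-1\right)\cdots\left(n-k+1\right)}{n^{k}}\cdot\left(1-p_{n}\right)^{n}\cdot\left(1-p_{n}\right)^{-k}.</math>

As <math>n\rightarrow\infty</math>, <math>\left(np_{n}\right)^{k}\rightarrow\lambda^{k}</math>, the second factor tends to <math>1</math>, <math>\left(1-p_{n}\right)^{n}=\left(1-\frac{np_{n}}{n}\right)^{n}\rightarrow e^{-\lambda}</math> by the hint, and <math>\left(1-p_{n}\right)^{-k}\rightarrow1</math> since <math>p_{n}\rightarrow0</math>. Therefore

<math>\lim_{n\rightarrow\infty}p_{\mathbf{X}_{n}}\left(k\right)=\frac{\lambda^{k}e^{-\lambda}}{k!}\;,\qquad k=0,1,2,\cdots,</math>

which is the pmf of a Poisson random variable with mean <math>\lambda</math>. Pointwise convergence of the pmfs on the integers implies convergence of the cdfs at every continuity point, i.e. convergence in distribution.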

'''3. (25 Points)'''

Let <math>\mathbf{X}\left(t\right)</math> be a real Gaussian random process with mean function <math>\mu\left(t\right)</math> and autocovariance function <math>C_{\mathbf{XX}}\left(t_{1},t_{2}\right)</math>.

'''(a)'''

Write the expression for the <math>n</math>-th order characteristic function of <math>\mathbf{X}\left(t\right)</math> in terms of <math>\mu\left(t\right)</math> and <math>C_{\mathbf{XX}}\left(t_{1},t_{2}\right)</math>.

'''ref.'''

There is a note on the <math>n</math>-th order characteristic function of a Gaussian random process [CS1n-thOrderCharacteristicFunctionOfGaussian]. The only difference between that note and this problem is that this problem writes the mean function as <math>\mu\left(t\right)</math> rather than <math>\eta_{\mathbf{X}}\left(t\right)=E\left[\mathbf{X}\left(t\right)\right]</math>.

'''Solution'''

<math>\Phi_{\mathbf{X}\left(t_{1}\right)\cdots\mathbf{X}\left(t_{n}\right)}\left(\omega_{1},\cdots,\omega_{n}\right)=\exp\left\{ i\sum_{k=1}^{n}\mu\left(t_{k}\right)\omega_{k}-\frac{1}{2}\sum_{j=1}^{n}\sum_{k=1}^{n}C_{\mathbf{XX}}\left(t_{j},t_{k}\right)\omega_{j}\omega_{k}\right\}.</math>
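
As a consistency check, for <math>n=1</math> this reduces to the familiar characteristic function of a single Gaussian random variable with mean <math>\mu\left(t_{1}\right)</math> and variance <math>C_{\mathbf{XX}}\left(t_{1},t_{1}\right)</math>:

<math>\Phi_{\mathbf{X}\left(t_{1}\right)}\left(\omega_{1}\right)=\exp\left\{ i\mu\left(t_{1}\right)\omega_{1}-\frac{1}{2}C_{\mathbf{XX}}\left(t_{1},t_{1}\right)\omega_{1}^{2}\right\}.</math>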

'''(b)'''

Show that the probabilistic description of <math>\mathbf{X}\left(t\right)</math> is completely characterized by <math>\mu\left(t\right)</math> and the autocovariance function <math>C_{\mathbf{XX}}\left(t_{1},t_{2}\right)</math>.

'''Solution'''

From (a), every <math>n</math>-th order characteristic function of <math>\mathbf{X}\left(t\right)</math> is specified completely in terms of <math>\mu\left(t\right)</math> and <math>C_{\mathbf{XX}}\left(t_{1},t_{2}\right)</math>. Since each joint pdf can be recovered from the corresponding characteristic function by inversion (see the note below), the probabilistic description of <math>\mathbf{X}\left(t\right)</math> is completely characterized by <math>\mu\left(t\right)</math> and <math>C_{\mathbf{XX}}\left(t_{1},t_{2}\right)</math>.

'''Note'''

<math>f_{\mathbf{X}}\left(x\right)=\frac{1}{2\pi}\int_{-\infty}^{\infty}e^{-i\omega x}\Phi_{\mathbf{X}}\left(\omega\right)d\omega.</math>

'''(c)'''

Show that if <math>\mathbf{X}\left(t\right)</math> is wide-sense stationary then it is also strict-sense stationary.

'''Note'''

The theorem and its proof in [CS1WSSandGaussianisSSS] can be used to solve this problem; a sketch of the argument is given below.
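
Sketch: if <math>\mathbf{X}\left(t\right)</math> is wide-sense stationary, then <math>\mu\left(t\right)=\mu</math> is constant and <math>C_{\mathbf{XX}}\left(t_{1},t_{2}\right)</math> depends only on the difference <math>t_{1}-t_{2}</math>. Substituting into the <math>n</math>-th order characteristic function from part (a), for any shift <math>\tau</math>

<math>\Phi_{\mathbf{X}\left(t_{1}+\tau\right)\cdots\mathbf{X}\left(t_{n}+\tau\right)}\left(\omega_{1},\cdots,\omega_{n}\right)=\exp\left\{ i\mu\sum_{k=1}^{n}\omega_{k}-\frac{1}{2}\sum_{j=1}^{n}\sum_{k=1}^{n}C_{\mathbf{XX}}\left(t_{j}-t_{k}\right)\omega_{j}\omega_{k}\right\} =\Phi_{\mathbf{X}\left(t_{1}\right)\cdots\mathbf{X}\left(t_{n}\right)}\left(\omega_{1},\cdots,\omega_{n}\right),</math>

since the shift cancels in every difference <math>\left(t_{j}+\tau\right)-\left(t_{k}+\tau\right)=t_{j}-t_{k}</math>. All finite-order distributions of <math>\mathbf{X}\left(t\right)</math> are therefore invariant under time shifts, which is the definition of strict-sense stationarity.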

'''4. (25 Points)'''

Let <math>\mathbf{X}_{1},\mathbf{X}_{2},\mathbf{X}_{3},\cdots</math> be a sequence of independent, identically distributed random variables, each having Cauchy pdf <math>f\left(x\right)=\frac{1}{\pi\left(1+x^{2}\right)}\;,\qquad-\infty<x<\infty.</math> Let <math>\mathbf{Y}_{n}=\frac{1}{n}\sum_{i=1}^{n}\mathbf{X}_{i}.</math> Find the pdf of <math>\mathbf{Y}_{n}</math>. Describe how the pdf of <math>\mathbf{Y}_{n}</math> depends on <math>n</math>. Does the sequence <math>\mathbf{Y}_{1},\mathbf{Y}_{2},\mathbf{Y}_{3},\cdots</math> converge in distribution? If yes, what is the distribution of the random variable it converges to?

'''Note'''

See the definition of convergence in distribution [CS1ConvergeInDistribution]. In addition, the characteristic function of a Cauchy distributed random variable is needed.

'''Solution'''

The characteristic function of a standard Cauchy random variable is

<math>\Phi_{\mathbf{X}}\left(\omega\right)=e^{-\left|\omega\right|}.</math>

By independence the expectation of the product factors, and since the <math>\mathbf{X}_{k}</math> are identically distributed,

<math>\Phi_{\mathbf{Y}_{n}}\left(\omega\right)=E\left[\exp\left\{ i\omega\mathbf{Y}_{n}\right\} \right]=E\left[\exp\left\{ i\frac{\omega}{n}\sum_{k=1}^{n}\mathbf{X}_{k}\right\} \right]=E\left[\prod_{k=1}^{n}\exp\left\{ i\frac{\omega}{n}\mathbf{X}_{k}\right\} \right]</math><math>=E\left[\exp\left\{ i\frac{\omega}{n}\mathbf{X}\right\} \right]^{n}=\Phi_{\mathbf{X}}\left(\frac{\omega}{n}\right)^{n}=\left[e^{-\left|\omega/n\right|}\right]^{n}=e^{-\left|\omega\right|}.</math>

Inverting the characteristic function gives the pdf:

<math>f_{\mathbf{Y}_{n}}\left(y\right)=\frac{1}{2\pi}\int_{-\infty}^{\infty}e^{-i\omega y}e^{-\left|\omega\right|}d\omega=\frac{1}{2\pi}\left[\int_{-\infty}^{0}e^{-i\omega y}e^{\omega}d\omega+\int_{0}^{\infty}e^{-i\omega y}e^{-\omega}d\omega\right]</math><math>=\frac{1}{2\pi}\left[\int_{-\infty}^{0}e^{\omega\left(1-iy\right)}d\omega+\int_{0}^{\infty}e^{-\omega\left(1+iy\right)}d\omega\right]=\frac{1}{2\pi}\left[\frac{1}{1-iy}e^{\omega\left(1-iy\right)}\biggl|_{-\infty}^{0}+\frac{-1}{1+iy}e^{-\omega\left(1+iy\right)}\biggl|_{0}^{\infty}\right]</math><math>=\frac{1}{2\pi}\left[\frac{1}{1-iy}+\frac{1}{1+iy}\right]=\frac{1}{2\pi}\left[\frac{1+iy+1-iy}{1+y^{2}}\right]=\frac{1}{2\pi}\cdot\frac{2}{1+y^{2}}=\frac{1}{\pi\left(1+y^{2}\right)}.</math>

Thus <math>\mathbf{Y}_{n}</math> has the same standard Cauchy pdf for every <math>n</math>; the pdf does not depend on <math>n</math> at all. In particular, the sequence converges in distribution, trivially, and the limit is itself a standard Cauchy random variable. (The weak law of large numbers does not apply here, since a Cauchy random variable has no mean.)
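
As a hedged numerical illustration (not part of the exam solution), the following Python sketch samples <math>\mathbf{Y}_{n}</math> by the inverse-CDF method and compares its empirical cdf against the standard Cauchy cdf <math>F\left(y\right)=\frac{1}{2}+\frac{1}{\pi}\arctan y</math>, which should match for every <math>n</math>:

<pre>
import math
import random

def sample_cauchy():
    """Standard Cauchy sample: if U ~ Uniform(0,1),
    then tan(pi * (U - 1/2)) is standard Cauchy (inverse-CDF method)."""
    return math.tan(math.pi * (random.random() - 0.5))

n = 50          # number of terms averaged in Y_n; try several values
trials = 20000
samples = [sum(sample_cauchy() for _ in range(n)) / n for _ in range(trials)]

# The empirical cdf of Y_n should match the standard Cauchy cdf
# regardless of n: averaging does not "shrink" Cauchy samples.
for y in (-2.0, -0.5, 0.0, 0.5, 2.0):
    empirical = sum(s <= y for s in samples) / trials
    exact = 0.5 + math.atan(y) / math.pi
    print("F(%+.1f): empirical %.4f vs exact %.4f" % (y, empirical, exact))
</pre>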
  
 