

ECE Ph.D. Qualifying Exam

Automatic Control (AC)

Question 3: Optimization

August 2016 Problem 2


Solution

a) From the Optimization textbook by Stanislaw Zak, Lemma 8.3:
For the fixed-step-size gradient descent algorithm applied to a quadratic $ f(x)=\dfrac{1}{2}x^TQx-b^Tx $, the step size $ \alpha $ must lie in the range $ \left(0,\dfrac{2}{\lambda_{\max}(Q)}\right) $ for the iterates to converge from any initial point.
b) Here $ Q=\begin{bmatrix} 12 & 0 \\ 0 & 4 \end{bmatrix} $.
Since $ Q $ is diagonal, its eigenvalues are its diagonal entries, so $ \lambda_{\max}(Q)=12 $ and therefore $ \alpha \in \left(0,\dfrac{2}{12}\right)=\left(0,\dfrac{1}{6}\right) $.
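
The bound can be checked numerically. The following is a minimal sketch (Python with NumPy, not part of the original solution) that runs the fixed-step iteration $ x_{k+1}=x_k-\alpha(Qx_k-b) $ for a few step sizes; since $ b $ and the starting point are not restated in this revision, the values used below are assumptions for illustration only.

import numpy as np

# Sketch: verify the step-size bound alpha in (0, 1/6) for Q = diag(12, 4).
Q = np.array([[12.0, 0.0],
              [0.0,  4.0]])
b = np.array([1.0, 1.0])          # assumed right-hand side (not given in the problem)
x_star = np.linalg.solve(Q, b)    # unique minimizer of f(x) = 1/2 x^T Q x - b^T x

def run_gd(alpha, steps=200):
    """Fixed-step gradient descent: x_{k+1} = x_k - alpha * (Q x_k - b)."""
    x = np.array([10.0, -10.0])   # assumed starting point
    for _ in range(steps):
        x = x - alpha * (Q @ x - b)
    return np.linalg.norm(x - x_star)

for alpha in [0.05, 0.16, 1/6, 0.18]:
    print(f"alpha = {alpha:.4f}, final error = {run_gd(alpha):.3e}")
# For alpha < 1/6 the error shrinks toward 0; at alpha = 1/6 the iterates oscillate
# without converging, and for alpha > 1/6 they diverge, matching Lemma 8.3.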


Back to QE AC question 3, August 2016

Back to ECE Qualifying Exams (QE) page
