[[Category:ECE PhD Qualifying Exams]]
[[Category:ECE]]
[[Category:QE]]
[[Category:CNSIP]]
[[Category:problem solving]]
[[Category:automatic control]]
[[Category:optimization]]
=QE2016_AC-3_ECE580=

<center>
<font size= 4>
[[ECE_PhD_Qualifying_Exams|ECE Ph.D. Qualifying Exam]]
</font size>

<font size= 4>
Automatic Control (AC)

Question 3: Optimization
</font size>

August 2016
</center>
----
----
----
1. (20 pts) Consider the following linear program, <br/>
<center> minimize <math>2x_{1} + x_{2}</math>, </center> <br/>
<center> subject to <math>x_{1} + 3x_{2} \geq 6 </math> </center> <br/>
<center> <math>2x_{1} + x_{2} \geq 4</math> </center> <br/>
<center> <math> x_{1} + x_{2} \leq 3 </math> </center> <br/>
<center> <math> x_{1} \geq 0 </math>, <math> x_{2} \geq 0 </math>. </center> <br/>
Convert the above linear program into standard form and find an initial basic feasible solution for the program in standard form. <br/>
:'''Click [[2016AC-3-1|here]] to view student [[2016AC-3-1|answers and discussions]]'''
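
''A quick numerical sanity check of one possible standard-form conversion (not part of the exam statement): the surplus variables <math>x_3, x_4</math> and slack variable <math>x_5</math> below are illustrative names, and the chosen basis is only a candidate to be verified.''
<pre>
import numpy as np

# One possible standard form (illustrative; variable names are not from the exam):
#   minimize 2*x1 + x2
#   s.t.  x1 + 3*x2 - x3            = 6   (surplus x3)
#         2*x1 + x2       - x4      = 4   (surplus x4)
#         x1 + x2              + x5 = 3   (slack x5)
#         x1, ..., x5 >= 0
A = np.array([[1, 3, -1,  0, 0],
              [2, 1,  0, -1, 0],
              [1, 1,  0,  0, 1]], dtype=float)
b = np.array([6, 4, 3], dtype=float)

# Candidate basis: the columns of x1, x2, x5 (indices 0, 1, 4).
basis = [0, 1, 4]
x = np.zeros(5)
x[basis] = np.linalg.solve(A[:, basis], b)

# Basic feasible if the equality constraints hold and all entries are nonnegative.
print(x, np.allclose(A @ x, b), np.all(x >= 0))
</pre>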
----
2. (20 pts)
*(15 pts) Find the largest range of the step size, <math> \alpha </math>, for which the fixed-step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function <br/>
<center> <math> f = \frac{1}{2} x^{T}Qx - b^{T}x </math> </center> <br/>
starting from an arbitrary initial condition <math> x^{(0)} \in \mathbb{R}^{n} </math>, where  <math> x \in \mathbb{R}^{n} </math>, and <math>Q = Q^{T} > 0</math>. <br/>
*(5 pts) Find the largest range of the step size, <math>\alpha</math>, for which the fixed-step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function<br/>
<center><math> f= 6x_{1}^{2}+2x_{2}^{2}-5 </math></center> <br/>
starting from an arbitrary initial condition <math>x^{(0)} \in \mathbb{R}^{n}</math>.
:'''Click [[2016AC-3-2|here]] to view student [[2016AC-3-2|answers and discussions]]'''
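
''A numerical illustration of both parts (assuming the standard result that, for <math>Q = Q^{T} > 0</math>, the fixed-step iteration <math>x^{(k+1)} = x^{(k)} - \alpha \nabla f(x^{(k)})</math> converges from every initial condition exactly when <math>0 < \alpha < 2/\lambda_{\max}(Q)</math>; for the second part the Hessian <math>Q = \mathrm{diag}(12, 4)</math> is read off from <math>f = 6x_1^2 + 2x_2^2 - 5</math>):''
<pre>
import numpy as np

# Second part of the question: f = 6*x1^2 + 2*x2^2 - 5, i.e. Q = diag(12, 4), b = 0.
Q = np.diag([12.0, 4.0])
b = np.zeros(2)
alpha_max = 2.0 / np.linalg.eigvalsh(Q).max()    # 2/lambda_max(Q) = 1/6 here
print("largest admissible step size:", alpha_max)

def gradient_descent(alpha, x0=np.array([5.0, -3.0]), iters=200):
    x = x0.copy()
    for _ in range(iters):
        x = x - alpha * (Q @ x - b)              # gradient of 0.5*x^T Q x - b^T x
    return x

print(gradient_descent(0.9 * alpha_max))   # inside the range: iterates approach the minimizer (0, 0)
print(gradient_descent(1.1 * alpha_max))   # outside the range: iterates grow without bound
</pre>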
  
----
3. (20 pts) Is the function <br/>
<center><math> f(x_{1}, x_{2})=\frac{1}{(x_{1}-2)^{2} + (x_{2}+1)^{2}+3} </math></center><br>
locally convex, concave, or neither in the neighborhood of the point <math> [2, -1]^{T} </math>? Justify your answer by giving all the details of your argument.
:'''Click [[2016AC-3-3|here]] to view student [[2016AC-3-3|answers and discussions]]'''
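
''One way to check this is to evaluate the Hessian of <math>f</math> at <math>[2, -1]^{T}</math> and examine its definiteness; a small SymPy sketch (not part of the exam statement):''
<pre>
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = 1 / ((x1 - 2)**2 + (x2 + 1)**2 + 3)

H = sp.hessian(f, (x1, x2))          # symbolic Hessian matrix
H0 = H.subs({x1: 2, x2: -1})         # Hessian evaluated at the point of interest
print(H0)                            # expect diag(-2/9, -2/9)
print(H0.eigenvals())                # both eigenvalues negative -> negative definite,
                                     # so f is locally concave near [2, -1]^T
</pre>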
  
----
4. (20 pts) Solve the following optimization problem:
<center>optimize <math> x_{1}x_{2} </math> </center><br>
<center>subject to <math> x_{1}+x_{2}+x_{3}=1 </math> </center><br>
<center><math> x_{1}+x_{2}-x_{3}=0 </math></center><br>
:'''Click [[2016AC-3-4|here]] to view student [[2016AC-3-4|answers and discussions]]'''
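
''A sketch of the first-order (Lagrange) conditions for this equality-constrained problem, solved symbolically with SymPy (the multiplier names are illustrative):''
<pre>
import sympy as sp

x1, x2, x3, l1, l2 = sp.symbols('x1 x2 x3 lambda1 lambda2')
f  = x1 * x2
g1 = x1 + x2 + x3 - 1
g2 = x1 + x2 - x3

# Lagrangian and its first-order conditions (stationarity plus feasibility).
L = f - l1 * g1 - l2 * g2
eqs = [sp.diff(L, v) for v in (x1, x2, x3)] + [g1, g2]
print(sp.solve(eqs, [x1, x2, x3, l1, l2], dict=True))
# The only stationary point is x1 = x2 = 1/4, x3 = 1/2, where x1*x2 = 1/16;
# second-order conditions are still needed to classify it.
</pre>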
----
5. (20 pts) Solve the following optimization problem:<br/>
<center>maximize <math> 14x_{1}-x_{1}^{2}+6x_{2}-x_{2}^{2}+7 </math></center><br>
<center>subject to <math>x_{1}+x_{2} \leq 2</math></center><br>
<center><math> x_{1}+2x_{2} \leq 3 </math></center>
:'''Click [[2016AC-3-5|here]] to view student [[2016AC-3-5|answers and discussions]]'''
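
''A numerical sanity check of a candidate KKT point, obtained by minimizing the negative of the objective under the two inequality constraints with SciPy's SLSQP solver (only a check of the candidate, not the analytical argument expected on the exam):''
<pre>
import numpy as np
from scipy.optimize import minimize

# Maximize by minimizing the negated objective.
obj = lambda x: -(14*x[0] - x[0]**2 + 6*x[1] - x[1]**2 + 7)

# SciPy's 'ineq' constraints take the form fun(x) >= 0.
cons = [{'type': 'ineq', 'fun': lambda x: 2 - x[0] - x[1]},     # x1 + x2  <= 2
        {'type': 'ineq', 'fun': lambda x: 3 - x[0] - 2*x[1]}]   # x1 + 2*x2 <= 3

res = minimize(obj, x0=np.zeros(2), constraints=cons, method='SLSQP')
print(res.x, -res.fun)   # candidate maximizer and the corresponding maximum value
</pre>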
----
----
  
[[ECE_PhD_Qualifying_Exams|Back to ECE QE page]]
