Automatic Control (AC)
Question 3: Optimization
August 2016
1. (20 pts) Consider the following linear program,

$ x_{1} \geq 0 $, $ x_{2} \geq 0 $.

Convert the above linear program into standard form and find an initial basic feasible solution for the program in standard form.
- Click here to view student answers and discussions
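The objective and main constraints of this LP did not survive extraction on this page; only the nonnegativity conditions are visible. As a purely illustrative sketch, the Python snippet below applies the mechanics the question asks for to a hypothetical LP (maximize 2x1 + 3x2 subject to x1 + x2 <= 4, x1 - x2 <= 1, x1, x2 >= 0): negate the objective, add slack variables to reach standard form, and read an initial basic feasible solution off the identity columns formed by the slacks.

```python
# Hypothetical LP (the exam's actual data is not recoverable from this page):
#   maximize 2*x1 + 3*x2  subject to  x1 + x2 <= 4,  x1 - x2 <= 1,  x1, x2 >= 0.
# Standard form: minimize c^T x  subject to  A x = b,  x >= 0,
# obtained by negating the objective and adding one slack variable per inequality.
import numpy as np

c = np.array([-2.0, -3.0, 0.0, 0.0])           # maximize -> minimize, slacks cost 0
A = np.array([[1.0,  1.0, 1.0, 0.0],           # x1 + x2 + s1 = 4
              [1.0, -1.0, 0.0, 1.0]])          # x1 - x2 + s2 = 1
b = np.array([4.0, 1.0])

# The slack columns form an identity, so B = {s1, s2} is a basis and
# x = (0, 0, 4, 1) is an initial basic feasible solution (b >= 0).
basic = [2, 3]
x_bfs = np.zeros(4)
x_bfs[basic] = np.linalg.solve(A[:, basic], b)
print("initial BFS:", x_bfs)                   # [0. 0. 4. 1.]
print("objective at BFS:", c @ x_bfs)          # 0.0
```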
2. (20 pts)
- (15 pts) Find the largest range of the step size, $ \alpha $, for which the fixed step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function

$ f = \frac{1}{2} x^{T}Qx - b^{T}x $

starting from an arbitrary initial condition $ x^{(0)} \in \mathbb{R}^{n} $, where $ x \in \mathbb{R}^{n} $, and $ Q = Q^{T} > 0 $.
- (5 pts) Find the largest range of the step size, $ \alpha $, for which the fixed step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function

$ f = 6x_{1}^{2}+2x_{2}^{2}-5 $

starting from an arbitrary initial condition $ x^{(0)} \in \mathbb{R}^{n} $
- Click here to view student answers and discussions
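For a quadratic with $ Q = Q^{T} > 0 $, the fixed step iteration $ x^{(k+1)} = x^{(k)} - \alpha (Qx^{(k)} - b) $ converges to the minimizer from every initial condition exactly when $ 0 < \alpha < 2/\lambda_{max}(Q) $. For part (b), $ f = 6x_{1}^{2}+2x_{2}^{2}-5 $ has Hessian $ Q = \text{diag}(12, 4) $, so the predicted range is $ 0 < \alpha < 1/6 $. The snippet below is a minimal numerical check of that bound; the step sizes, starting point, and iteration count are arbitrary choices, not part of the exam.

```python
# Numerical check of the standard fixed-step bound 0 < alpha < 2 / lambda_max(Q)
# for f(x) = 0.5 x^T Q x - b^T x.  Part (b)'s f = 6 x1^2 + 2 x2^2 - 5 has
# Q = diag(12, 4) and b = 0, so the bound predicts 0 < alpha < 1/6.
import numpy as np

Q = np.diag([12.0, 4.0])
b = np.zeros(2)
alpha_max = 2.0 / np.max(np.linalg.eigvalsh(Q))     # = 1/6

def run_gd(alpha, x0=np.array([5.0, -3.0]), iters=500):
    x = x0.copy()
    for _ in range(iters):
        x = x - alpha * (Q @ x - b)                 # gradient step on the quadratic
    return np.linalg.norm(x)                        # distance to the minimizer x* = 0

print("alpha_max =", alpha_max)
print("just below the bound:", run_gd(0.95 * alpha_max))   # tiny: converges
print("just above the bound:", run_gd(1.05 * alpha_max))   # huge: diverges
```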
3. (20 pts) Is the function

$ f(x_{1}, x_{2})=\frac{1}{(x_{1}-2)^{2} + (x_{2}+1)^{2}+3} $

locally convex, concave, or neither in the neighborhood of the point $ [2 -1]^{T} $? Justify your answer by giving all the details of your argument.
- Click here to view student answers and discussions
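One standard way to settle local convexity or concavity is to evaluate the Hessian at the point in question. The SymPy sketch below (a machine check, not a substitute for the hand argument the question asks for) computes the Hessian of $ f $ at $ [2 -1]^{T} $; because the gradient of the denominator vanishes there, the Hessian comes out to $ -\frac{2}{9}I $, which is negative definite, so $ f $ is locally (strictly) concave around that point.

```python
# Symbolic check of the Hessian of f(x1, x2) = 1 / ((x1-2)^2 + (x2+1)^2 + 3)
# at the point [2, -1]^T.  Negative definite Hessian => locally (strictly) concave.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = 1 / ((x1 - 2)**2 + (x2 + 1)**2 + 3)

H = sp.hessian(f, (x1, x2))
H_at_point = H.subs({x1: 2, x2: -1})
print(H_at_point)               # Matrix([[-2/9, 0], [0, -2/9]])
print(H_at_point.eigenvals())   # {-2/9: 2}  -> both eigenvalues negative
```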
4. (20 pts) Solve the following optimization problem:

subject to $ x_{1}+x_{2}+x_{3}=1 $

$ x_{1}+x_{2}-x_{3}=0 $
- Click here to view student answers and discussions
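The objective of this problem is not visible on this page; only the two equality constraints survived. Purely as a template, the SymPy sketch below sets up the Lagrangian first order conditions for an equality constrained problem with these constraints, using a hypothetical objective $ f = x_{1}^{2}+x_{2}^{2}+x_{3}^{2} $ in place of the lost one.

```python
# Lagrange-multiplier setup for an equality-constrained problem of the form
#   minimize f(x)  subject to  x1 + x2 + x3 = 1,  x1 + x2 - x3 = 0.
# The exam's objective is not recoverable here, so a hypothetical
# f(x) = x1^2 + x2^2 + x3^2 is used purely to show the mechanics.
import sympy as sp

x1, x2, x3, l1, l2 = sp.symbols('x1 x2 x3 lambda1 lambda2', real=True)
f = x1**2 + x2**2 + x3**2                      # hypothetical objective
h1 = x1 + x2 + x3 - 1                          # equality constraints (= 0)
h2 = x1 + x2 - x3

L = f + l1 * h1 + l2 * h2                      # Lagrangian
stationarity = [sp.diff(L, v) for v in (x1, x2, x3)]
solution = sp.solve(stationarity + [h1, h2], [x1, x2, x3, l1, l2], dict=True)
print(solution)    # for this stand-in f: x1 = x2 = 1/4, x3 = 1/2
```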
5. (20 pts) Solve the following optimization problem:

subject to $ x_{1}+x_{2} \leq 2 $

$ x_{1}+2x_{2} \leq 3 $
- Click here to view student answers and discussions
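As with problem 4, the objective here is not recoverable from this page; only the two inequality constraints are visible. The snippet below is therefore just a template for numerically sanity checking a KKT based hand solution, with a hypothetical objective $ (x_{1}-3)^{2}+(x_{2}-2)^{2} $ standing in for the exam's.

```python
# Template for numerically checking a solution of
#   minimize f(x)  subject to  x1 + x2 <= 2,  x1 + 2*x2 <= 3.
# The exam's objective is lost here, so a hypothetical
# f(x) = (x1 - 3)^2 + (x2 - 2)^2 stands in for it.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return (x[0] - 3.0)**2 + (x[1] - 2.0)**2   # hypothetical objective

constraints = [
    {"type": "ineq", "fun": lambda x: 2.0 - x[0] - x[1]},        # x1 + x2 <= 2
    {"type": "ineq", "fun": lambda x: 3.0 - x[0] - 2.0 * x[1]},  # x1 + 2*x2 <= 3
]

result = minimize(f, x0=np.zeros(2), constraints=constraints)
print(result.x)   # for this stand-in f, roughly [1.5, 0.5]
```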