Automatic Control (AC)
Question 3: Optimization
August 2016
1. (20 pts) Consider the following linear program,
Convert the above linear program into standard form and find an initial basic feasible solution for the program in standard form.
- Click here to view student answers and discussions
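Since the linear program itself did not survive on this page, the following is only a minimal sketch of the general conversion recipe, illustrated on a hypothetical program $ \max\, c^{T}x $ subject to $ Ax \leq b $, with $ x $ free and $ b \geq 0 $ (the actual exam data may differ). Negate the objective to obtain a minimization, split the free variable as $ x = u - v $ with $ u, v \geq 0 $, and add a slack vector $ s \geq 0 $ to turn the inequality into an equality:
$ \min\, -c^{T}(u - v) \quad \text{subject to} \quad A(u - v) + s = b, \quad u, v, s \geq 0. $
When $ b \geq 0 $, the slack columns form an identity basis, so $ u = v = 0 $, $ s = b $ is an initial basic feasible solution.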
2. (20 pts)
- (15 pts) Find the largest range of the step-size, $ \alpha $, for which the fixed step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function
$ f = \frac{1}{2} x^{T}Qx - b^{T}x $
starting from an arbitrary initial condition $ x^{(0)} \in \mathbb{R}^{n} $, where $ x \in \mathbb{R}^{n} $, and $ Q = Q^{T} > 0 $.
- (5 pts) Find the largest range of the step size, $ \alpha $, for which the fixed step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function
starting from an arbitrary initial condition $ x^{(0)} \in \mathbb{R}^{n} $
- Click here to view student answers and discussions
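For part (a), a sketch of the standard eigenvalue argument (assuming nothing beyond the statement above; see the linked student answers for full solutions): the fixed step gradient descent iteration is $ x^{(k+1)} = x^{(k)} - \alpha \nabla f(x^{(k)}) = x^{(k)} - \alpha (Qx^{(k)} - b) $. Writing $ x^{*} = Q^{-1}b $ for the minimizer and $ e^{(k)} = x^{(k)} - x^{*} $ for the error, the iteration gives $ e^{(k+1)} = (I - \alpha Q)e^{(k)} $. Convergence from every initial condition requires every eigenvalue of $ I - \alpha Q $ to have magnitude less than one, i.e. $ |1 - \alpha \lambda_{i}(Q)| < 1 $ for each eigenvalue $ \lambda_{i}(Q) > 0 $ of $ Q $, which holds exactly when $ 0 < \alpha < \frac{2}{\lambda_{\max}(Q)} $.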
3. (20 pts) Is the function
locally convex, concave, or neither in the neighborhood of the point $ [2 \;\; -1]^{T} $? Justify your answer by giving all the details of your argument.
- Click here to view student answers and discussions
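The function for question 3 did not carry over to this page, so only the general test can be recorded here. With $ F(x) = \nabla^{2} f(x) $ denoting the Hessian, $ F([2 \;\; -1]^{T}) \succ 0 $ (all eigenvalues, or equivalently all leading principal minors, positive) implies $ f $ is locally convex near the point, $ F([2 \;\; -1]^{T}) \prec 0 $ implies it is locally concave, and an indefinite Hessian means it is neither.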
4. (20 pts) Solve the following optimization problem:
- Click here to view student answers and discussions
5. (20 pts) Solve the following optimization problem:
- Click here to view student answers and discussions