QE2013_AC-3_ECE580-3

Part 1, 2, 3, 4, 5

(i)
Solution 1:
To remove the absolute values, each variable is separated into a positive component and a negative component. For example, for $ x_1 $:

$ x_1 = x_1^+ - x_1^-, \ \ \ \ x_1^+ ,x_1^- \ge 0, \ at\ least\ one\ of\ x_1^+,\ x_1^-\ is\ 0 $

Then $ |x_1| = x_1^+ + x_1^- $

Therefore the given problem can be converted into a linear programming problem:

$ maximize\ -x_1^+ -x_1^- -x_2^+ -x_2^- -x_3^+ -x_3^- \\ subject\ to \\ \begin{bmatrix} 1 & -1 & 1 & -1 & -1 & 1 \\ 0 & 0 & -1& 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} x_1^+ \\ x_1^- \\ x_2^+ \\ x_2^- \\ x_3^+ \\ x_3^- \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \end{bmatrix} \\ x_1^+ ,x_1^- ,x_2^+ ,x_2^- ,x_3^+ ,x_3^- \ge 0 $

This problem has the same solution as the original problem provided that, for i = 1, 2, 3, at least one of $ x_i^+ $ and $ x_i^- $ is 0.

The second constraint gives $ -x_2^+ +x_2^- = 1 \Rightarrow x_2^+ = 0,\ x_2^- = 1 $ (both components are nonnegative and at least one of them must be 0).

Therefore the first equality constraint simplifies to

$ x_1^+ -x_1^- -x_3^+ +x_3^- = 3 $

There are 4 cases to consider:

Case 1: $ x_1^+ = x_3^+ = 0 $

This is equivalent to finding $ x_1^-, x_3^- \ge 0\ to\ maximize\ -x_1^- -x_3^-\ subject\ to\ -x_1^- +x_3^- = 3 $

Optimal solution is $ x_1^- = 0, x_3^- = 3 $, in this case $ -x_1^+ -x_1^- -x_2^+ -x_2^- -x_3^+ -x_3^- = -4 $

The other 3 cases can be solved similarly:

Case 2: $ x_1^+ = x_3^- = 0 $

No solution as $ -x_1^- -x_3^+ = 3 $ cannot be satisfied (left side is non-positive).

Case 3: $ x_1^- = x_3^+ = 0 $

Every feasible solution is optimal in this case (i.e., any $ x_1^+, x_3^- $ satisfying $ x_1^+ +x_3^- = 3, x_1^+, x_3^-\ge 0 $); for each of them $ -x_1^+ -x_1^- -x_2^+ -x_2^- -x_3^+ -x_3^- = -4 $

Case 4: $ x_1^- = x_3^- = 0 $

Optimal solution is $ x_1^+ = 3, x_3^+ = 0 $, in this case $ -x_1^+ -x_1^- -x_2^+ -x_2^- -x_3^+ -x_3^- = -4 $


Therefore, the optimal solution is $ x_1^- = x_3^+ = 0,\ x_2^+ = 0,\ x_2^- = 1, $ and any $ x_1^+, x_3^- $ satisfying $ x_1^+ +x_3^- = 3, x_1^+, x_3^-\ge 0 $. In terms of the original variables: $ x_2 = -1 $ and any $ x_1, x_3 $ satisfying $ x_1 \ge 0,\ x_3 \le 0,\ |x_1| + |x_3| = 3 $.

In this case the objective function is $ -x_1^+ -x_1^- -x_2^+ -x_2^- -x_3^+ -x_3^- = -4 $, i.e., the minimum value of $ |x_1| + |x_2| + |x_3| $ is 4.
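As an independent numerical cross-check (not part of the original solution), the equivalent minimization LP can be handed to an off-the-shelf solver; a minimal sketch, assuming SciPy is available:

    import numpy as np
    from scipy.optimize import linprog

    # Minimize the sum of the split components (equivalent to maximizing its negative).
    c = np.ones(6)                                 # order: x1+, x1-, x2+, x2-, x3+, x3-
    A_eq = np.array([[1, -1,  1, -1, -1,  1],
                     [0,  0, -1,  1,  0,  0]])
    b_eq = np.array([2, 1])

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    print(res.fun)   # expected: 4.0, so the maximum of the negated objective is -4
    print(res.x)     # one optimal vertex (the optimum is not unique)

The solver returns a single optimal vertex, while the case analysis above describes the whole optimal set.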


Solution 2:

$ x_i = x_i^+ - x_i^-, \ \ \ \ x_i^+ ,x_i^- \ge 0 $

$ |x_i| = x_i^+ + x_i^- $

Then the problem is converted into a linear programming problem:

$ min\ x_1^+ +x_1^- +x_2^+ +x_2^- +x_3^+ +x_3^- \\ subject\ to \\ \begin{bmatrix} 1 & -1 & 1 & -1 & -1 & 1 \\ 0 & 0 & -1& 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} x_1^+ \\ x_1^- \\ x_2^+ \\ x_2^- \\ x_3^+ \\ x_3^- \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \end{bmatrix} \\ x_1^+ ,x_1^- ,x_2^+ ,x_2^- ,x_3^+ ,x_3^- \ge 0 $

$ \begin{array}{|c|c|c|c|c|c|c|} x_1^+ & x_1^- & x_2^+ & x_2^- & x_3^+ & x_3^- & b\\ \hline 1 &-1 & 1 &-1 &-1 & 1 & 2\\ 0 & 0 &-1 & 1 & 0 & 0 & 1\\ 1 & 1 & 1 & 1 & 1 & 1 & 0\\ \end{array} $

Introduce an artificial variable $ x_4 $ in the second constraint ($ x_1^+ $ already supplies a basic column for the first constraint) and apply the two-phase simplex method.

Phase 1:

$ \begin{array}{|c|c|c|c|c|c|c|c|} x_1^+ & x_1^- & x_2^+ & x_2^- & x_3^+ & x_3^- & x_4 & b\\ \hline 1 &-1 & 1 &-1 &-1 & 1 & 0 & 2\\ 0 & 0 &-1 & 1 & 0 & 0 & 1 & 1\\ 0 & 0 & 0 & 0 & 0 & 0 & 1 & 0\\ \end{array} $

First make the reduced cost of the basic variable $ x_4 $ zero by subtracting the second row from the cost row:

$ \begin{array}{|c|c|c|c|c|c|c|c|} x_1^+ & x_1^- & x_2^+ & x_2^- & x_3^+ & x_3^- & x_4 & b\\ \hline 1 &-1 & 1 &-1 &-1 & 1 & 0 & 2\\ 0 & 0 &-1 & 1 & 0 & 0 & 1 & 1\\ 0 & 0 & 1 &-1 & 0 & 0 & 0 &-1\\ \end{array} $

The only negative reduced cost is in the $ x_2^- $ column, so $ x_2^- $ enters the basis and, by the ratio test, $ x_4 $ leaves:

$ \begin{array}{|c|c|c|c|c|c|c|c|} x_1^+ & x_1^- & x_2^+ & x_2^- & x_3^+ & x_3^- & x_4 & b\\ \hline 1 &-1 & 0 & 0 &-1 & 1 & 1 & 3\\ 0 & 0 &-1 & 1 & 0 & 0 & 1 & 1\\ 0 & 0 & 0 & 0 & 0 & 0 & 1 & 0\\ \end{array} $

The phase-1 objective is now 0, so the artificial variable has been driven out and a basic feasible solution of the original problem is available.

Phase 2:

Drop the artificial column and restore the original cost row:

$ \begin{array}{|c|c|c|c|c|c|c|} x_1^+ & x_1^- & x_2^+ & x_2^- & x_3^+ & x_3^- & b\\ \hline 1 &-1 & 0 & 0 &-1 & 1 & 3\\ 0 & 0 &-1 & 1 & 0 & 0 & 1\\ 1 & 1 & 1 & 1 & 1 & 1 & 0\\ \end{array} $


Eliminating the basic columns $ x_1^+ $ and $ x_2^- $ from the cost row gives

$ \begin{array}{|c|c|c|c|c|c|c|} x_1^+ & x_1^- & x_2^+ & x_2^- & x_3^+ & x_3^- & b\\ \hline 1 &-1 & 0 & 0 &-1 & 1 & 3\\ 0 & 0 &-1 & 1 & 0 & 0 & 1\\ 0 & 2 & 2 & 0 & 2 & 0 &-4\\ \end{array} $

All reduced costs are nonnegative, so the current basic feasible solution is optimal.

Optimal solution: $ x^* = [3, -1, 0] $, cost = 4 (one point of the optimal set found in Solution 1).
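The cost-row reduction in the final tableau above can be reproduced with a few lines of NumPy (my own check, not part of the original solution):

    import numpy as np

    # Phase-2 starting tableau rows; basic variables: x1+ (first row), x2- (second row).
    row1 = np.array([1, -1,  0, 0, -1, 1, 3], dtype=float)
    row2 = np.array([0,  0, -1, 1,  0, 0, 1], dtype=float)
    cost = np.array([1,  1,  1, 1,  1, 1, 0], dtype=float)

    # Zero out the reduced costs of the basic columns x1+ and x2-.
    reduced = cost - row1 - row2
    print(reduced)   # expected: [ 0.  2.  2.  0.  2.  0. -4.]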

(ii)
Solution 1:

Maximizing the objective in (i) is the same as minimizing $ x_1^+ +x_1^- +x_2^+ +x_2^- +x_3^+ +x_3^- $ subject to the same equality constraints, so the primal is a standard-form linear program.

Therefore, using the asymmetric form of duality, the dual problem is:

$ maximize\ \lambda^T \begin{bmatrix} 2 \\ 1 \end{bmatrix} \\ subject\ to \\ \lambda^T \begin{bmatrix} 1 & -1 & 1 & -1 & -1 & 1 \\ 0 & 0 & -1& 1 & 0 & 0 \end{bmatrix} \le \begin{bmatrix} 1 & 1 & 1 & 1 & 1 & 1 \end{bmatrix} $

After simplifying, this becomes:

$ maximize\ 2 \lambda_1 + \lambda_2 \\ subject\ to \\ |\lambda_1| \le 1 \\ |\lambda_1 - \lambda_2| \le 1 $

Since the objective increases in both $ \lambda_1 $ and $ \lambda_2 $, take $ \lambda_1 $ at its largest allowed value, $ \lambda_1 = 1 $, and then $ \lambda_2 = \lambda_1 + 1 = 2 $. Therefore the optimal solution is $ \lambda_1 = 1, \lambda_2 = 2 $, and $ 2 \lambda_1 + \lambda_2 = 4 $.
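The simplified dual can also be checked numerically; a minimal sketch assuming SciPy (the dual variables are free, hence the explicit unbounded bounds):

    import numpy as np
    from scipy.optimize import linprog

    # maximize 2*l1 + l2  <=>  minimize -2*l1 - l2
    c = np.array([-2.0, -1.0])
    # |l1| <= 1 and |l1 - l2| <= 1, written as four one-sided inequalities
    A_ub = np.array([[ 1.0,  0.0],
                     [-1.0,  0.0],
                     [ 1.0, -1.0],
                     [-1.0,  1.0]])
    b_ub = np.ones(4)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(None, None), method="highs")
    print(-res.fun)  # expected: 4.0
    print(res.x)     # expected: [1. 2.]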


Solution 2:

$ max\ 2 \lambda_1 + \lambda_2 \Rightarrow min\ -2 \lambda_1 - \lambda_2 \\ subject\ to \\ \lambda^T \begin{bmatrix} 1 & -1 & 1 & -1 & -1 & 1 \\ 0 & 0 & -1& 1 & 0 & 0 \end{bmatrix} \le \begin{bmatrix} 1 & 1 & 1 & 1 & 1 & 1 \end{bmatrix} $

Introduce slack variables $ \lambda_3, \ldots, \lambda_8 $ to convert the inequality constraints into equalities:

$ \lambda_1 + \lambda_3 = 1 \\ -\lambda_1 + \lambda_4 = 1 \\ \lambda_1 - \lambda_2 + \lambda_5 = 1 \\ -\lambda_1 + \lambda_2 + \lambda_6 = 1 \\ -\lambda_1 + \lambda_7 = 1 \\ \lambda_1 + \lambda_8 = 1 \\ $


Set up the simplex tableau with $ \lambda_3, \ldots, \lambda_8 $ as the initial basis (the bottom row is the cost row of $ -2\lambda_1 - \lambda_2 $):

$ \begin{array}{|c|c|c|c|c|c|c|c|c|} \lambda_1 & \lambda_2 & \lambda_3 & \lambda_4 & \lambda_5 & \lambda_6 & \lambda_7 & \lambda_8 & c\\ \hline 1 & 0 & 1 & 0 & 0 & 0 & 0 & 0 & 1\\ -1 & 0 & 0 & 1 & 0 & 0 & 0 & 0 & 1\\ 1 &-1 & 0 & 0 & 1 & 0 & 0 & 0 & 1\\ -1 & 1 & 0 & 0 & 0 & 1 & 0 & 0 & 1\\ -1 & 0 & 0 & 0 & 0 & 0 & 1 & 0 & 1\\ 1 & 0 & 0 & 0 & 0 & 0 & 0 & 1 & 1\\ -2 &-1 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\ \end{array} $

$ \lambda_1 $ has the most negative reduced cost, so it enters the basis; the ratio test allows the first row, so $ \lambda_3 $ leaves:

$ \begin{array}{|c|c|c|c|c|c|c|c|c|} \lambda_1 & \lambda_2 & \lambda_3 & \lambda_4 & \lambda_5 & \lambda_6 & \lambda_7 & \lambda_8 & c\\ \hline 1 & 0 & 1 & 0 & 0 & 0 & 0 & 0 & 1\\ 0 & 0 & 1 & 1 & 0 & 0 & 0 & 0 & 2\\ 0 &-1 &-1 & 0 & 1 & 0 & 0 & 0 & 0\\ 0 & 1 & 1 & 0 & 0 & 1 & 0 & 0 & 2\\ 0 & 0 & 1 & 0 & 0 & 0 & 1 & 0 & 2\\ 0 & 0 &-1 & 0 & 0 & 0 & 0 & 1 & 0\\ 0 &-1 & 2 & 0 & 0 & 0 & 0 & 0 & 2\\ \end{array} $

$ \lambda_2 $ enters next; the only positive entry in its column is in the fourth row, so $ \lambda_6 $ leaves:

$ \begin{array}{|c|c|c|c|c|c|c|c|c|} \lambda_1 & \lambda_2 & \lambda_3 & \lambda_4 & \lambda_5 & \lambda_6 & \lambda_7 & \lambda_8 & c\\ \hline 1 & 0 & 1 & 0 & 0 & 0 & 0 & 0 & 1\\ 0 & 0 & 1 & 1 & 0 & 0 & 0 & 0 & 2\\ 0 & 0 & 0 & 0 & 1 & 1 & 0 & 0 & 2\\ 0 & 1 & 1 & 0 & 0 & 1 & 0 & 0 & 2\\ 0 & 0 & 1 & 0 & 0 & 0 & 1 & 0 & 2\\ 0 & 0 &-1 & 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 3 & 0 & 0 & 1 & 0 & 0 & 4\\ \end{array} $

All reduced costs are now nonnegative, so the tableau is optimal.

Optimal solution: $ \lambda^* = \begin{bmatrix} 1 \\ 2 \end{bmatrix} $, cost = 4, which equals the optimal primal cost from part (i), as duality requires.
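The two pivots above can be replayed mechanically; a short NumPy sketch (the pivot helper is my own, not part of the original solution) reproduces the final tableau, including the value 4 in the bottom-right corner:

    import numpy as np

    def pivot(T, r, c):
        """Return a copy of tableau T after a Gauss-Jordan pivot on element (r, c)."""
        T = T.astype(float)
        T[r] = T[r] / T[r, c]
        for i in range(T.shape[0]):
            if i != r:
                T[i] = T[i] - T[i, c] * T[r]
        return T

    # Columns: lambda_1 ... lambda_8, then the RHS; the last row is the cost row.
    T0 = np.array([[ 1,  0, 1, 0, 0, 0, 0, 0, 1],
                   [-1,  0, 0, 1, 0, 0, 0, 0, 1],
                   [ 1, -1, 0, 0, 1, 0, 0, 0, 1],
                   [-1,  1, 0, 0, 0, 1, 0, 0, 1],
                   [-1,  0, 0, 0, 0, 0, 1, 0, 1],
                   [ 1,  0, 0, 0, 0, 0, 0, 1, 1],
                   [-2, -1, 0, 0, 0, 0, 0, 0, 0]])

    T1 = pivot(T0, 0, 0)   # lambda_1 enters, lambda_3 leaves
    T2 = pivot(T1, 3, 1)   # lambda_2 enters, lambda_6 leaves
    print(T2)              # final tableau; bottom-right entry 4 is the optimal dual value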


Comment: Solution 2 uses the formal procedure of the simplex method, which is more involved but applies to more general problems. Solution 1 is less general but simpler for this particular problem. Both approaches give the same result.
Back to QE2013 AC-3 ECE580
