Homework 3 collaboration area

Question from student regarding HW #3:

When the directions state to find the spectrum, is it asking for the spectrum of the original matrix or the spectrum of the symmetric, skew-symmetric, orthogonal, hermitian, skew-hermitian, or unitary matrix?

Answer from Bell:

You'll find the definition of the spectrum on page 334. It is just another word for the set of all eigenvalues. The problem is asking you to find the spectrum for the original matrix. Then determine the type of the matrix and verify if the eigenvalues satisfy the conditions of Theorems 1 and 5.
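
As a quick numerical cross-check, here is a minimal sketch in Python/NumPy (the matrix is made up for illustration, not one of the homework matrices):

<PRE>
import numpy as np

# A made-up real symmetric matrix.
A = np.array([[2., 1.],
              [1., 2.]])

# The spectrum is the set of all eigenvalues.
spectrum = np.linalg.eigvals(A)
print(spectrum)             # 3 and 1 (order may vary)

# A is symmetric, and its eigenvalues came out real, which is
# what the theorem for real symmetric matrices predicts.
print(np.allclose(A, A.T))  # True
</PRE>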
  
 
Question from student regarding HW#3:

How much detail is sufficient on page 339, problem 30?

Answer from Bell:

You'll need to explain in words, using results from the book as evidence to back up your statements. It will take three or four sentences to explain, and your explanation should convince anyone who has read the book that you are correct. If somebody else in the class reads your explanation, they should say, "Oh yes! That makes perfect sense."

Question from student:

I am stuck on where to start with HW3, lesson 7, #30, the proof that the inverse of a square matrix exists iff no eigenvalues are zero.

Answer from Bell:

Start with the definition of an eigenvalue. If r is an eigenvalue, then

<math>\det(A-rI)=0.</math>

If r=0, what does that equation say about the matrix A?
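
If you want to experiment numerically while thinking about that, here is a minimal sketch in Python/NumPy (the matrix is made up for illustration):

<PRE>
import numpy as np

# A made-up matrix whose second row is twice its first,
# so its rows are linearly dependent.
A = np.array([[1., 2.],
              [2., 4.]])

print(np.linalg.det(A))      # 0.0, so A has no inverse
print(np.linalg.eigvals(A))  # 0 and 5 (up to round-off)
</PRE>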

Question:

Is this an alternative approach? Assume A^(-1) exists and multiply both sides of equation (3) from p. 335 by A^(-1) to get A^(-1)[(A-rI)x] = A^(-1) * 0. Then ask: what is A^(-1)? I got a solution that says A^(-1) = (1/r)*I, but this cannot be right... can it? --[[User:Rayala|Rayala]] 18:32, 13 September 2010 (UTC)

Answer:

In order to say that A^(-1) exists, you'll need to know that the determinant of A is non-zero. Once you establish that, when you multiply as you suggest, be sure to distribute through the sum properly and use the rules of scalar and matrix multiplication. When I do it, I get

<math>A^{-1}(A-rI)x=\left(A^{-1}A-A^{-1}(rI)\right)x=(I - r A^{-1})x=0.</math>

So

<math>rA^{-1}x = Ix=x.</math>

Now divide by r.
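
Carrying out that division (r cannot be zero here, since A^(-1) exists), the result is

<math>A^{-1}x = \frac{1}{r}\,x,</math>

which says that x is an eigenvector of A^(-1) with eigenvalue 1/r. It does not say A^(-1) = (1/r)I, because the equation holds only when applied to that particular eigenvector x, not to every vector.
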
Question from student:

Problem 11 on p. 348 asks whether the matrix is symmetric, skew-symmetric, or orthogonal. I found the eigenvalues to be a repeated real root, but the matrix is none of the three.

Answer from Bell:

That is correct. The book makes the point that if the matrix does not satisfy the conditions of the theorems, then the theorems do not apply. The eigenvalues may or may not satisfy any special conditions. You have done what the problem asks for if you add the clause, "Theorems 1 and 5 do not apply."

Question from student:

Problem 12 has some really nasty eigenvalues. Am I doing something wrong?

Answer from Bell:

You'll be surprised how nice it all turns out. Be sure to use the trig identity

<math>\cos^2\theta+\sin^2\theta=1.</math>

You'll need to use it once to simplify a cos^2 + sin^2 term, and again to clean up the square root of cos^2 - 1. (You'll get complex eigenvalues that way.)
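
For instance, for the standard 2x2 rotation matrix (used here only as a stand-in; it may not be the exact matrix in the book), the characteristic equation is

<math>\det\begin{pmatrix}\cos\theta-\lambda & -\sin\theta\\ \sin\theta & \cos\theta-\lambda\end{pmatrix}=\lambda^2-2\cos\theta\,\lambda+(\cos^2\theta+\sin^2\theta)=\lambda^2-2\cos\theta\,\lambda+1=0,</math>

and the quadratic formula then gives

<math>\lambda=\cos\theta\pm\sqrt{\cos^2\theta-1}=\cos\theta\pm i\sin\theta.</math>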

Question:

Page 348, problem 11: the book gives "defect 1" as part of the answer. What does this mean?

Answer:

You'll find the definition of "defect" at the bottom of page 337. It is the algebraic multiplicity minus the geometric multiplicity. A matrix with any positive defects is called defective. (Really.)
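
For example (a standard illustration, not a matrix from the book),

<math>\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}</math>

has eigenvalue 1 with algebraic multiplicity 2 but only one linearly independent eigenvector, so the geometric multiplicity is 1 and the defect is 2 - 1 = 1.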

Question:

I'm having trouble finding the eigenvalues for the 4x4 matrix of p. 338, #23. Any tips?

Answer:

Notice that the matrix is lower triangular. Hence, when you subtract lambda down the diagonal and take the determinant, you get the product of the diagonal entries -- all factored and ready to go. The roots are just the diagonal entries of the matrix.
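
A quick numerical check in Python/NumPy (the matrix below is made up for illustration, not the one from the book):

<PRE>
import numpy as np

# A made-up 4x4 lower triangular matrix.
A = np.array([[3., 0., 0., 0.],
              [1., 5., 0., 0.],
              [2., 4., 5., 0.],
              [7., 1., 6., 2.]])

# For a triangular matrix, the eigenvalues are the diagonal entries.
print(np.sort(np.diag(A)))                 # [2. 3. 5. 5.]
print(np.sort(np.linalg.eigvals(A).real))  # [2. 3. 5. 5.] (up to round-off)
</PRE>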

Question:

I have a question regarding diagonalization of a matrix. The X matrix is determined from the eigenvectors, but how do you know in which order to use them? I've noticed that the diagonal matrix is different depending on which order I choose. Can a matrix have more than one "diagonalized" matrix? Or are these different matrices I calculate somehow related?

Answer:

You can put those eigenvectors in any order (and if you don't normalize them, they can be multiplied by non-zero constants, too). Hence, X is not uniquely determined. It must, however, be a matrix whose columns are a linearly independent set of n eigenvectors for the matrix. (If an nxn matrix does not have a full set of n independent eigenvectors, it is not diagonalizable.)

The diagonalized matrix is unique up to the order the eigenvalues appear down the diagonal.

It's like choosing which coordinates to label x_1, x_2, x_3, etc. It is an arbitrary choice.
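
Here is a minimal sketch in Python/NumPy (the matrix is made up for illustration) showing that two different orderings of the eigenvector columns both diagonalize A, just with the eigenvalues appearing in a different order:

<PRE>
import numpy as np

# A made-up 2x2 matrix with eigenvalues 5 and 2.
A = np.array([[4., 1.],
              [2., 3.]])
w, V = np.linalg.eig(A)   # eigenvalues w; eigenvectors are the columns of V

X1 = V                    # one ordering of the eigenvector columns
X2 = V[:, ::-1]           # the same eigenvectors, columns swapped

D1 = np.linalg.inv(X1) @ A @ X1
D2 = np.linalg.inv(X2) @ A @ X2

# Both are diagonal; only the order of the eigenvalues differs.
print(np.round(D1, 10))
print(np.round(D2, 10))
</PRE>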

Question:

How can I find the roots of the characteristic polynomial for p. 348, #13? It is a mess.

Answer:

I ran the MAPLE commands

<PRE>
with(linalg);                                        # load the linear algebra package
a := matrix([[14-r,4,-2],[4,14-r,2],[-2,2,17-r]]);   # this is A - rI
det(a);                                              # the characteristic polynomial
solve( det(a)=0, r);                                 # its roots, i.e., the eigenvalues
</PRE>

and found the roots to be 18, 18, 9. After that, I noticed that if I add the second row of (A - rI) to the first row, I don't change the value of the determinant, but I get the top row in the determinant to be

<PRE>
18-r  18-r  0
</PRE>

so that if I expand along that row, I get an 18-r factored out nicely and the rest is easy.
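
Carrying that expansion out explicitly (factor 18-r out of the new top row first):

<math>\det(A-rI)=(18-r)\det\begin{pmatrix}1 & 1 & 0\\ 4 & 14-r & 2\\ -2 & 2 & 17-r\end{pmatrix}=(18-r)(r^2-27r+162)=-(r-18)^2(r-9),</math>

which vanishes exactly at r = 18 (a double root) and r = 9.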

Question:

Do I really have to compute the inverse of the matrix for p. 348, #12 to determine if the matrix is an orthogonal matrix?

Answer:

Theorem 3 on page 347 says that a matrix is orthogonal if and only if the rows form an orthonormal basis. Checking that those three row vectors are orthonormal is pretty easy.

Another way to check it is to verify that A * A^T = I. That's not hard either. (But don't calculate A^(-1) from scratch. That would be way too much work.)
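
In Python/NumPy, the A * A^T test looks like this (a sketch; the matrix here is a made-up rotation, not the one from the book):

<PRE>
import numpy as np

# A made-up orthogonal matrix: a rotation about the z-axis.
t = 0.3
A = np.array([[np.cos(t), -np.sin(t), 0.],
              [np.sin(t),  np.cos(t), 0.],
              [0.,         0.,        1.]])

# Orthogonality test: A * A^T should be the identity matrix.
print(np.allclose(A @ A.T, np.eye(3)))  # True
</PRE>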
  
 
[[2010 MA 527 Bell|Back to the MA 527 start page]]

[[Category:MA5272010Bell]]