Matrix Analysis, by Roger A. Horn (The Johns Hopkins University) and Charles R. Johnson, Cambridge University Press. Horn and Johnson are also the authors of the companion volume Topics in Matrix Analysis (Cambridge University Press).
Linear algebra and matrix theory are fundamental tools in the mathematical and physical sciences. Matrix Analysis, 2nd edition, Roger A. Horn and Charles R. Johnson.
Readers should consult the index to find the definition of an unfamiliar term. Throughout the book, it is often useful to step back from a question about a given matrix to a question about some intrinsic property of the linear transformation of which the matrix is only one of many possible representations.
A second key concept in this chapter is the notion of eigenvector and eigenvalue: nonzero vectors x such that Ax is a scalar multiple of x play a major role in analyzing the structure of a matrix or linear transformation. A fundamental concept in matrix analysis is the set of eigenvalues of a square complex matrix. The notion of similarity is also a key concept in this chapter. Every invertible matrix is a change-of-basis matrix, so similar but not identical matrices are just different basis representations of a single linear transformation. One would therefore expect similar matrices to share many important properties. The interplay between these two concepts is a recurring theme.
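The defining relation Ax = λx can be checked numerically. A minimal sketch using NumPy (the 2-by-2 matrix here is an illustrative choice, not one from the text):

```python
import numpy as np

# A small illustrative symmetric matrix (chosen for this sketch).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eig returns eigenvalues and right eigenvectors (as columns).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A x = lambda x for each eigenvalue-eigenvector pair.
for k in range(A.shape[0]):
    lam = eigenvalues[k]
    x = eigenvectors[:, k]
    assert np.allclose(A @ x, lam * x)
```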
Topics in Matrix Analysis
If this system has a nontrivial solution. Explain why the standard basis vectors ei. It is a key element of the definition that an eigenvector can never be the zero vector.
Problems 1. Definition 1. With what eigenvalue is each eigenvector ei associated? Equation 1. In general.
Consider the n-vector e whose entries are all equal to 1. Find an eigenvector associated with the eigenvalue 5. Sometimes the structure of a matrix makes an eigenvector easy to perceive. If Exercise. Even if they had no other importance. Let Jn be the n-by-n matrix whose entries are all equal to 1.
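The all-ones matrix Jn and the all-ones vector e from the exercise can be verified directly. A small sketch, taking n = 5 so that e is an eigenvector associated with the eigenvalue 5:

```python
import numpy as np

n = 5
J = np.ones((n, n))   # the n-by-n all-ones matrix J_n from the exercise
e = np.ones(n)        # the n-vector e whose entries are all equal to 1

# J e = n e, so e is an eigenvector of J_n associated with the eigenvalue n.
assert np.allclose(J @ e, n * e)

# J_n has rank 1, so its remaining n - 1 eigenvalues are all 0.
eigs = np.sort(np.linalg.eigvalsh(J))
```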
A polynomial 1. Use eigenvectors. Let p t be a given polynomial of degree k. The factorization 1. The fundamental theorem of algebra Appendix C ensures that any monic polynomial 1. There is an alternative way to represent p A that has very important consequences.
Theorem 1. In the product 1. The first assertion in Theorem 1. Show that e2 is an eigenvector of A2 but not of A. Observation 1. The identity 1. Let m be the least integer k such that the vectors y.
Ak y are linearly dependent. Then A has an eigenvalue. In fact. Show that A has no eigenvectors other than scalar multiples of e1. What is A2? Show that e1 is an eigenvector of A and of A2.
Let a0. According to 1. P8 Explain how the argument in 1.
You must show three things: Explain why at least one of u. Give an example of a nonzero nilpotent matrix. Show that each eigenvalue of A is either 0 or 1. P6 Show that all eigenvalues of a nilpotent matrix are 0. Must both u and v be eigenvectors of A? Can A have a real eigenvector associated with an eigenvalue that is not real?
Explain why 0 is the only nilpotent idempotent matrix. Explain why I is the only nonsingular idempotent matrix.
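The claims about idempotent matrices can be illustrated numerically. A sketch with an illustrative 2-by-2 idempotent matrix (not one from the text):

```python
import numpy as np

# An idempotent (projection) matrix: P @ P == P.  Chosen for illustration.
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(P @ P, P)

# Each of its eigenvalues is either 0 or 1, as the exercise asserts.
eigs = np.linalg.eigvals(P)
assert all(np.isclose(l, 0) or np.isclose(l, 1) for l in eigs)
```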
In the next section. Equal to what? Verify that S is a linear transformation. P9 Use the definition 1. P10 Provide details for the following example. Each summand in the presentation 0.
Use this method to find eigenvectors of the matrix 1. Thought of as a formal polynomial in t. The degree of a summand can be n. Use 1. Show that x is an eigenvector of adj A. How may they be characterized in a systematic way?
Rewrite the eigenvalue—eigenvector equation 1. The remaining assertion is the equivalence of 1. How should we account for such repetitions in an enumeration of the eigenvalues? If each zero of p A t has multiplicity 1. Example 1. For clarity. If these two statements are to remain true even if some zeroes of p A t have multiplicity greater than 1.
If A is real. Let x. Using 0. When we refer to the distinct eigenvalues of A. Since we now know that each n-by-n complex matrix has finitely many eigenvalues. The sum of its principal minors of size k there are nk of them is denoted by E k A. Explain why. What if T is lower triangular?
What if T is diagonal? Sometimes the structure of a matrix makes the characteristic polynomial easy to calculate. Explain why the preceding exercise is a special case of this one.
This observation is the basis of many algorithms to compute eigenvalues. This is the case for diagonal or triangular matrices. A Use 0. Use 0.
If some eigenvalue of A is nonzero. This important fact often permits us to use continuity arguments to deduce results about singular matrices from properties of nonsingular matrices. A calculation with 1. The next theorem shows that a singular complex matrix can always be shifted slightly to become nonsingular. If all the eigenvalues of A are zero. P5 to show that every coefficient of p A t is an integer positive.
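The shifting argument can be seen numerically: a singular matrix becomes nonsingular after adding tI for any small t that is not the negative of an eigenvalue. A sketch with an illustrative singular matrix:

```python
import numpy as np

# A singular matrix (the second row is twice the first).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
assert np.isclose(np.linalg.det(A), 0.0)

# Shift by t*I for a small t avoiding the negatives of the eigenvalues;
# the shifted matrix is nonsingular.
t = 1e-3
det_shifted = np.linalg.det(A + t * np.eye(2))
assert abs(det_shifted) > 0
```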
Conclude that the determinant function on Mn is similarity invariant.
Must it have multiplicity 1? P7 Use 1. What is the characteristic polynomial of a nilpotent matrix? P5 Use 1. P6 to show that the trace of a nilpotent matrix is 0. Consider the converse: For any A. An eigenvalue of a linear transformation T: P11 Let V be a vector space over a field F. P13 Let x. Sums of principal minors arose in our discussion of the coefficients of a characteristic polynomial. P17 Let A. Products of principal minors also arise in a natural way. P22 Consider the n-by-n circulant matrix Cn in 0.
Explain why det A is a positive integer.
Similarity is an equivalence relation on Mn; like any equivalence relation, it partitions Mn into equivalence classes, and each equivalence class is the set of all matrices in Mn similar to a given matrix. All matrices in an equivalence class are similar. The crucial observation is that matrices in a similarity class share many important properties; some of these are mentioned here. If B is similar to A, then A and B have the same eigenvalues.
Verify the assertions in the preceding corollary. If B is a diagonal matrix. Jordan canonical form is in Chapter 3. Show that q A and q B are similar. Suppose that A. Consider 00 10 and 00 If A is similar to a matrix of the form 1.
Explain why rank is a similarity invariant: The final assertions about the eigenvalues follow from an examination of the characteristic polynomials: The matrix A is diagonalizable if and only if there are n linearly independent vectors. Since diagonal matrices are especially simple and have very nice properties. Find all n of the eigenvalues of A. Then the vectors x 1. Since all the eigenvalues are distinct.
Let q t be a given polynomial. If A is diagonalizable. Diagonalizability is assured if all the eigenvalues are distinct. If q A is diagonalizable. If it were diagonalizable. Lemma 1. Show that 00 10 is not diagonalizable. If there are k linearly independent vectors in Cn.
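Both halves of this discussion can be illustrated: a matrix with distinct eigenvalues diagonalizes as A = S D S⁻¹, while the standard nilpotent example does not. A sketch (matrices chosen here for illustration):

```python
import numpy as np

# A matrix with distinct eigenvalues is diagonalizable: A = S D S^{-1}.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
eigvals, S = np.linalg.eig(A)
D = np.diag(eigvals)
assert np.allclose(A, S @ D @ np.linalg.inv(S))

# The nilpotent matrix with rows (0, 1) and (0, 0) is not diagonalizable:
# both eigenvalues are 0 but there is only one independent eigenvector.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
_, V = np.linalg.eig(N)
assert np.linalg.matrix_rank(V) == 1   # eigenvector matrix is rank-deficient
```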
The basis for this fact is the following important lemma about some of the eigenvalues. For the converse. Bd are all diagonalizable. Give an example of a diagonalizable matrix that does not have distinct eigenvalues. The latter observation can be generalized somewhat. Bd is diagonalizable. The converse is included in an earlier exercise. Diagonal matrices commute. F is irreducible. Then A and B commute if and only if they are simultaneously diagonalizable.
Assume that A and B commute. Central to our investigation is the notion of an invariant subspace and the companion notion of a block triangular matrix. Definitions 1. Since B is diagonalizable. Choose a basis s1. S has linearly independent columns. If W is A-invariant.
The former is the linear algebra side. But we can say a little more: The following lemma is at the heart of many subsequent results.
We summarize the foregoing discussion in the following observation. By the induction hypothesis. Tk of appropriate size. There is always a nonzero F-invariant subspace. If every matrix in F is a scalar matrix. If F is simultaneously diagonalizable. Using the argument in 1. Consider the nonzero F-invariant subspace W in the preceding proof. Let B. We prove the converse by induction on n.
Since W is Finvariant. Our next result shows that Theorem 1. Then F is a commuting family if and only if it is a simultaneously diagonalizable family. But W A. Consider the subspace W A. Using commutativity of F. We defer two important issues until Chapter 3: Then the direct sum 1. These important facts follow from a simple but very useful observation. Although AB and B A need not be the same and need not be the same size even when both products are defined.
The eigenvalues of C2 are the eigenvalues of B A together with m zeroes. Since the eigenvalues of C1 and C2 are the same. Eigenvalues of a low-rank matrix. Then the eigenvalues of A are the same as those of the r -by-r matrix Y T X. The eigenvalues of C1 are the eigenvalues of AB together with n zeroes. Here are just four. For b. If C is singular.
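The relation between the spectra of AB and BA can be checked numerically; the nonzero eigenvalues coincide, and the larger product carries the extra zeros. A sketch with illustrative 3-by-2 and 2-by-3 matrices:

```python
import numpy as np

# A is 3-by-2 and B is 2-by-3, so AB is 3-by-3 while BA is 2-by-2.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

eig_AB = np.sort(np.linalg.eigvals(A @ B).real)
eig_BA = np.sort(np.linalg.eigvals(B @ A).real)

# AB and BA have the same nonzero eigenvalues; AB carries one extra zero.
assert abs(eig_AB[1]) < 1e-9
assert np.allclose([eig_AB[0], eig_AB[2]], eig_BA)
```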
Since p t has only finitely many zeroes in the complex plane. If real matrices are similar via a complex matrix. If C is nonsingular. The following lemma is the key to answering such questions. Equating the real and imaginary parts of this identity shows that. Td ] are partitioned conformally to. Is there a real version of 1. The assertion c is a special case of a and b. We must prove its sufficiency. Corollary 1. Let L s. We proceed by induction. P1 Let A.
P2 If A. Does this contradict 1. Suppose that A and B are diagonalizable and commute. This provides a simple way to evaluate p(A) if one can diagonalize A.
P5 Give an example of two commuting matrices that are not simultaneously diagonalizable. Conclude that p A A is the zero matrix. Show that AB and B A are not similar but that they do have the same eigenvalues. Show that the list x1 1.
Conclude that A and B have a common eigenvector. P11 Provide details for the following alternative proof of 1. P12 Let A.
A2m y. See 1. A3m y. If AB 1. Select a maximal linearly independent set and explain why a common eigenvector for this finite set of matrices is a common eigenvector for F.
P8 If A. One direction is easy. Suppose that k is the smallest positive integer such that B k x is a linear combination of its predecessors. Consider the sequence of vectors x.
Is this true for two matrices that are not both diagonalizable? P13 Show that two diagonalizable matrices are similar if and only if their characteristic polynomials are the same. A is rank principal 0. If A is rank principal. P18 Suppose that A. What about the converse? Show that A and B are similar over C if and only if they are similar over R.
Deduce from this identity that the inverse of a nonsingular 2-by-2 block centrosymmetric matrix is 2-by-2 block centrosymmetric. P19 Let B. What more can you say about the eigenvalues if C is real? For a more precise statement.
P20 Represent any A. Show the following: The block matrix R1 A is an example of a real representation of A. P30 for a more precise statement. Deduce from this identity that the inverse of a real matrix of complex type is a matrix of complex type and that a product of real matrices of complex type is a matrix of complex type.
P29 for a generalization to a complex representation. M2n R. P22 Let A. Show that A and B are similar if and only if there are X. R2 A is nonsingular if and only if A is nonsingular. See 4. A is unitary if and only if R2 A is real orthogonal. P19 for the converse: The block matrix R2 A is a second example of a real representation of A. P26 Let e1. P25 Let x. Use this observation to construct an interesting 3-by-3 matrix with integer entries and eigenvalues 1.
Notice that A has integer entries if the entries of x. P27 Continuation of 1. Explain why the i. If A has real eigenvalues. Explain why det A is an integer.
Does f A depend on the choice of the diagonalizing similarity which is never unique? Use Theorem 1. Conclude that A is nonsingular if n is even. P31 Let a. Show that the vectors x. P34 If A. P36 Let A. B is the span of the set of all words in A and B 2. In the following. The algebra generated by A and B denoted by A A. A is irreducible. Show that A and B have no common eigenvector. Give a direct proof by exhibiting a basis of M2 consisting of words in A and B.
What can you conclude about it? One of the matrices in 1. P38 Let Jn be the all-ones matrix 0. The matrices A and B anticommute if ]A.
P40 The Jordan product of A. Notes and Further Readings: P41 e see M. Linear Algebra Appl. Monthly 87 Short proofs of theorems of Mirsky and Horn on diagonals and eigenvalues of matrices. Mirsky Matrices with prescribed off-diagonal elements. Israel J. Carlen and E. We begin with an important observation about eigenvalues.
Every invertible matrix is diagonally equivalent to a matrix with distinct eigenvalues. For alternative approaches. For a proof of the claim in 1. Linear Algebra 18 — P35 is adapted from I. Halperin and P. Lomonosov and P. If the algebraic multiplicity is 1. Theorems 1. Explain why a minimal A-invariant subspace an A-invariant subspace that contains no strictly lower-dimensional. A1 is nondefective.
We say that A is defective if the geometric multiplicity of some eigenvalue of A is strictly less than its algebraic multiplicity; if the geometric multiplicity of each eigenvalue of A is the same as its algebraic multiplicity, A is nondefective. A matrix is diagonalizable if and only if it is nondefective.
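A minimal numerical illustration of a defective matrix, using a 2-by-2 Jordan block (the standard example, chosen here for the sketch):

```python
import numpy as np

# A 2-by-2 Jordan block with eigenvalue 3: algebraic multiplicity 2.
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])

# Geometric multiplicity of the eigenvalue 3 is dim null(A - 3I).
geometric = 2 - np.linalg.matrix_rank(A - 3 * np.eye(2))

# Strictly less than the algebraic multiplicity 2, so A is defective
# (and hence not diagonalizable).
assert geometric == 1
```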
Even though A and A T have the same eigenvalues. If each eigenvalue of A has geometric multiplicity 1. A2 is nondefective and derogatory. Explain why: Each type of eigenvector can convey different information about a matrix. We may assume that x is a unit vector. If necessary for clarity. One should not dismiss left eigenvectors as merely a parallel theoretical alternative to right eigenvectors.
S is nonsingular. We conclude that S is nonsingular. The 1. Let x be the first column of S. Consider the following three statements: The eigenvalues of a matrix are unchanged by similarity. One might also ask what happens if left and right eigenvectors associated with the same eigenvalue are either orthogonal or linearly dependent.
Then a implies b. Use the fact that a skew-symmetric matrix is rank principal 0.
In both cases a and b. Our approach to these results relies on the following lemma. If n is odd. P1 Let nonzero vectors x. In a we assume that the algebraic multiplicity is 1. What is the geometric multiplicity of the eigenvalue 0? Explain why any vector that is orthogonal to y is in the null space of A. Explain why every principal minor of A with odd size is singular.
See 8. Further eigenvalues and eigenvectors of A can be calculated by combining the power method with a deflation that delivers a square matrix of size one smaller. Use this observation to show that the eigenvalues of A are the eigenvalues of A11 together with those of A P8 Continue with the assumptions and notation of 1.
Another eigenvalue may be calculated from B and the deflation repeated. This construction is another type of. See 2. A is nonderogatory. P33 for a further example of a deflation. Then 1. For certain very special nonsingular matrices.
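The power method and a deflation step can be sketched in code. Note the hedges: this uses a same-size Hotelling deflation for a symmetric matrix rather than the size-reducing deflation described in the text, and the test matrix is an illustrative choice:

```python
import numpy as np

def power_method(A, iters=500):
    """Largest-magnitude eigenvalue/eigenvector by repeated multiplication."""
    rng = np.random.default_rng(0)
    x = rng.standard_normal(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x = x / np.linalg.norm(x)
    lam = x @ A @ x   # Rayleigh quotient of the (unit) iterate
    return lam, x

# Symmetric illustrative matrix with eigenvalues (7 +/- sqrt(5)) / 2.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
lam1, x1 = power_method(A)

# Hotelling deflation: subtract lam1 * x1 x1^T; the dominant eigenvalue
# of the deflated matrix is the second eigenvalue of A.
B = A - lam1 * np.outer(x1, x1)
lam2, _ = power_method(B)
```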
This triangular form can be further refined under general similarity. Similarity via a unitary matrix U. A list of vectors x1. The upper triangular form achievable under unitary similarity can be greatly refined under unitary equivalence and generalized to rectangular matrices: The unitary matrices in Mn form a remarkable and important set. Theorem 2. Show that every orthogonal list of nonzero vectors in Cn is linearly independent.
Definition 2. If y1. We list some of the basic equivalent conditions for U to be unitary in 2. Show that any nonzero subspace of Rn or Cn has an orthonormal basis 0. A linearly independent list need not be orthonormal. Verify that the matrices Q. The converse of each of these implications is similarly observed.
Every orthonormal list of vectors in Cn is linearly independent. For other kinds of isometries. A linear transformation T: To prove the converse. Use 2.
The set of unitary respectively. Observation 2. These are two different presentations. This says that the set of unitary matrices is a closed subset of Cn. Use b of 2. This group is generally referred to as the n-by-n unitary respectively. The set group of unitary matrices in Mn has another very important property. Verify 2. Interpret them geometrically.
If we think of the 2 set of unitary matrices as a subset of Cn. The unitary limit guaranteed by the lemma need not be unique. Show that there are two possible limits of subsequences.
Explain why the selection principle 2. We have already observed that if a sequence of unitary matrices converges to some matrix. Lemma 2. For our purposes. All that is required here is the fact that from any infinite sequence in a compact set.
There exists an infinite subsequence Uk1. Let U1. Example 2. Householder matrices. Plane rotations. If w is a unit vector.. Use the preceding lemma to show that a unitary matrix is upper triangular if and only if it is diagonal. Plane rotations and Householder matrices are special and very simple unitary matrices that play an important role in establishing some basic matrix factorizations.
Then U y. The assertions are readily verified if x and y are linearly dependent. If x and y are real. The situation is different in Cn. Show that a Householder matrix Uw is both unitary and Hermitian. Householder matrices and unitary scalar matrices can be used to construct a unitary matrix that takes any given vector in Cn into any other vector in Cn that has the same Euclidean norm. If x and y are linearly independent.
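The construction of a Householder matrix Uw that maps x to a vector y of the same Euclidean norm can be sketched directly (real vectors, illustrative values):

```python
import numpy as np

def householder(w):
    """U_w = I - 2 w w^T for a unit vector w (orthogonal and symmetric)."""
    w = w / np.linalg.norm(w)
    return np.eye(len(w)) - 2.0 * np.outer(w, w)

# Map x to y = ||x|| e_1 using U_w with w proportional to x - y.
x = np.array([3.0, 4.0])
y = np.array([np.linalg.norm(x), 0.0])
U = householder(x - y)

assert np.allclose(U @ x, y)               # U takes x to y
assert np.allclose(U, U.T)                 # Householder matrices are symmetric
assert np.allclose(U @ U.T, np.eye(2))     # and orthogonal (unitary if complex)
```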
Show that the Householder matrix Uw is real orthogonal and symmetric. Construct U y. The following Q R factorization of a complex or real matrix is of considerable theoretical and computational importance.
This says that a lower triangular matrix equals an upper triangular matrix. If A has full column rank. This is the Cholesky factorization of B. Its main diagonal entries are r1. The assertion in c follows from the fact that a square matrix with orthonormal columns is unitary.
Explain why this factorization is unique if A is nonsingular. The final assertion e follows from the assurance in 2. If X and Y are real. Problems 2. An important geometrical fact is that any two lists containing equal numbers of orthonormal vectors are related via a unitary transformation. Extend each of the orthonormal lists x1. P4 Characterize the diagonal real orthogonal matrices.
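A sketch of the QR factorization for a nonsingular matrix, using NumPy's built-in factorization; the sign normalization below enforces a positive main diagonal for R, the convention under which the factorization of a nonsingular A is unique:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # nonsingular

Q, R = np.linalg.qr(A)

# Normalize signs so that R has a positive main diagonal; D is its own
# inverse, so (Q D)(D R) = Q R = A.
D = np.diag(np.sign(np.diag(R)))
Q, R = Q @ D, D @ R

assert np.allclose(Q @ R, A)              # A = QR
assert np.allclose(Q.T @ Q, np.eye(2))    # Q has orthonormal columns
assert np.allclose(R, np.triu(R))         # R is upper triangular
assert np.all(np.diag(R) > 0)             # positive main diagonal entries
```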
That is. Show that every diagonal unitary matrix has this form. P14 Show that the intersection of the group of unitary matrices in Mn with the group of complex orthogonal matrices in Mn is the group of real orthogonal matrices in Mn. The smaller and compact group of real orthogonal matrices of a given size is often called the orthogonal group.
More generally. Show that: P5 Show that the permutation matrices 0. Compare with 2. Are the rows respectively. P6 Give a parametric presentation of the 3-by-3 orthogonal group. P13 Consider diag 2.
How many different permutation matrices are there in Mn? Two presentations of the 2-by-2 orthogonal group are given in the exercise following 2. P7 Suppose that A. Consider the Palais matrix Px. P16 Let x.
Given the basis x1. Explain why this process produces. P22 Suppose that X. How is this factorization related to the one in 2. P21 Explain why 2. What does a diagonal complex orthogonal matrix look like?
Deduce that a complex orthogonal matrix is upper triangular if and only if it is diagonal. Describe the steps of the Gram— Schmidt process applied to the columns of A. DePrima and C. This special type of similarity is called unitary similarity.
We say that A is unitarily diagonalizable if it is unitarily similar to a diagonal matrix. A is real orthogonally diagonalizable if it is real orthogonally similar to a diagonal matrix.
If U may be taken to be real and hence is real orthogonal. Show that unitary similarity is an equivalence relation. Further Reading. For more information about matrices that satisfy the conditions of 2. Then i. If this claim is true. We have now shown that any 2-by-2 complex matrix A is unitarily similar to a matrix with both diagonal entries equal to the average of the diagonal entries of A. Unitary similarity to a matrix with equal diagonal entries. Using the notation of 2.
The unitary similarity equivalence relation partitions Mn into finer equivalence classes than the similarity equivalence relation.
Unitary similarity. Unitary or real orthogonal similarity via a Householder matrix is often called a Householder transformation. For computational or theoretical reasons. Here are two examples. Show that the matrices similar. The following construction shows that A is unitarily similar to an upper Hessenberg matrix with nonnegative entries in its first subdiagonal. Construct U i. The unitary similarity U i. Since f is a continuous nonnegative-valued function on R A.
Repeat the construction. Let a1 be the first column of A. Unitary similarity to an upper Hessenberg matrix. The length of the word W s. Any finite formal product of nonnegative powers of s and t W s. For any word W s. A key role is played by the following simple notion.
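The Householder-based construction can be sketched in code. This is a generic reduction to upper Hessenberg form by orthogonal similarities; the helper below is a hypothetical implementation for this sketch, and it does not enforce the nonnegative-subdiagonal normalization described in the text:

```python
import numpy as np

def hessenberg(A):
    """Reduce A to upper Hessenberg form by Householder similarities."""
    H = A.astype(float).copy()
    n = H.shape[0]
    Q = np.eye(n)
    for k in range(n - 2):
        # Householder reflector that zeroes H[k+2:, k].
        x = H[k+1:, k].copy()
        alpha = -np.sign(x[0]) * np.linalg.norm(x) if x[0] != 0 else -np.linalg.norm(x)
        v = x.copy()
        v[0] -= alpha
        norm_v = np.linalg.norm(v)
        if norm_v == 0:
            continue                       # column already in Hessenberg form
        v /= norm_v
        U = np.eye(n)
        U[k+1:, k+1:] -= 2.0 * np.outer(v, v)
        H = U @ H @ U                      # U is symmetric orthogonal, U^T = U
        Q = Q @ U
    return H, Q

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
H, Q = hessenberg(A)

assert np.allclose(np.tril(H, -2), 0, atol=1e-12)   # Hessenberg pattern
assert np.allclose(Q @ H @ Q.T, A)                  # orthogonal similarity
```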
P6 it may be useless in showing that two given matrices are unitarily. This similarity does not affect the first column of A1. A theorem of W. If one considers all possible words W s. If we take the word W s. It can be augmented with additional identities that collectively do provide necessary and sufficient conditions. If A is Hermitian or skew Hermitian. Let s. In practice. A and B are unitarily similar if and only if 2.
How can corresponding eigenvectors be obtained as a by-product of this process? Choose another plane rotation of the form U1. P8 Let A. Use Example 2. Show that no diagonal unitary similarity can take this matrix into its transpose.
Choose a plane rotation U1. P7 Give an example of two 2-by-2 matrices that satisfy the identity 2. P2 The eigenvalue calculation method of Givens for real matrices also uses plane rotations. Explain why this process does not disturb previously manufactured zero entries and why it preserves symmetry if A is symmetric. Apply this observation to the 3-by-3 matrix in the preceding problem and conclude that if it is unitarily similar to its transpose.
Use either 2. Then start on the second row beginning with the 2. Show that Fn is symmetric. Unitary equivalence to a complex symmetric matrix: Explain why Cn is unitary real orthogonal. Notes and Further Readings.
The list of words in 2. Hn is symmetric. What are the entries of Cn and Sn? P4 b are satisfied. For a survey of the issues addressed in 2. Algebra — We can restate this criterion as follows: If a circulant matrix is singular and has first row [a1.
A and B are unitarily similar if and only if tr W A. A 4-by-4 complex matrix is unitarily similar to its transpose if and only if seven zero-trace identities of the type in 2. There is an approximate version of 2. Zur Theorie der Matrizen II. For the original proof of 2. Any square complex matrix A is unitarily similar to a triangular matrix whose diagonal entries are the eigenvalues of A.
Schur triangularization. Our proof involves a sequence of deflations by unitary similarities.