The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial. An eigenvalue of algebraic multiplicity 1 is said to be simple. The geometric multiplicity of an eigenvalue is the number of linearly independent eigenvectors corresponding to that eigenvalue. The geometric multiplicity of an eigenvalue cannot exceed the algebraic multiplicity, but it can be less than the algebraic multiplicity.
An eigenvalue with the latter property is said to be defective. Similarly, an n × n matrix that has fewer than n linearly independent eigenvectors is said to be defective.

Although the eigenvalues are not necessarily real, complex eigenvalues of a real matrix must occur in complex conjugate pairs (i.e., if α + iβ is an eigenvalue of a real matrix, then so is α − iβ, where i = √−1).

4.1.3 Properties of Eigenvalue Problems

Some properties of an eigenvalue problem that affect the choice of algorithm and software to solve it are as follows:

• Are all of the eigenvalues needed, or only a few?
• Are only the eigenvalues needed, or are the corresponding eigenvectors also needed?
• Is the matrix real, or complex?
• Is the matrix relatively small and dense, or large and sparse?
• Does the matrix have any special properties, such as symmetry, or is it a general matrix?
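These questions map naturally onto the interfaces of numerical libraries. As a rough illustration (our own sketch using SciPy, not something prescribed in the text), a dense symmetric matrix would go to a specialized symmetric eigensolver, a general dense matrix to a general one, and a large sparse matrix to an iterative routine that returns only a few eigenvalues:

```python
import numpy as np
from scipy.linalg import eig, eigh, eigvalsh
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

# Small dense symmetric matrix: the symmetric solver guarantees real
# eigenvalues and orthogonal eigenvectors; eigvalsh skips the eigenvectors.
A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])
w_sym, V_sym = eigh(A)
w_only = eigvalsh(A)

# Small dense general matrix: eigenvalues may be complex.
B = np.array([[3.0, -1.0],
              [2.0,  4.0]])
w_gen, V_gen = eig(B)

# Large sparse symmetric matrix: compute only a few eigenvalues iteratively
# (here the 4 of largest magnitude) instead of all 10,000.
n = 10_000
T = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n))
w_few = eigsh(T, k=4, return_eigenvectors=False)

print(w_sym, w_gen, w_few)
```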
Some properties that a square matrix may have that are relevant to eigenvalue problems are defined in Table 4.1 (see also Section 2.5).

Table 4.1: Some properties of matrices relevant to eigenvalue problems

Property      Definition
Symmetric     A = A^T
Hermitian     A = A^H
Orthogonal    A^T A = A A^T = I
Unitary       A^H A = A A^H = I
Normal        A^H A = A A^H

Example 4.3 Matrix Properties. The following examples illustrate some of the matrix properties relevant to eigenvalue problems:

$$\text{Transpose:} \quad \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}^T = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix},$$

$$\text{Conjugate transpose:} \quad \begin{bmatrix} 1+i & 1+2i \\ 2-i & 2-2i \end{bmatrix}^H = \begin{bmatrix} 1-i & 2+i \\ 1-2i & 2+2i \end{bmatrix},$$

$$\text{Symmetric:} \quad \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}, \qquad \text{nonsymmetric:} \quad \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix},$$

$$\text{Hermitian:} \quad \begin{bmatrix} 1 & 1+i \\ 1-i & 2 \end{bmatrix}, \qquad \text{nonHermitian:} \quad \begin{bmatrix} 1 & 1+i \\ 1+i & 2 \end{bmatrix},$$

$$\text{Orthogonal:} \quad \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}, \qquad \text{nonorthogonal:} \quad \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix},$$

$$\text{Orthogonal:} \quad \begin{bmatrix} \sqrt{2}/2 & \sqrt{2}/2 \\ -\sqrt{2}/2 & \sqrt{2}/2 \end{bmatrix}, \qquad \text{unitary:} \quad \begin{bmatrix} i\sqrt{2}/2 & \sqrt{2}/2 \\ -\sqrt{2}/2 & -i\sqrt{2}/2 \end{bmatrix},$$

$$\text{Normal:} \quad \begin{bmatrix} 1 & 2 & 0 \\ 0 & 1 & 2 \\ 2 & 0 & 1 \end{bmatrix}, \qquad \text{nonnormal:} \quad \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.$$

4.1.4 Similarity Transformations

In keeping with our general strategy, many numerical methods for computing eigenvalues and eigenvectors are based on reducing the original matrix to a simpler form, whose eigenvalues and eigenvectors are then easily determined.
Thus, we need to identify what types of transformations preserve eigenvalues, and for what types of matrices the eigenvalues are easily determined.

A matrix B is similar to a matrix A if there is a nonsingular matrix T such that

$$B = T^{-1} A T.$$

Then

$$By = \lambda y \;\Rightarrow\; T^{-1} A T y = \lambda y \;\Rightarrow\; A(Ty) = \lambda (Ty),$$

so that A and B have the same eigenvalues, and if y is an eigenvector of B, then x = Ty is an eigenvector of A. Thus, similarity transformations preserve eigenvalues, and, although they do not preserve eigenvectors, the eigenvectors are still easily recovered.
Note that the converse is not true: two matrices that are similar must have the same eigenvalues, but two matrices that have the same eigenvalues are not necessarily similar.

Example 4.4 Similarity Transformation. From the eigenvalues and eigenvectors for one of the matrices in Example 4.1, we see that

$$AT = \begin{bmatrix} 3 & -1 \\ -1 & 3 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 0 & 4 \end{bmatrix} = T\Lambda,$$

and hence

$$T^{-1} A T = \begin{bmatrix} 0.5 & 0.5 \\ 0.5 & -0.5 \end{bmatrix} \begin{bmatrix} 3 & -1 \\ -1 & 3 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} = \begin{bmatrix} 2 & 0 \\ 0 & 4 \end{bmatrix} = \Lambda,$$

so that the original matrix is similar to the diagonal matrix, and in this case the eigenvectors form the columns of the transformation matrix T.

The eigenvalues of a diagonal matrix are its diagonal entries, and the eigenvectors are the corresponding columns of the identity matrix I.
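Example 4.4 is easy to check numerically. The following sketch (our own verification, not part of the text) forms T^{-1}AT for the matrix and eigenvector matrix of the example and confirms that an eigenvector of the resulting diagonal matrix maps back to an eigenvector of A via x = Ty:

```python
import numpy as np

A = np.array([[ 3.0, -1.0],
              [-1.0,  3.0]])
T = np.array([[1.0,  1.0],
              [1.0, -1.0]])          # columns are eigenvectors of A

Lam = np.linalg.solve(T, A @ T)      # T^{-1} A T, without forming T^{-1} explicitly
print(Lam)                           # expected: diag(2, 4), up to rounding

# Similar matrices share eigenvalues; eigenvectors transform as x = T y.
y = np.array([1.0, 0.0])             # eigenvector of the diagonal matrix for 2
x = T @ y                            # corresponding eigenvector of A
print(np.allclose(A @ x, 2.0 * x))   # expected: True
```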
Thus, diagonal form is a desirable target in simplifying eigenvalue problems for general matrices by similarity transformations. Unfortunately, some matrices cannot be transformed into diagonal form by a similarity transformation. The best that can be done, in general, is Jordan form, in which the matrix is reduced nearly to diagonal form but may yet have a few nonzero entries on the first superdiagonal, corresponding to one or more multiple eigenvalues.

Fortunately, every matrix can be transformed into triangular form, called Schur form in this context, by a similarity transformation, and the eigenvalues of a triangular matrix are also the diagonal entries, for A − λI must have a zero on its diagonal if A is triangular and λ is any diagonal entry of A. The eigenvectors of a triangular matrix are not quite so obvious but are still straightforward to compute.
If

$$A - \lambda I = \begin{bmatrix} U_{11} & u & U_{13} \\ o^T & 0 & v^T \\ O & o & U_{33} \end{bmatrix}$$

is triangular, then the system U11 y = u can be solved for y, so that

$$x = \begin{bmatrix} y \\ -1 \\ o \end{bmatrix}$$

is an eigenvector. (We have assumed that U11 is nonsingular, which means that we are working with the first occurrence of λ on the diagonal.)

The simplest form attainable by a similarity transformation, as well as the type of similarity transformation, depends on the properties of the given matrix. We obviously prefer the simpler diagonal form when possible, and we also prefer orthogonal (or unitary) similarity transformations when possible, for both theoretical and numerical reasons. Unfortunately, not all matrices are unitarily diagonalizable, and some matrices are not diagonalizable at all. Table 4.2 indicates what form is attainable for a given type of matrix and a given type of similarity transformation.
Given a matrix A with one of the properties indicated, there exist matrices B and T having the indicated properties such that B = T^{-1}AT. In the first four cases, the columns of T are the eigenvectors. In all cases, the diagonal entries of B are the eigenvalues.

Table 4.2: Forms attainable by similarity transformations for various types of matrices

A                      T              B
Distinct eigenvalues   Nonsingular    Diagonal
Real symmetric         Orthogonal     Real diagonal
Complex Hermitian      Unitary        Real diagonal
Normal                 Unitary        Diagonal
Arbitrary              Unitary        Upper triangular (Schur form)
Arbitrary              Nonsingular    Almost diagonal (Jordan form)

4.1.5 Conditioning of Eigenvalue Problems

The condition of an eigenvalue problem is the sensitivity of the eigenvalues and eigenvectors to small changes in the matrix.
The condition of a matrix eigenvalue problem is not the same as the condition of the matrix for solving linear equations. Different eigenvalues or eigenvectors of a given matrix are not necessarily equally sensitive to perturbations in the matrix.

The condition of a simple eigenvalue λ of a matrix A is given by 1/|y^H x|, where x and y are corresponding right and left eigenvectors normalized so that x^H x = y^H y = 1. In other words, the sensitivity of a simple eigenvalue is proportional to the reciprocal of the cosine of the angle between the corresponding left and right eigenvectors.
Thus, a perturbation of order ε in A may perturb the eigenvalue λ by as much as ε/|y^H x|. The sensitivity of an eigenvector depends on both the sensitivity of the corresponding eigenvalue and the distance of that eigenvalue from other eigenvalues.

For a symmetric or Hermitian matrix, the right and left eigenvectors are the same, so we have y^H x = x^H x = 1, and hence the eigenvalues are inherently well-conditioned. More generally, the eigenvalues are well-conditioned for normal matrices, but for nonnormal matrices the eigenvalues need not be well-conditioned. In particular, multiple or close eigenvalues can be poorly conditioned and therefore difficult to compute accurately, especially if the matrix is defective.
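The condition number 1/|y^H x| is easy to estimate once right and left eigenvectors are available. The sketch below is our own illustration (scipy.linalg.eig can return left eigenvectors when asked); it computes the condition of each eigenvalue of a small nonnormal matrix, for which the values should come out much larger than 1:

```python
import numpy as np
from scipy.linalg import eig

def eigenvalue_conditions(A):
    """Return the eigenvalues of A and 1/|y^H x| for each one,
    with x and y the corresponding right and left eigenvectors
    normalized to unit length."""
    w, VL, VR = eig(A, left=True, right=True)
    cond = np.empty(len(w))
    for i in range(len(w)):
        x = VR[:, i] / np.linalg.norm(VR[:, i])   # right eigenvector
        y = VL[:, i] / np.linalg.norm(VL[:, i])   # left eigenvector
        cond[i] = 1.0 / abs(np.vdot(y, x))        # 1 / |y^H x|
    return w, cond

# Nonnormal matrix: the large off-diagonal entry makes the (distinct)
# eigenvalues 1 and 2 sensitive to perturbations in the matrix.
A = np.array([[1.0, 1e3],
              [0.0, 2.0]])
w, cond = eigenvalue_conditions(A)
print(w)      # eigenvalues 1 and 2
print(cond)   # both roughly 1e3, i.e., poorly conditioned
```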
Balancing (scaling by a diagonal similarity transformation) can improve the condition of an eigenvalue problem, and many software packages for eigenvalue problems offer such an option.

4.2 Methods for Computing All Eigenvalues

4.2.1 Characteristic Polynomial

Perhaps the most obvious method for computing the eigenvalues of a matrix A is by means of its characteristic polynomial,

$$\det(A - \lambda I) = 0.$$

This is not recommended as a general numerical procedure, however, because the coefficients of the characteristic polynomial are not well-determined numerically, and its roots can be very sensitive to perturbations in the coefficients. Moreover, solving for the roots of a polynomial of high degree requires a great deal of work.
In other words, the characteristic polynomial gives an equivalent problem in theory, but in practice the solution is not preserved numerically; and in any case, computing the roots of the polynomial is no simpler than the original eigenvalue problem. Indeed, one of the better ways of computing the roots of a polynomial

$$p(\lambda) = a_0 + a_1 \lambda + \cdots + a_{n-1} \lambda^{n-1} + \lambda^n$$

is to compute the eigenvalues of the companion matrix

$$\begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \\ -a_0 & -a_1 & -a_2 & \cdots & -a_{n-1} \end{bmatrix}.$$
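As a sketch of this idea (our own illustration, not code from the text), the snippet below builds the companion matrix of a monic cubic and recovers the polynomial's roots as its eigenvalues; this is essentially the strategy used by numpy.roots:

```python
import numpy as np

def companion(a):
    """Companion matrix of the monic polynomial
    p(lambda) = a[0] + a[1]*lambda + ... + a[n-1]*lambda**(n-1) + lambda**n."""
    n = len(a)
    C = np.zeros((n, n))
    C[:-1, 1:] = np.eye(n - 1)     # ones on the first superdiagonal
    C[-1, :] = -np.asarray(a)      # last row holds the negated coefficients
    return C

# p(lambda) = (lambda - 1)(lambda - 2)(lambda - 3)
#           = -6 + 11*lambda - 6*lambda**2 + lambda**3
a = [-6.0, 11.0, -6.0]
roots = np.linalg.eigvals(companion(a))
print(np.sort(roots))              # expected: approximately [1, 2, 3]
```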