Using MATLAB
But the computations involving the backslash and slash operators are preferable because they require less computer time, less memory, and have better error detection properties.

Pseudoinverses

Rectangular matrices do not have inverses or determinants. At least one of the equations AX = I and XA = I does not have a solution. A partial replacement for the inverse is provided by the Moore-Penrose pseudoinverse, which is computed by the pinv function.

    X = pinv(C)

    X =
        0.1159   -0.0729    0.0171
       -0.0534    0.1152    0.0418

The matrix

    Q = X*C

    Q =
        1.0000    0.0000
        0.0000    1.0000

is the 2-by-2 identity, but the matrix

    P = C*X

    P =
        0.8293   -0.1958    0.3213
       -0.1958    0.7754    0.3685
        0.3213    0.3685    0.3952

is not the 3-by-3 identity. However, P acts like an identity on a portion of the space in the sense that P is symmetric, P*C is equal to C, and X*P is equal to X.

If A is m-by-n with m > n and full rank n, then each of the three statements

    x = A\b
    x = pinv(A)*b
    x = inv(A'*A)*A'*b

theoretically computes the same least squares solution x, although the backslash operator does it faster.

However, if A does not have full rank, the solution to the least squares problem is not unique.
There are many vectors x that minimize

    norm(A*x - b)

The solution computed by x = A\b is a basic solution; it has at most r nonzero components, where r is the rank of A. The solution computed by x = pinv(A)*b is the minimal norm solution because it minimizes norm(x). An attempt to compute a solution with x = inv(A'*A)*A'*b fails because A'*A is singular. Here is an example that illustrates the various solutions.

    A = [ 1  2  3
          4  5  6
          7  8  9
         10 11 12 ]

does not have full rank.
Its second column is the average of the first and third columns. If

    b = A(:,2)

is the second column, then an obvious solution to A*x = b is x = [0 1 0]'. But none of the approaches computes that x. The backslash operator gives

    x = A\b

    Warning: Rank deficient, rank = 2.

    x =
        0.5000
             0
        0.5000

This solution has two nonzero components. The pseudoinverse approach gives

    y = pinv(A)*b

    y =
        0.3333
        0.3333
        0.3333

There is no warning about rank deficiency. But norm(y) = 0.5774 is less than norm(x) = 0.7071. Finally,

    z = inv(A'*A)*A'*b

fails completely.

    Warning: Matrix is singular to working precision.

    z =
       Inf
       Inf
       Inf

Cholesky, LU, and QR Factorizations

MATLAB's linear equation capabilities are based on three basic matrix factorizations:

• Cholesky factorization for symmetric, positive definite matrices
• LU factorization (Gaussian elimination) for general square matrices
• QR (orthogonal) factorization for rectangular matrices

These three factorizations are available through the chol, lu, and qr functions. All three of these factorizations make use of triangular matrices, where all the elements either above or below the diagonal are zero.
Systems of linear equations involving triangular matrices are easily and quickly solved using either forward or back substitution.

Cholesky Factorization

The Cholesky factorization expresses a symmetric matrix as the product of a triangular matrix and its transpose

    A = R'R

where R is an upper triangular matrix.

Not all symmetric matrices can be factored in this way; the matrices that have such a factorization are said to be positive definite. This implies that all the diagonal elements of A are positive and that the off-diagonal elements are "not too big." The Pascal matrices provide an interesting example.
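To make the factorization concrete before turning to the Pascal example, here is a minimal pure-Python sketch of the A = R'R computation. This is an illustrative translation of the textbook algorithm, not MATLAB's chol; a failed square root along the way is exactly how input that is not positive definite reveals itself.

```python
import math

def cholesky_upper(A):
    """Return upper triangular R with A == R'*R, for a symmetric
    positive definite matrix A given as a list of rows.
    Raises ValueError if A is not positive definite."""
    n = len(A)
    R = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            # subtract the contributions of the rows already factored
            s = A[i][j] - sum(R[k][i] * R[k][j] for k in range(i))
            if i == j:
                if s <= 0:
                    raise ValueError("matrix is not positive definite")
                R[i][i] = math.sqrt(s)
            else:
                R[i][j] = s / R[i][i]
    return R

# The 4-by-4 Pascal matrix; R comes out filled with binomial
# coefficients, just as chol(pascal(6)) does in the text.
A = [[1, 1, 1, 1],
     [1, 2, 3, 4],
     [1, 3, 6, 10],
     [1, 4, 10, 20]]
R = cholesky_upper(A)
```

The O(n^3) work all happens in this double loop over (i, j); once R is known, each triangular solve costs only O(n^2).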
Throughout this chapter, our example matrix A has been the 3-by-3 Pascal matrix. Let's temporarily switch to the 6-by-6.

    A = pascal(6)

    A =
         1     1     1     1     1     1
         1     2     3     4     5     6
         1     3     6    10    15    21
         1     4    10    20    35    56
         1     5    15    35    70   126
         1     6    21    56   126   252

The elements of A are binomial coefficients. Each element is the sum of its north and west neighbors. The Cholesky factorization is

    R = chol(A)

    R =
         1     1     1     1     1     1
         0     1     2     3     4     5
         0     0     1     3     6    10
         0     0     0     1     4    10
         0     0     0     0     1     5
         0     0     0     0     0     1

The elements are again binomial coefficients. The fact that R'*R is equal to A demonstrates an identity involving sums of products of binomial coefficients.

Note  The Cholesky factorization also applies to complex matrices. Any complex matrix that has a Cholesky factorization satisfies A' = A and is said to be Hermitian positive definite.

The Cholesky factorization allows the linear system

    Ax = b

to be replaced by

    R'Rx = b

Because the backslash operator recognizes triangular systems, this can be solved in MATLAB quickly with

    x = R\(R'\b)

If A is n-by-n, the computational complexity of chol(A) is O(n^3), but the complexity of the subsequent backslash solutions is only O(n^2).

LU Factorization

LU factorization, or Gaussian elimination, expresses any square matrix A as the product of a permutation of a lower triangular matrix and an upper triangular matrix

    A = LU

where L is a permutation of a lower triangular matrix with ones on its diagonal and U is an upper triangular matrix.

The permutations are necessary for both theoretical and computational reasons.
The matrix

    [ 0  1
      1  0 ]

cannot be expressed as the product of triangular matrices without interchanging its two rows. Although the matrix

    [ ε  1
      1  0 ]

can be expressed as the product of triangular matrices, when ε is small the elements in the factors are large and magnify errors, so even though the permutations are not strictly necessary, they are desirable. Partial pivoting ensures that the elements of L are bounded by one in magnitude and that the elements of U are not much larger than those of A.

For example,

    [L,U] = lu(B)

    L =
        1.0000         0         0
        0.3750    0.5441    1.0000
        0.5000    1.0000         0

    U =
        8.0000    1.0000    6.0000
             0    8.5000   -1.0000
             0         0    5.2941

The LU factorization of A allows the linear system

    A*x = b

to be solved quickly with

    x = U\(L\b)

Determinants and inverses are computed from the LU factorization using

    det(A) = det(L)*det(U)

and

    inv(A) = inv(U)*inv(L)

You can also compute the determinants using det(A) = prod(diag(U)), though the signs of the determinants may be reversed.

QR Factorization

An orthogonal matrix, or a matrix with orthonormal columns, is a real matrix whose columns all have unit length and are perpendicular to each other.
If Q is orthogonal, then

    Q'Q = I

The simplest orthogonal matrices are two-dimensional coordinate rotations:

    [  cos(θ)   sin(θ)
      -sin(θ)   cos(θ) ]

For complex matrices, the corresponding term is unitary. Orthogonal and unitary matrices are desirable for numerical computation because they preserve length, preserve angles, and do not magnify errors.

The orthogonal, or QR, factorization expresses any rectangular matrix as the product of an orthogonal or unitary matrix and an upper triangular matrix. A column permutation may also be involved:

    A = QR

or

    AP = QR

where Q is orthogonal or unitary, R is upper triangular, and P is a permutation.

There are four variants of the QR factorization: full or economy size, and with or without column permutation.

Overdetermined linear systems involve a rectangular matrix with more rows than columns, that is, m-by-n with m > n. The full size QR factorization produces a square, m-by-m orthogonal Q and a rectangular m-by-n upper triangular R.

    [Q,R] = qr(C)

    Q =
       -0.8182    0.3999   -0.4131
       -0.1818   -0.8616   -0.4739
       -0.5455   -0.3126    0.7777

    R =
      -11.0000   -8.5455
             0   -7.4817
             0         0

In many cases, the last m - n columns of Q are not needed because they are multiplied by the zeros in the bottom portion of R.
So the economy size QR factorization produces a rectangular, m-by-n Q with orthonormal columns and a square n-by-n upper triangular R. For our 3-by-2 example, this is not much of a saving, but for larger, highly rectangular matrices, the savings in both time and memory can be quite important.

    [Q,R] = qr(C,0)

    Q =
       -0.8182    0.3999
       -0.1818   -0.8616
       -0.5455   -0.3126

    R =
      -11.0000   -8.5455
             0   -7.4817

In contrast to the LU factorization, the QR factorization does not require any pivoting or permutations. But an optional column permutation, triggered by the presence of a third output argument, is useful for detecting singularity or rank deficiency. At each step of the factorization, the column of the remaining unfactored matrix with largest norm is used as the basis for that step.
This ensures that the diagonal elements of R occur in decreasing order and that any linear dependence among the columns is almost certainly revealed by examining these elements. For our small example, the second column of C has a larger norm than the first, so the two columns are exchanged.

    [Q,R,P] = qr(C)

    Q =
       -0.3522    0.8398   -0.4131
       -0.7044   -0.5285   -0.4739
       -0.6163    0.1241    0.7777

    R =
      -11.3578   -8.2762
             0    7.2460
             0         0

    P =
         0     1
         1     0

When the economy size and column permutations are combined, the third output argument is a permutation vector, rather than a permutation matrix.

    [Q,R,p] = qr(C,0)

    Q =
       -0.3522    0.8398
       -0.7044   -0.5285
       -0.6163    0.1241

    R =
      -11.3578   -8.2762
             0    7.2460

    p =
         2     1

The QR factorization transforms an overdetermined linear system into an equivalent triangular system.
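As a rough illustration of that last point, here is a small pure-Python sketch (not MATLAB code, and the 3-by-2 matrix and right-hand side are made up for this illustration): classical Gram-Schmidt produces an economy-size Q and R, and back substitution on the triangular system R*x = Q'*b then yields the least squares solution.

```python
def qr_gram_schmidt(A):
    """Economy-size QR of an m-by-n matrix (list of rows), m >= n,
    with linearly independent columns, via classical Gram-Schmidt."""
    m, n = len(A), len(A[0])
    Q = [[0.0] * n for _ in range(m)]
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = [A[i][j] for i in range(m)]
        for k in range(j):                      # remove components along q_k
            R[k][j] = sum(Q[i][k] * A[i][j] for i in range(m))
            v = [v[i] - R[k][j] * Q[i][k] for i in range(m)]
        R[j][j] = sum(x * x for x in v) ** 0.5  # norm of what is left
        for i in range(m):
            Q[i][j] = v[i] / R[j][j]
    return Q, R

def back_substitute(R, y):
    """Solve the upper triangular system R*x = y."""
    n = len(R)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(R[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (y[i] - s) / R[i][i]
    return x

A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]  # hypothetical 3-by-2 data
b = [2.0, 3.0, 4.0]                       # lies in the range of A, so x = [1, 1]

Q, R = qr_gram_schmidt(A)
qtb = [sum(Q[i][j] * b[i] for i in range(len(b))) for j in range(len(R))]
x = back_substitute(R, qtb)
```

Because Q has orthonormal columns, minimizing norm(A*x - b) reduces to the n-by-n triangular solve above; this is the same reduction the backslash operator performs internally for overdetermined systems.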















