Deturck, Wilf - Lecture Notes on Numerical Analysis

This condition yields a quadratic equation for λ whose two roots are λ = 2 and λ = 4. These are the two eigenvalues of the matrix (3.7.2).

For the same 2 × 2 example, let's now find the eigenvectors (by a method that doesn't bear the slightest resemblance to the numerical method that we will discuss later). First, to find the eigenvector that belongs to the eigenvalue λ = 2, we go back to (3.7.3) and replace λ by 2 to obtain the two equations

    x_1 − x_2 = 0
    −x_1 + x_2 = 0.        (3.7.5)

These equations are, of course, redundant since λ was chosen to make them so.

They are satisfied by any vector x of the form c·[1, 1], where c is an arbitrary constant. If we refer back to the definition (3.7.1) of eigenvectors we notice that if x is an eigenvector then so is cx, so eigenvectors are determined only up to constant multiples. The first eigenvector of our 2 × 2 matrix is therefore any multiple of the vector [1, 1].

To find the eigenvector that belongs to the eigenvalue λ = 4, we return to (3.7.3), replace λ by 4, and solve the equations. The result is that any scalar multiple of the vector [1, −1] is an eigenvector corresponding to the eigenvalue λ = 4.

The two statements that [1, 1] is an eigenvector and that [1, −1] is an eigenvector can either be written as two vector equations:

    [ 3  −1 ] [ 1 ]     [ 1 ]       [ 3  −1 ] [  1 ]     [  1 ]
    [−1   3 ] [ 1 ] = 2 [ 1 ] ,     [−1   3 ] [ −1 ] = 4 [ −1 ]        (3.7.6)

or as a single matrix equation

    [ 3  −1 ] [ 1   1 ]   [ 1   1 ] [ 2  0 ]
    [−1   3 ] [ 1  −1 ] = [ 1  −1 ] [ 0  4 ].        (3.7.7)

Observe that the matrix equation (3.7.7) states that AP = PΛ, where A is the given 2 × 2 matrix, P is a (nonsingular) matrix whose columns are eigenvectors of A, and Λ is the diagonal matrix that carries the eigenvalues of A down the diagonal (in order corresponding to the eigenvectors in the columns of P).
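As an aside (not in the original text), this little example is easy to check numerically. The sketch below assumes the NumPy library and uses its general eigensolver `numpy.linalg.eig`, which returns the eigenvalues together with a matrix whose columns are unit-length eigenvectors:

```python
import numpy as np

# The symmetric matrix of (3.7.2).
A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])

# lam holds the eigenvalues; the columns of P are eigenvectors.
lam, P = np.linalg.eig(A)

# The eigenvalues are 2 and 4 (the order is not guaranteed).
assert np.allclose(sorted(lam), [2.0, 4.0])

# Each column of P satisfies the defining relation A x = lambda x.
for i in range(2):
    assert np.allclose(A @ P[:, i], lam[i] * P[:, i])
```

The columns returned are normalized multiples of [1, 1] and [1, −1], consistent with the fact that eigenvectors are determined only up to constant multiples.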

This matrix equation AP = PΛ leads to one of the many important areas of application of the theory of eigenvalues, namely to the computation of functions of matrices.

Suppose we want to calculate A^2147, where A is the 2 × 2 matrix (3.7.2). A direct calculation, by raising A to higher and higher powers, would take quite a while (although not as long as one might think at first sight! Exactly what powers of A would you compute? How many matrix multiplications would be required?).

A better way is to begin with the relation AP = PΛ and to observe that in this case the matrix P is nonsingular, and so P has an inverse. Since P has the eigenvectors of A in its columns, the nonsingularity of P is equivalent to the linear independence of the eigenvectors. Hence we can write

    A = P Λ P^{-1}.        (3.7.8)

This is called the spectral representation of A, and the set of eigenvalues is often called the spectrum of A. Equation (3.7.8) is very helpful in computing powers of A.

For instance,

    A^2 = (P Λ P^{-1})(P Λ P^{-1}) = P Λ^2 P^{-1},

and for every m, A^m = P Λ^m P^{-1}. It is of course quite easy to find high powers of the diagonal matrix Λ, because we need only raise the entries on the diagonal to that power. Thus for example,

    A^2147 = [ 1   1 ] [ 2^2147    0    ] [ 1/2   1/2 ]
             [ 1  −1 ] [   0    4^2147  ] [ 1/2  −1/2 ].        (3.7.9)

Not only can we compute powers from the spectral representation (3.7.8), we can equally well obtain any polynomial in the matrix A.
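The recipe A^m = P Λ^m P^{-1} can be tried directly. Here is a minimal sketch (assuming NumPy) that builds the spectral representation numerically and checks a modest power against repeated matrix multiplication:

```python
import numpy as np

A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])

# Spectral representation A = P Lam P^{-1}, as in (3.7.8).
lam, P = np.linalg.eig(A)
Pinv = np.linalg.inv(P)

def matrix_power_spectral(m):
    # A^m = P Lam^m P^{-1}: only the diagonal entries get raised
    # to the power m, which is cheap for any m.
    return P @ np.diag(lam ** m) @ Pinv

# Agrees with ordinary repeated multiplication.
assert np.allclose(matrix_power_spectral(10), np.linalg.matrix_power(A, 10))
```

For an exponent like 2147 the spectral route costs two matrix multiplications plus two scalar powers, regardless of m.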

For instance,

    13A^3 + 78A^19 − 43A^31 = P (13Λ^3 + 78Λ^19 − 43Λ^31) P^{-1}.        (3.7.10)

Indeed, if f is any polynomial, then

    f(A) = P f(Λ) P^{-1}        (3.7.11)

and f(Λ) is easy to calculate because it just has the numbers f(λ_i) down the diagonal and zeros elsewhere.

Finally, it's just a short hop to the conclusion that (3.7.11) remains valid even if f is not a polynomial, but is represented by an everywhere-convergent power series (we don't even need that much, but this statement suffices for our present purposes).
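The identity f(A) = P f(Λ) P^{-1} is easy to verify for a concrete polynomial. The sketch below (assuming NumPy, and using smaller exponents than (3.7.10) so the numbers stay modest) applies f to the eigenvalues and compares with evaluating the same polynomial in matrix arithmetic:

```python
import numpy as np

A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])
lam, P = np.linalg.eig(A)
Pinv = np.linalg.inv(P)

def f(x):
    # An arbitrary polynomial, applied elementwise to the eigenvalues.
    return 13 * x**3 + 78 * x**5 - 43 * x**7

# f(A) via the spectrum: f(Lam) just has f(lambda_i) on the diagonal.
via_spectrum = P @ np.diag(f(lam)) @ Pinv

# f(A) evaluated directly in matrix arithmetic.
direct = (13 * np.linalg.matrix_power(A, 3)
          + 78 * np.linalg.matrix_power(A, 5)
          - 43 * np.linalg.matrix_power(A, 7))

assert np.allclose(via_spectrum, direct)
```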

So for instance, if A is the above 2 × 2 matrix, then

    e^A = P e^Λ P^{-1}        (3.7.12)

where e^Λ has e^2 and e^4 on its diagonal.

We have now arrived at a very important area of application of eigenvalues and eigenvectors: the solution of systems of differential equations.

A system of n linear simultaneous differential equations in n unknown functions can be written simply as y′ = Ay, with, say, y(0) given as initial data. The solution of this system of differential equations is y(t) = e^{At} y(0), where the matrix e^{At} is calculated by writing A = P Λ P^{-1} if possible, and then putting e^{At} = P e^{Λt} P^{-1}.

Hence, whenever we can find the spectral representation of a matrix A, we can calculate functions of the matrix and can solve differential equations that involve the matrix.

So, when can we find a spectral representation of a given n × n matrix A? If we can find a set of n linearly independent eigenvectors for A, then all we need to do is to arrange them in the columns of a new matrix P.

Then P will be invertible, and we'll be all finished. Conversely, if we somehow have found a spectral representation of A à la (3.7.8), then the columns of P obviously do comprise a set of n independent eigenvectors of A.

That changes the question. What kind of an n × n matrix A has a set of n linearly independent eigenvectors? This is quite a hard problem, and we won't answer it completely. Instead, we give an example of a matrix that does not have as many independent eigenvectors as it "ought to," and then we'll specialize our discussion to a kind of matrix that is guaranteed to have a spectral representation.

For an example we don't have to look any further than

    A = [ 0  1 ]
        [ 0  0 ].        (3.7.13)

The reader will have no difficulty in checking that this matrix has just one eigenvalue, λ = 0, and that corresponding to that eigenvalue there is just one independent eigenvector, and therefore there is no spectral representation of this matrix.

Now first we're going to devote our attention to the real symmetric matrices.

That is, to matrices A for which A_ij = A_ji for all i, j = 1, . . . , n. These matrices occur in many important applications, and they always have a spectral representation. Indeed, much more is true, as is shown by the following fundamental theorem of the subject, whose proof is deferred to section 3.9, where it will emerge (see Theorem 3.9.1) as a corollary of an algorithm.

Theorem 3.7.1 (The Spectral Theorem) – Let A be an n × n real symmetric matrix. Then the eigenvalues and eigenvectors of A are real.

Furthermore, we can always find a set of n eigenvectors of A that are pairwise orthogonal to each other (so they are surely independent).

Recall that the eigenvectors of the symmetric 2 × 2 matrix (3.7.2) were [1, 1] and [1, −1], and these are indeed orthogonal to each other, though we didn't comment on it at the time.

We're going to follow a slightly unusual route now, one that will lead us simultaneously to a proof of the fundamental theorem (the "spectral theorem") above, and to a very elegant computer algorithm, called the method of Jacobi, for the computation of eigenvalues and eigenvectors of real symmetric matrices.

In the next section we will introduce a very special family of matrices, first studied by Jacobi, and we will examine their properties in some detail.

Once we understand these properties, a proof of the spectral theorem will appear with almost no additional work. Following that, we will show how the algorithm of Jacobi can be implemented on a computer, as a fast and pretty program in which all of the eigenvalues and eigenvectors of a real symmetric matrix are found simultaneously, and are delivered to your door as an orthogonal set.

Throughout these algorithms certain themes will recur.

Specifically, we will see several situations in which we have to compute a certain angle and then carry out a rotation of space through that angle. Since these themes occur so often, we are going to abstract from them certain basic modules of algorithms that will be used repeatedly. This choice will greatly simplify the preparation of programs, but at a price, namely that each module will not always be exactly optimal in terms of machine time for execution in each application, although it will be nearly so.

Consequently it was felt that the price was worth the benefit of greater universality. We'll discuss these points further, in context, as they arise.

3.8 The orthogonal matrices of Jacobi

A matrix P is called an orthogonal matrix if it is real, square, and if P^{-1} = P^T, i.e., if P^T P = P P^T = I. If we visualize the way a matrix is multiplied by its transpose, it will be clear that an orthogonal matrix is one in which each of the rows (columns) is a unit vector and any two distinct rows (columns) are orthogonal to each other. For example, the 2 × 2 matrix

    [  cos θ  sin θ ]
    [ −sin θ  cos θ ]        (3.8.1)

is an orthogonal matrix for every real θ.

We will soon prove that a real symmetric matrix always has a set of n pairwise orthogonal eigenvectors.
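As a quick sanity check of the definition (an aside, assuming NumPy), the rotation matrix (3.8.1) is indeed orthogonal for any θ:

```python
import numpy as np

def rotation(theta):
    # The 2 x 2 orthogonal matrix of (3.8.1).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s],
                     [-s, c]])

# P^T P = P P^T = I, and hence P^{-1} = P^T, for any real theta.
for theta in (0.0, 0.3, 1.7, -2.5):
    P = rotation(theta)
    assert np.allclose(P.T @ P, np.eye(2))
    assert np.allclose(P @ P.T, np.eye(2))
    assert np.allclose(np.linalg.inv(P), P.T)
```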

If we take such a set of vectors, normalize them by dividing each by its length, and arrange them in the consecutive columns of a matrix P, then P will be an orthogonal matrix, and further we will have AP = PΛ. Since P^T = P^{-1}, we can multiply on the right by P^T and obtain

    A = P Λ P^T,        (3.8.2)

and this is the spectral theorem for a symmetric matrix A.

Conversely, if we can find an orthogonal matrix P such that P^T A P is a diagonal matrix D, then we will have found a complete set of pairwise orthogonal eigenvectors of A (the columns of P), and the eigenvalues of A (on the diagonal of D).

In this section we are going to describe a numerical procedure that will find such an orthogonal matrix, given a real symmetric matrix A. As soon as we prove that the method works, we will have proved the spectral theorem at the same time.
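Both directions of this statement can be seen numerically. The sketch below (an aside, assuming NumPy) uses `numpy.linalg.eigh`, the eigensolver specialized to symmetric matrices, which returns real eigenvalues and an orthogonal matrix of eigenvectors:

```python
import numpy as np

# A random 4 x 4 symmetric matrix (B + B^T is symmetric for any B).
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T

# Real eigenvalues, and eigenvectors in the columns of P.
lam, P = np.linalg.eigh(A)

assert np.allclose(P.T @ P, np.eye(4))          # P is orthogonal
assert np.allclose(P @ np.diag(lam) @ P.T, A)   # A = P Lam P^T, as in (3.8.2)
assert np.allclose(P.T @ A @ P, np.diag(lam))   # P^T A P is the diagonal D
```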

Hence the method is of theoretical as well as algorithmic importance. It is important to notice that we will not have to find an eigenvalue, then find a corresponding eigenvector, then find another eigenvalue and another vector, etc. Instead, the whole orthogonal matrix whose columns are the desired vectors will creep up on us at once.

The first thing we have to do is to describe some special orthogonal matrices that will be used in the algorithm.

Let n, p and q be given positive integers, with n ≥ 2 and p ≠ q, and let θ be a real number. We define the matrix J_pq(θ) by saying that J is just like the n × n identity matrix except that in the four positions that lie at the intersections of rows and columns p and q we find the entries (3.8.1).

More precisely, J_pq(θ) has in position [p, p] the entry cos θ, it has sin θ in the [p, q] entry, −sin θ in the [q, p] entry, cos θ in entry [q, q], and otherwise it agrees with the identity matrix, as shown below:

               [ 1                              ]
               [    ·                           ]
               [      cos θ   ⋯    sin θ        ]   ← row p
    J_pq(θ) =  [        ⋮     ·      ⋮          ]
               [     −sin θ   ⋯    cos θ        ]   ← row q
               [                        ·       ]
               [                           1    ]
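The definition above translates directly into code. Here is a minimal sketch (assuming NumPy, and using 0-based indices for p and q rather than the text's 1-based ones) that builds J_pq(θ) and confirms it is orthogonal:

```python
import numpy as np

def jacobi_rotation(n, p, q, theta):
    # J_pq(theta): the n x n identity matrix, except that the 2 x 2
    # rotation (3.8.1) is placed at the intersections of rows and
    # columns p and q (0-based indices here).
    J = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    J[p, p] = c
    J[p, q] = s
    J[q, p] = -s
    J[q, q] = c
    return J

J = jacobi_rotation(5, 1, 3, 0.7)
# Every matrix built this way is orthogonal: J^T J = I.
assert np.allclose(J.T @ J, np.eye(5))
```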
