
In this chapter we describe in detail the use of direct search optimization for investigating questions about the stability and accuracy of algorithms. We also describe interval analysis and survey other forms of automatic error analysis.

24.1. Exploiting Direct Search Optimization

Is Algorithm X numerically stable? How large can the growth factor be for Gaussian elimination (GE) with pivoting strategy P? By how much can condition estimator C underestimate the condition number of a matrix? These types of questions are common, as we have seen in this book. Usually, we attempt to answer such questions by a combination of theoretical analysis and numerical experiments with random and nonrandom data. But a third approach can be a valuable supplement to the first two: phrase the question as an optimization problem and apply a direct search method.

A direct search method for the problem

$$\max_{x \in \mathbb{R}^n} f(x), \qquad f: \mathbb{R}^n \to \mathbb{R}, \tag{24.1}$$

is a numerical method that attempts to locate a maximizing point using function values only and does not use or approximate derivatives of f.

Such methods are usually based on heuristics that do not involve assumptions about the function f. Various direct search methods have been developed; for surveys see Powell [838, 1970] and Swann [978, 1972], [979, 1974]. Most of these methods were developed in the 1960s, in the early years of numerical optimization. For problems in which f is smooth, direct search methods have largely been supplanted by more sophisticated optimization methods that use derivatives (such as quasi-Newton methods and conjugate gradient methods), but they continue to find use in applications where f is not differentiable, or even not continuous.

These applications range from chemical analysis [881, 1977], where direct search methods have found considerable use, to the determination of drug doses in the treatment of cancer [93, 1991]; in both applications the evaluation of f is affected by experimental errors.

Lack of smoothness of f, and the difficulty of obtaining derivatives when they exist, are characteristic of the optimization problems we consider here.

The use of direct search can be illustrated with the example of the growth factor for GE on A ∈ ℝ^{n×n},

$$\rho_n(A) = \frac{\max_{i,j,k} |a_{ij}^{(k)}|}{\max_{i,j} |a_{ij}|},$$

where the $a_{ij}^{(k)}$ are the intermediate elements generated during the elimination. We know from §9.2 that the growth factor governs the stability of GE, so for a given pivoting strategy we would like to know how big ρ_n(A) can be. To obtain an optimization problem of the form (24.1) we let x = vec(A) and we define f(x) := ρ_n(A).

Then we wish to determine

$$\max_x f(x).$$

Suppose, first, that no pivoting is done. Then f is defined and continuous at all points where the elimination does not break down, and it is differentiable except at points where there is a tie for the maximum in the numerator or denominator of the expression defining ρ_n(A). We took n = 4 and applied the direct search maximizer MDS (described in §24.2) to f(x), starting with the identity matrix A = I_4. In the optimizations of this section we used the convergence tests described in §24.2 with tol = 10^{-3}; there is no guarantee that when convergence is achieved it is to a local maximum (see §24.2). After 11 iterations and 433 function evaluations the maximizer converged, having located a matrix B for which ρ_4(B) = 1.23 × 10^5. (All numbers quoted are rounded to the number of significant figures shown.) The large growth is a consequence of the submatrix B(1:3, 1:3) being ill conditioned; B itself is well conditioned. Thus the optimizer readily shows that ρ_n(A) can be very large for GE without pivoting.
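To make the objective function concrete, here is a minimal MATLAB sketch of f for GE without pivoting, assuming column-major vec ordering; the function name growth_nopiv and the breakdown handling are our own illustrative choices, not code from the book:

    % growth_nopiv: growth factor rho_n(A) for GE without pivoting,
    % viewed as a function of x = vec(A) as in the text.
    function rho = growth_nopiv(x)
    n = round(sqrt(length(x)));
    A = reshape(x, n, n);              % recover A from x = vec(A)
    maxA = max(abs(A(:)));             % denominator: max_{i,j} |a_ij|
    maxelem = maxA;                    % running max of |a_ij^(k)|
    for k = 1:n-1
        if A(k,k) == 0, rho = Inf; return, end   % elimination breaks down
        i = k+1:n;
        m = A(i,k)/A(k,k);                       % multipliers
        A(i,k+1:n) = A(i,k+1:n) - m*A(k,k+1:n);  % update reduced matrix
        maxelem = max(maxelem, max(max(abs(A(i,k+1:n)))));
    end
    rho = maxelem/maxA;

A direct search maximizer is then applied to this function, for example starting from x = vec(I_4) as in the experiment above.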

Next, consider GE with partial pivoting (GEPP). Because the elimination cannot break down, f is now defined everywhere, but it is usually discontinuous when there is a tie in the choice of pivot element, because then an arbitrarily small change in A can alter the pivot sequence. We applied the maximizer MDS to f, this time starting with the orthogonal matrix A with

$$a_{ij} = \frac{2}{\sqrt{2n+1}} \sin\!\Big(\frac{2ij\pi}{2n+1}\Big)$$

(cf. (9.11)), for which ρ_4(A) = 2.32; this matrix is orthog(n, 2) from the Test Matrix Toolbox (see Appendix E). After 29 iterations and 1169 function evaluations the maximizer converged to a matrix B with ρ_4(B) = 5.86.

We used this matrix to start the maximizer AD (described in §24.2); it took 5 iterations and 403 function evaluations to converge to a matrix C for which ρ_4(C) = 7.939. This is one of the matrices described in Theorem 9.6, modulo the convergence tolerance.
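For the GEPP experiments the objective function needs the elimination coded with row interchanges. A hypothetical variant of growth_nopiv (again our own illustrative code, not from the book):

    % growth_pp: growth factor rho_n(A) for GE with partial pivoting.
    function rho = growth_pp(x)
    n = round(sqrt(length(x)));
    A = reshape(x, n, n);
    maxA = max(abs(A(:)));
    maxelem = maxA;
    for k = 1:n-1
        [~, p] = max(abs(A(k:n,k)));     % partial pivoting: largest
        p = p + k - 1;                   % element in column k
        A([k p],:) = A([p k],:);         % interchange rows k and p
        if A(k,k) ~= 0                   % zero pivot: column already zero
            i = k+1:n;
            m = A(i,k)/A(k,k);           % multipliers, |m| <= 1
            A(i,k+1:n) = A(i,k+1:n) - m*A(k,k+1:n);
            maxelem = max(maxelem, max(max(abs(A(i,k+1:n)))));
        end
    end
    rho = maxelem/maxA;

The ties in the pivot choice that make f discontinuous correspond exactly to ties in the max on the first line of the loop.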

These examples, and others presented below, illustrate the following attractions of using direct search methods to investigate the stability of a numerical computation.

(1) The simplest possible formulation of the optimization problem is often sufficient to yield useful results. Derivatives are not needed, and direct search methods tend to be insensitive to lack of smoothness in the objective function f. Unboundedness of f is a favorable property: direct search methods usually quickly locate large values of f.

(2) Good progress can often be made from simple starting values, such as an identity matrix. However, prior knowledge of the problem may provide a good starting value that can be substantially improved (as in the partial pivoting example).

(3) Usually it is the global maximum of f in (24.1) that is desired (although it is often sufficient to know that f can exceed a specified value). When a direct search method converges it will, in general, at best have located a local maximum, and in practice the maximizer may simply have stagnated, particularly if a slack convergence tolerance is used.

However, further progress can often be made by restarting the same (or a different) maximizer, as in the partial pivoting example. This is because for methods that employ a simplex (such as the MDS method), the behaviour of the method starting at x_0 is determined not just by x_0, but also by the n + 1 vectors in the initial simplex constructed at x_0.

(4) The numerical information revealed by direct search provides a starting point for further theoretical analysis. For example, the GE experiments above strongly suggest the (well-known) results that ρ_n(A) is unbounded without pivoting and bounded by 2^{n-1} for partial pivoting, and inspection of the numerical data suggests the methods of proof.

When applied to smooth problems the main disadvantages of direct search methods are that they have at best a linear rate of convergence and they are unable to determine the nature of the point at which they terminate (since derivatives are not calculated).

These disadvantages are less significant for the problems we consider, where it is not necessary to locate a maximum to high accuracy and objective functions are usually nonsmooth. (Note that these disadvantages are not necessarily shared by methods that implicitly or explicitly estimate derivatives using function values, such as methods based on conjugate directions, for which see Powell [838, 1970], [839, 1975]; however, these are not normally regarded as direct search methods.)

A final attraction of direct search is that it can be used to test the correctness of an implementation of a stable algorithm. The software in question can be used in its original form and does not have to be translated into some other representation.

24.2. Direct Search Methods

For several years I have been using MATLAB implementations of three direct search methods. The first is the alternating directions (AD) method (also known as the coordinate search method). Given a starting value x it attempts to solve the problem (24.1) by repeatedly maximizing over each coordinate direction in turn:

    repeat
        % One iteration comprises a loop over all components of x.
        for i = 1:n
            find α such that f(x + α e_i) is maximized (line search)
            x ← x + α e_i
        end
    until converged

AD is one of the simplest of all optimization methods and the fundamental weakness that it ignores any interactions between the variables is well known. Despite the poor reputation of AD we have found that it can perform well on the types of problems considered here.

In our MATLAB implementation of AD the line search is done using a crude scheme that begins by evaluating f(x + h e_i) with h = 10^{-4} x_i (or h = 10^{-4} max_j |x_j| if x_i = 0); if f(x + h e_i) < f(x) then the sign of h is reversed. Then if f(x + h e_i) > f(x), h is doubled at most 25 times until no further increase in f is obtained. Our convergence test checks for a sufficient relative increase in f between one iteration and the next: convergence is declared when

$$f_k - f_{k-1} < \mathrm{tol} \cdot |f_{k-1}|, \tag{24.2}$$

where f_k is the highest function value at the end of the kth iteration.
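The description above translates directly into code. The following MATLAB sketch is our reading of the scheme, not the author's implementation; in particular the bookkeeping of the step doubling and the guards for degenerate cases are our own choices:

    % ad_maximize: sketch of the AD maximizer with the crude line search
    % and the convergence test (24.2) described above.
    function [x, fmax] = ad_maximize(f, x, tol)
    n = length(x);
    fmax = f(x);
    while true
        fold = fmax;
        for i = 1:n                        % line search along e_i
            h = 1e-4*x(i);
            if x(i) == 0, h = 1e-4*max(abs(x)); end
            if h == 0, h = 1e-4; end       % guard (ours): x is all zero
            xt = x; xt(i) = xt(i) + h;
            if f(xt) < fmax                % reverse the sign of h
                h = -h; xt(i) = x(i) + h;
            end
            for d = 1:25                   % double h at most 25 times
                ft = f(xt);
                if ft <= fmax, break, end  % no further increase in f
                fmax = ft; x = xt;
                h = 2*h; xt(i) = x(i) + h;
            end
        end
        % Convergence test (24.2); the equality check (ours) avoids an
        % infinite loop when fold = 0.
        if fmax - fold < tol*abs(fold) || fmax == fold, return, end
    end

A call such as [x, rho] = ad_maximize(@growth_nopiv, x0, 1e-3), with x0 the vec of a chosen starting matrix, follows the pattern of the experiments in §24.1 (with no claim of reproducing their exact results).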

The AD method has the very modest storage requirement of just a single n-vector.

The second method is the multidirectional search method (MDS) of Dennis and Torczon [1008, 1989], [1009, 1991], [301, 1991]. This method employs a simplex, which is defined by n + 1 vectors v_0, v_1, ..., v_n. One iteration in the case n = 2 is represented pictorially in Figure 24.1, and may be explained as follows.

Figure 24.1. The possible steps in one iteration of the MDS method when n = 2.

The initial simplex is {v_0, v_1, v_2} and it is assumed that f(v_0) = max_i f(v_i). The purpose of an iteration is to produce a new simplex at one of whose vertices f exceeds f(v_0). In the first step the vertices v_1 and v_2 are reflected about v_0

along the lines joining them to v_0, yielding r_1 and r_2 and the reflected simplex {v_0, r_1, r_2}. If this reflection step is successful, that is, if max_i f(r_i) > f(v_0), then the edges from v_0 to r_i are doubled in length to give an expanded simplex {v_0, e_1, e_2}. The original simplex is then replaced by {v_0, e_1, e_2} if max_i f(e_i) > max_i f(r_i), and otherwise by {v_0, r_1, r_2}. If the reflection step is unsuccessful then the edges v_0 - v_i of the original simplex are shrunk to half their length to give the contracted simplex {v_0, c_1, c_2}. This becomes the new simplex if max_i f(c_i) > max_i f(v_i), in which case the current iteration is complete; otherwise the algorithm jumps back to the reflection step, now working with the contracted simplex.
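The geometry of the three steps is simple: r_i = 2v_0 - v_i, e_i = 3v_0 - 2v_i, and c_i = (v_0 + v_i)/2. A MATLAB sketch of one MDS iteration for maximization, written from the description above (the vertex layout and function names are our own; this is not the Dennis and Torczon code):

    % mds_iteration: one iteration of the MDS method.  V is n-by-(n+1),
    % its columns the simplex vertices; f is a function handle.
    function V = mds_iteration(f, V)
    n = size(V,1);
    fv = zeros(1, n+1);
    for j = 1:n+1, fv(j) = f(V(:,j)); end
    [~, jmax] = max(fv);
    V(:,[1 jmax]) = V(:,[jmax 1]);        % ensure f(v_0) = max_i f(v_i)
    fv([1 jmax]) = fv([jmax 1]);
    v0 = V(:,1);
    while true                            % a full implementation would also
                                          % bound this loop by simplex size
        R = [v0, 2*v0 - V(:,2:end)];      % reflected simplex
        fr = colvals(f, R);
        if max(fr) > fv(1)                % reflection step successful
            E = [v0, 3*v0 - 2*V(:,2:end)];    % expanded simplex
            fe = colvals(f, E);
            if max(fe) > max(fr), V = E; else, V = R; end
            return
        end
        C = [v0, (v0 + V(:,2:end))/2];    % contracted simplex
        fc = colvals(f, C);
        V = C;
        if max(fc) > max(fv), return, end % iteration complete
        fv = [fv(1), fc];                 % otherwise retry reflection from C
    end

    function fvals = colvals(f, V)        % f applied to columns 2:end of V
    fvals = zeros(1, size(V,2)-1);
    for j = 2:size(V,2), fvals(j-1) = f(V(:,j)); end

Repeated calls to mds_iteration, together with a convergence test such as (24.2) applied to f(v_0), give the complete maximizer.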
