for (i=1;i<=n;i++) {                /* Compute ∇f ≈ (Q · R)^T · F for the line search. */
    for (sum=0.0,j=1;j<=n;j++) sum += qt[i][j]*fvec[j];
    g[i]=sum;
}
for (i=n;i>=1;i--) {
    for (sum=0.0,j=1;j<=i;j++) sum += r[j][i]*g[j];
    g[i]=sum;
}
for (i=1;i<=n;i++) {                /* Store x and F. */
    xold[i]=x[i];
    fvcold[i]=fvec[i];
}
fold=f;                             /* Store f. */
for (i=1;i<=n;i++) {                /* Right-hand side for linear equations is -Q^T · F. */
    for (sum=0.0,j=1;j<=n;j++) sum += qt[i][j]*fvec[j];
    p[i] = -sum;
}
rsolv(r,n,d,p);                     /* Solve linear equations. */
lnsrch(n,xold,fold,g,p,x,&f,stpmax,check,fmin);
                                    /* lnsrch returns new x and f. */
                                    /* It also calculates fvec at the new x when it calls fmin. */
test=0.0;                           /* Test for convergence on function values. */
for (i=1;i<=n;i++)
    if (fabs(fvec[i]) > test) test=fabs(fvec[i]);
if (test < TOLF) {
    *check=0;
    FREERETURN
}
if (*check) {                       /* True if line search failed to find a new x. */
    if (restrt) FREERETURN          /* Failure; already tried reinitializing the Jacobian. */
    else {                          /* Check for gradient of f zero, i.e., spurious convergence. */
        test=0.0;
        den=FMAX(f,0.5*n);
        for (i=1;i<=n;i++) {
            temp=fabs(g[i])*FMAX(fabs(x[i]),1.0)/den;
            if (temp > test) test=temp;
        }
        if (test < TOLMIN) FREERETURN
        else restrt=1;              /* Try reinitializing the Jacobian. */
    }
} else {                            /* Successful step; will use Broyden update for next step. */
    restrt=0;
    test=0.0;                       /* Test for convergence on δx. */
    for (i=1;i<=n;i++) {
        temp=(fabs(x[i]-xold[i]))/FMAX(fabs(x[i]),1.0);
        if (temp > test) test=temp;
    }
    if (test < TOLX) FREERETURN
}

Sample page from NUMERICAL RECIPES IN C: THE ART OF SCIENTIFIC COMPUTING (ISBN 0-521-43108-5). Copyright (C) 1988-1992 by Cambridge University Press. Programs Copyright (C) 1988-1992 by Numerical Recipes Software.

9.7 Globally Convergent Methods for Nonlinear Systems of Equations

More Advanced Implementations

One of the principal ways that the methods described so far can fail is if J (in Newton's method) or B (in Broyden's method) becomes singular or nearly singular, so that δx cannot be determined. If you are lucky, this situation will not occur very often in practice. Methods developed so far to deal with this problem involve monitoring the condition number of J and perturbing J if singularity or near singularity is detected. This is most easily implemented if the QR decomposition is used instead of LU in Newton's method (see [1] for details). Our personal experience is that, while such an algorithm can solve problems where J is exactly singular and the standard Newton's method fails, it is occasionally less robust on other problems where LU decomposition succeeds. Clearly, implementation details involving roundoff, underflow, etc., are important here, and the last word is yet to be written.

Our global strategies both for minimization and zero finding have been based on line searches. Other global algorithms, such as the hook step and dogleg step methods, are based instead on the model-trust region approach, which is related to the Levenberg-Marquardt algorithm for nonlinear least-squares (§15.5). While somewhat more complicated than line searches, these methods have a reputation for robustness even when starting far from the desired zero or minimum [1].

CITED REFERENCES AND FURTHER READING:
Dennis, J.E., and Schnabel, R.B. 1983, Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Englewood Cliffs, NJ: Prentice-Hall). [1]
Broyden, C.G. 1965, Mathematics of Computation, vol. 19, pp. 577–593. [2]