Abstract
In this paper we give two derivative-free computational algorithms for nonlinear least squares approximation. The algorithms are finite difference analogues of the Levenberg-Marquardt and Gauss methods. Local convergence theorems for the algorithms are proven. In the special case when the residuals are zero at the minimum, we show that certain computationally simple choices of the parameters lead to quadratic convergence. Numerical examples are included.
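To make the idea concrete, below is a minimal sketch of the kind of iteration the abstract describes: a Levenberg-Marquardt step in which the Jacobian is replaced by a forward-difference approximation, with the difference step tied to the current residual norm so that it shrinks as the residuals approach zero. The function names, the damping parameter, and the step-size rule are illustrative assumptions for exposition, not the paper's exact parameter choices; setting the damping parameter to zero gives the corresponding Gauss (Gauss-Newton) analogue.

```python
import numpy as np

def fd_jacobian(f, x, h):
    """Forward-difference approximation to the Jacobian of f at x.

    f : callable returning the m-vector of residuals
    x : parameter vector of length n
    h : scalar difference step
    """
    fx = f(x)
    J = np.empty((fx.size, x.size))
    for j in range(x.size):
        e = np.zeros(x.size)
        e[j] = h
        J[:, j] = (f(x + e) - fx) / h
    return J

def fd_levenberg_marquardt(f, x0, lam=1e-3, tol=1e-10, max_iter=100):
    """Illustrative derivative-free Levenberg-Marquardt iteration.

    The difference step h is taken proportional to the residual norm
    (an illustrative choice); lam is the damping parameter, and lam = 0
    reduces the step to a finite-difference Gauss iteration.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        h = max(np.linalg.norm(fx), 1e-12)  # step shrinks with the residual
        J = fd_jacobian(f, x, h)
        # Solve the damped normal equations (J^T J + lam I) dx = -J^T f
        A = J.T @ J + lam * np.eye(x.size)
        dx = np.linalg.solve(A, -J.T @ fx)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

if __name__ == "__main__":
    # Zero-residual example: f(x) = (x1 - 1, 10 (x2 - x1^2)) vanishes at (1, 1)
    f = lambda x: np.array([x[0] - 1.0, 10.0 * (x[1] - x[0] ** 2)])
    print(fd_levenberg_marquardt(f, np.array([-1.2, 1.0])))
```

In the zero-residual case sketched above, the residual norm (and hence the difference step) goes to zero along with the error, which is the mechanism behind the quadratic convergence claimed in the abstract.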
Additional information
On leave 1970–71 at Yale University
The work of this author was supported in part by the National Science Foundation under Grant GJ-844.
Cite this article
Brown, K.M., Dennis, J.E. Derivative free analogues of the Levenberg-Marquardt and Gauss algorithms for nonlinear least squares approximation. Numer. Math. 18, 289–297 (1971). https://doi.org/10.1007/BF01404679