
Two new unconstrained optimization algorithms which use function and gradient values

  • Contributed Papers
  • Journal of Optimization Theory and Applications

Abstract

Two new methods for unconstrained optimization are presented. Both employ a hybrid direction strategy that modifies Powell's 1970 dogleg strategy, together with a projection technique introduced in Davidon's 1975 algorithm, which uses projected images of Δx and Δg to update the approximate Hessian. The first method uses Davidon's optimally conditioned update formula; the second uses only the BFGS update. Both methods performed well without Powell's special iterations and singularity safeguards, and the numerical results are very promising.
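For a concrete picture of the building blocks named above, the sketch below (Python with NumPy) shows a generic single-dogleg step between the steepest-descent and quasi-Newton directions, together with a standard BFGS update of the approximate Hessian. This is only an illustration of those two ingredients under the usual trust-region assumptions; it is not the authors' algorithm, and it omits the paper's modifications (the hybrid direction strategy, Davidon's projection of Δx and Δg, and the optimally conditioned update). The names dogleg_step and bfgs_update are illustrative.

    import numpy as np

    def dogleg_step(g, B, radius):
        """Single-dogleg trust-region step for the model m(p) = g@p + 0.5*p@B@p."""
        p_newton = -np.linalg.solve(B, g)               # full quasi-Newton step
        if np.linalg.norm(p_newton) <= radius:
            return p_newton
        p_cauchy = -(g @ g) / (g @ B @ g) * g           # unconstrained Cauchy point along -g
        if np.linalg.norm(p_cauchy) >= radius:
            return -(radius / np.linalg.norm(g)) * g    # truncated steepest-descent step
        # Walk the leg from the Cauchy point toward the Newton point until it
        # hits the trust-region boundary: solve ||p_cauchy + t*d|| = radius for t.
        d = p_newton - p_cauchy
        a = d @ d
        b = 2.0 * (p_cauchy @ d)
        c = p_cauchy @ p_cauchy - radius ** 2
        t = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
        return p_cauchy + t * d

    def bfgs_update(B, s, y):
        """BFGS update of the Hessian approximation B from s = Δx and y = Δg."""
        sy = s @ y
        if sy <= 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            return B                                    # skip update if curvature is not positive
        Bs = B @ s
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

In a full trust-region method, the step from dogleg_step would be accepted or rejected by comparing actual to predicted reduction in the objective, the radius adjusted accordingly, and B revised with bfgs_update between iterations.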


References

  1. Powell, M. J. D., A Hybrid Method for Nonlinear Equations, Numerical Methods for Nonlinear Algebraic Equations, Edited by P. Rabinowitz, Gordon and Breach, London, England, 1970.

  2. Powell, M. J. D., A Fortran Subroutine for Solving Systems of Nonlinear Algebraic Equations, Numerical Methods for Nonlinear Algebraic Equations, Edited by P. Rabinowitz, Gordon and Breach, London, England, 1970.

  3. Powell, M. J. D., A Fortran Subroutine for Unconstrained Minimization, Requiring First Derivatives of the Objective Function, Atomic Energy Research Establishment, Harwell, England, Report No. R-6469, 1970.

  4. Powell, M. J. D., A New Algorithm for Unconstrained Optimization, Nonlinear Programming, Edited by J. B. Rosen, O. L. Mangasarian, and K. Ritter, Academic Press, New York, New York, 1970.

  5. Davidon, W. C., New Optimization Algorithms Without Linear Searches, Mathematical Programming, Vol. 9, pp. 1–30, 1975.

  6. Fletcher, R., A New Approach to Variable Metric Algorithms, Computer Journal, Vol. 13, pp. 317–322, 1970.

  7. Gill, P. E., Murray, W., and Pitfield, R. A., The Implementation of Two Revised Quasi-Newton Algorithms for Unconstrained Optimization, National Physical Laboratory, Teddington, England, Report No. NAC-11, 1972.

  8. Dixon, L. C. W., All the Quasi-Newton Family Generate Identical Points, Journal of Optimization Theory and Applications, Vol. 10, pp. 34–40, 1972.

Additional information

Communicated by H. Y. Huang

This research was supported by the National Science Foundation under Grant No. GJ-40903.


Cite this article

Dennis, J.E., Mei, H.H.W. Two new unconstrained optimization algorithms which use function and gradient values. J Optim Theory Appl 28, 453–482 (1979). https://doi.org/10.1007/BF00932218

