Abstract
We present a feasible directions algorithm, based on Lagrangian concepts, for the solution of the nonlinear programming problem with equality and inequality constraints. At each iteration a descent direction is defined; by modifying it, we obtain a feasible descent direction. The line search procedure ensures both the global convergence of the method and the feasibility of all the iterates.
We prove the global convergence of the algorithm and apply it to the solution of some test problems. Although the present version of the algorithm does not use second-order information, as quasi-Newton methods do, the numerical results exhibit a behavior comparable to that of the best methods currently known for nonlinear programming.
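To make the abstract's description concrete, the sketch below illustrates the general feasible-directions idea in a minimal form: a descent direction is first computed, then bent away from the nearly active constraints so that it is also a feasible direction, and a backtracking line search keeps every iterate feasible. This is only an illustrative toy, not the paper's two-stage algorithm; the bending parameter `rho`, the activity tolerance `eps`, the Armijo constants, and the small test problem are all assumptions introduced for the example.

```python
import numpy as np

def feasible_directions(f, grad_f, g, grad_g, x0, rho=1.0, eps=1e-2,
                        beta=0.5, sigma=1e-4, tol=1e-6, max_iter=500):
    """Toy feasible-directions iteration (illustrative sketch only).

    Minimizes f subject to g_i(x) <= 0, starting from a feasible x0.
    Step 1: take the descent direction d = -grad f and bend it away
            from the nearly active constraints so it stays feasible.
    Step 2: Armijo backtracking that also enforces feasibility of
            every trial point, so all iterates remain feasible.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = -grad_f(x)
        # Bend d away from constraints that are nearly active.
        for gi, dgi in zip(g(x), grad_g(x)):
            if gi > -eps:                      # near-active constraint
                d -= rho * dgi
        if np.linalg.norm(d) < tol:            # stationarity reached
            break
        t = 1.0
        # Backtrack until the step is feasible and gives sufficient decrease.
        while (np.any(g(x + t * d) > 0.0)
               or f(x + t * d) > f(x) + sigma * t * (grad_f(x) @ d)):
            t *= beta
            if t < 1e-12:                      # no acceptable step found
                return x
        x = x + t * d
    return x

# Toy test problem: min x1^2 + x2^2  s.t.  x1 + x2 >= 1,
# whose solution is (0.5, 0.5) with multiplier 1.
f = lambda x: x[0]**2 + x[1]**2
grad_f = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])
g = lambda x: np.array([1.0 - x[0] - x[1]])        # g(x) <= 0
grad_g = lambda x: np.array([[-1.0, -1.0]])
x = feasible_directions(f, grad_f, g, grad_g, [1.0, 1.0])  # -> [0.5, 0.5]
```

Note how the bent direction vanishes exactly when the bending weight `rho` matches the Lagrange multiplier of the active constraint, which is why a Lagrangian-based choice of the direction, as in the paper, is natural.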
Additional information
Research performed while the author was on a two-year appointment at INRIA, Rocquencourt, France, and partially supported by the Brazilian Research Council (CNPq).
Cite this article
Herskovits, J. A two-stage feasible directions algorithm for nonlinear constrained optimization. Mathematical Programming 36, 19–38 (1986). https://doi.org/10.1007/BF02591987