Abstract
In this paper, we suggest approximations for smoothing out the kinks caused by the presence of "max" or "min" operators in many non-smooth optimization problems. We concentrate on the continuous-discrete min–max optimization problem. The new approximations replace the original problem in some neighborhoods of the kink points. These neighborhoods can be made arbitrarily small, thus leaving the original objective function unchanged at almost every point of R^n. Furthermore, the maximal possible difference between the optimal values of the approximate problem and the original one is determined a priori by fixing the value of a single parameter. The approximations introduced preserve properties such as convexity and continuous differentiability, provided that each function composing the original problem has the same properties. This enables the use of efficient gradient techniques in the solution process. Some numerical examples are presented.
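The flavor of such a smoothing can be illustrated on the two-function case, using the identity max(a, b) = (a + b + |a − b|)/2 and replacing |t| by a quadratic on the neighborhood |t| ≤ β. This is a sketch with a Huber-type kernel of our own choosing, not the paper's actual approximating functions; the parameter β plays the role of the single error-controlling parameter described above:

```python
def smooth_abs(t, beta):
    """C^1, convex approximation of |t|.
    Equals |t| exactly outside the neighborhood |t| < beta."""
    if abs(t) >= beta:
        return abs(t)
    # Quadratic piece; matches |t| in value and slope at t = +/- beta.
    return t * t / (2.0 * beta) + beta / 2.0

def smooth_max(a, b, beta):
    """Smooth approximation of max(a, b).
    Coincides with max(a, b) whenever |a - b| >= beta;
    the worst-case error, attained at a = b, is beta / 4."""
    return 0.5 * (a + b + smooth_abs(a - b, beta))
```

Note how the two claims from the abstract appear here: the approximation differs from the original only on an arbitrarily small neighborhood of the kink (|a − b| < β), and the maximal deviation (β/4) is fixed a priori by the single parameter β.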
Cite this article
Zang, I. A smoothing-out technique for min—max optimization. Mathematical Programming 19, 61–77 (1980). https://doi.org/10.1007/BF01581628