
A family of spectral gradient methods for optimization

Computational Optimization and Applications

Abstract

We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the long Barzilai–Borwein (BB) stepsize and the short BB stepsize. Each member of the family is shown to share a certain quasi-Newton property in the sense of least squares. The family also includes some other gradient methods as special cases. We prove that the family of methods is R-superlinearly convergent for two-dimensional strictly convex quadratics; moreover, the family is R-linearly convergent for strictly convex quadratics of any dimension. Numerical results for the family under different settings are presented and demonstrate that the proposed family is promising.
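To make the construction concrete, the following minimal sketch applies a BB-type gradient method to the strictly convex quadratic f(x) = (1/2)x'Ax - b'x, combining the long BB stepsize s's/(s'y) and the short BB stepsize s'y/(y'y) with a fixed convex weight gamma in [0, 1]. This is an illustration under stated assumptions, not the paper's algorithm: the fixed weight gamma, the choice of first stepsize, and the name spectral_gradient_quadratic are ours, whereas the family analyzed in the paper determines the combination parameter more carefully.

import numpy as np

def spectral_gradient_quadratic(A, b, x0, gamma=0.5, tol=1e-8, max_iter=10000):
    # Gradient iteration x_{k+1} = x_k - alpha_k * g_k for f(x) = 0.5*x'Ax - b'x,
    # with alpha_k a convex combination of the two Barzilai-Borwein stepsizes.
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # simple choice for the first stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s = x_new - x                  # iterate difference s_{k-1}
        y = g_new - g                  # gradient difference y_{k-1}
        sy = s @ y                     # positive for strictly convex quadratics
        alpha_bb1 = (s @ s) / sy       # long BB stepsize
        alpha_bb2 = sy / (y @ y)       # short BB stepsize
        alpha = gamma * alpha_bb1 + (1 - gamma) * alpha_bb2
        x, g = x_new, g_new
    return x

Setting gamma = 1 recovers the long-BB method and gamma = 0 the short-BB method; for instance, spectral_gradient_quadratic(np.diag([1.0, 10.0, 100.0]), np.ones(3), np.zeros(3), gamma=0.5) minimizes a small ill-conditioned quadratic with an intermediate member of the family.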



Acknowledgements

The authors are very grateful to the associate editor and the two anonymous referees whose suggestions and comments greatly improved the quality of this paper.

Author information


Corresponding author

Correspondence to Yakui Huang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work was supported by the National Natural Science Foundation of China (Grant Nos. 11631013, 11701137, 11671116) and the National 973 Program of China (Grant No. 2015CB856002).


About this article


Cite this article

Dai, Y.H., Huang, Y. & Liu, X.W. A family of spectral gradient methods for optimization. Comput. Optim. Appl. 74, 43–65 (2019). https://doi.org/10.1007/s10589-019-00107-8
