Dynamic smoothness parameter for fast gradient methods

  • Original Paper
  • Published in Optimization Letters

Abstract

We present and computationally evaluate a variant of Nesterov's fast gradient method that can exploit information, even approximate, about the optimal value of the problem. Such information is available in some applications, for instance in the computation of bounds for hard integer programs. We show that dynamically adjusting the smoothness parameter of the algorithm using this information yields a better convergence profile in practice.
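
The idea can be made concrete with a small sketch. The code below illustrates the general principle only, not the exact scheme analysed in the paper: it runs a Nesterov-type fast gradient method on a Huber-smoothed ℓ1 objective and, at every iteration, resets the smoothness parameter μ in proportion to the gap between the current objective value and a supplied estimate f_target of the optimal value. The test problem, the proportional rule mu = gap / theta, and all names and constants (theta, mu_min, dynamic_fast_gradient) are assumptions made here for illustration.

```python
# Minimal sketch (not the authors' exact scheme): Nesterov-style acceleration
# on a Huber-smoothed l1 objective f(x) = ||Ax - b||_1, where the smoothness
# parameter mu is re-tuned from the gap f(x_k) - f_target at every iteration.
# The rule mu = gap / theta and the constants are illustrative assumptions.
import numpy as np

def huber_value_grad(r, mu):
    """Huber smoothing of |.| with parameter mu: value and derivative."""
    small = np.abs(r) <= mu
    val = np.where(small, r**2 / (2 * mu), np.abs(r) - mu / 2).sum()
    grad = np.clip(r / mu, -1.0, 1.0)
    return val, grad

def dynamic_fast_gradient(A, b, f_target, iters=500, theta=10.0, mu_min=1e-6):
    """Accelerated gradient on the smoothed objective, re-tuning mu from the
    estimated gap to the (approximately known) optimal value f_target."""
    n = A.shape[1]
    L_A = np.linalg.norm(A, 2) ** 2          # ||A||_2^2, so that L_mu = ||A||^2 / mu
    x = y = np.zeros(n)
    t = 1.0
    for _ in range(iters):
        r = A @ y - b
        f_true = np.abs(r).sum()              # true (nonsmooth) objective at y
        # dynamic smoothness: coarse smoothing while far from the target value,
        # finer smoothing (smaller mu) as the estimated gap shrinks
        mu = max((f_true - f_target) / theta, mu_min)
        _, g = huber_value_grad(r, mu)
        grad = A.T @ g
        x_new = y - (mu / L_A) * grad         # gradient step of length 1/L_mu = mu/||A||^2
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 30))
    b = A @ rng.standard_normal(30)           # consistent system: optimal value is 0
    x = dynamic_fast_gradient(A, b, f_target=0.0)
    print("final ||Ax - b||_1 =", np.abs(A @ x - b).sum())
```

The effect of the dynamic rule shows up in the step length mu / ||A||^2: while the estimated gap is large the smoothing is coarse and the steps are long, and the smoothing is refined automatically as the gap to f_target closes, without fixing a target accuracy in advance.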


References

  1. Ahookhosh, M., Neumaier, A.: Optimal subgradient algorithms for large-scale convex optimization in simple domains. Numer. Algorithms (2017). https://doi.org/10.1007/s11075-017-0297-x

  2. Beck, A., Teboulle, M.: Smoothing and first order methods: a unified framework. SIAM J. Optim. 22(2), 557–580 (2012)

  3. Bot, R., Hendrich, C.: A variable smoothing algorithm for solving convex optimization problems. TOP 23(1), 124–150 (2015)

  4. Chambolle, A., Pock, T.: A first-order primal-dual algorithm for convex problems with applications to imaging. J. Math. Imaging Vis. 40(1), 120–145 (2011)

  5. Chouman, M., Crainic, T., Gendron, B.: Commodity representations and cut-set-based inequalities for multicommodity capacitated fixed-charge network design. Transp. Sci. 51(2), 650–667 (2017)

  6. d’Antonio, G., Frangioni, A.: Convergence analysis of deflected conditional approximate subgradient methods. SIAM J. Optim. 20(1), 357–386 (2009)

  7. Frangioni, A.: Generalized bundle methods. SIAM J. Optim. 13(1), 117–156 (2002)

  8. Frangioni, A., Gorgone, E.: Generalized bundle methods for sum-functions with “easy” components: applications to multicommodity network design. Math. Program. 145(1), 133–161 (2014)

  9. Frangioni, A., Gorgone, E., Gendron, B.: On the computational efficiency of subgradient methods: a case study with Lagrangian bounds. Math. Program. Comput. (2017). https://doi.org/10.1007/s12532-017-0120-7

  10. Hiriart-Urruty, J.B., Lemaréchal, C.: Convex Analysis and Minimization Algorithms II: Advanced Theory and Bundle Methods. Grundlehren der mathematischen Wissenschaften, vol. 306. Springer, New York (1993)

  11. Lan, G., Zhou, Y.: Approximation accuracy, gradient methods, and error bound for structured convex optimization. Technical report, University of Florida (2014)

  12. Nesterov, Y.: Primal-dual subgradient methods for convex optimization. SIAM J. Optim. 12, 109–138 (2001)

  13. Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103, 127–152 (2005)

  14. Shor, N.: Minimization Methods for Nondifferentiable Functions. Springer, Berlin (1985)

Author information

Corresponding author

Correspondence to Enrico Gorgone.

About this article

Cite this article

Frangioni, A., Gendron, B. & Gorgone, E. Dynamic smoothness parameter for fast gradient methods. Optim Lett 12, 43–53 (2018). https://doi.org/10.1007/s11590-017-1168-z
