
On the Computational Efficiency of Catalyst Accelerated Coordinate Descent

Conference paper
Mathematical Optimization Theory and Operations Research (MOTOR 2021)

Abstract

This article is devoted to a particular case of using universal accelerated proximal envelopes to obtain computationally efficient accelerated versions of methods for solving various optimization problems. We propose a proximally accelerated coordinate descent method that achieves an efficient per-iteration algorithmic complexity and takes advantage of data sparsity. As an example, we apply the proposed approach to the optimization of a SoftMax-like function, for which the described method weakens the dependence of the computational complexity on the dimension \(n\) by a factor of \(\mathcal{O}(\sqrt{n})\) and, in practice, demonstrates faster convergence than standard methods. As a further application of the proposed approach, we show how to obtain efficient methods for optimizing Markov Decision Processes (MDP) in a minimax formulation with a Nesterov-smoothed objective function.
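To make the scheme concrete: a standard instance of the SoftMax-like objective mentioned above is the log-sum-exp function \(f(x) = \log \sum_{j=1}^{m} \exp((Ax)_j) - \langle b, x \rangle\), which is the Nesterov smoothing of \(\max_j (Ax)_j\) (up to scaling by a smoothing parameter). The sketch below is our illustration, not the authors' implementation: an outer Catalyst-style proximal-envelope loop repeatedly calls a randomized coordinate-descent solver on the regularized subproblem \(f(x) + \frac{\kappa}{2}\Vert x - y_k \Vert^2\), maintaining the product \(Ax\) so that each coordinate step touches only the nonzeros of one column of \(A\). The function names, the simplified momentum rule, and the coordinate Lipschitz bounds \(L_i = \Vert A_{:,i} \Vert_2^2\) are assumptions of this sketch; the exact parameter sequences and complexity bounds are in the paper.

# Illustrative sketch only (assumed names, simplified momentum), not the
# authors' implementation: a Catalyst-style outer loop around randomized
# coordinate descent for f(x) = log(sum_j exp((A x)_j)) - <b, x>.
import numpy as np
import scipy.sparse as sp

def softmax(z):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

def inner_cd(A_csc, b, y, kappa, n_steps, rng):
    # Approximately minimize f(x) + (kappa/2)||x - y||^2 by randomized
    # coordinate descent, warm-started at y; z = A x is kept up to date so
    # a coordinate step only touches the nonzeros of column i.
    x = y.copy()
    z = A_csc @ x
    # Safe coordinate Lipschitz bounds for log-sum-exp: L_i <= ||A[:, i]||_2^2.
    L = np.asarray(A_csc.multiply(A_csc).sum(axis=0)).ravel()
    for _ in range(n_steps):
        i = rng.integers(x.size)
        col = A_csc[:, i]                  # sparse column: .indices, .data
        p = softmax(z)                     # O(m) here; a full implementation
                                           # would maintain this incrementally
        g = p[col.indices] @ col.data - b[i] + kappa * (x[i] - y[i])
        step = g / (L[i] + kappa)
        x[i] -= step
        z[col.indices] -= step * col.data  # incremental update of A x
    return x

def catalyst_cd(A, b, x0, kappa=1.0, n_outer=50, n_inner=500, seed=0):
    # Outer proximal-envelope loop; the exact Catalyst alpha-sequence is
    # replaced by a simplified Nesterov-style momentum for readability.
    A_csc = sp.csc_matrix(A)
    rng = np.random.default_rng(seed)
    x_prev, y = x0.copy(), x0.copy()
    for k in range(n_outer):
        x = inner_cd(A_csc, b, y, kappa, n_inner, rng)
        y = x + (k / (k + 3)) * (x - x_prev)
        x_prev = x
    return x_prev

# Toy usage: b is chosen so that a finite minimizer exists.
rng = np.random.default_rng(0)
m, n = 200, 1000
A = sp.random(m, n, density=0.01, format="csc", random_state=0)
x_true = rng.standard_normal(n)
b = A.T @ softmax(A @ x_true)              # makes x_true a stationary point
x_hat = catalyst_cd(A, b, np.zeros(n))

Recomputing the softmax in the inner loop costs \(\mathcal{O}(m)\) per step; exploiting sparsity fully also requires maintaining the sum of exponentials incrementally, which is the kind of per-iteration saving the abstract refers to.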

D.A. Pasechnyuk’s research was supported by the A.M. Raigorodsky Scholarship in the field of optimization and RFBR grant 19-31-51001 (Scientific mentoring). The work of V.V. Matyukhin was supported by the Ministry of Science and Higher Education of the Russian Federation (state assignment) No. 075-00337-20-03, project number 0714-2020-0005.


Notes

  1. For detailed proofs, see the appendices in the full version of the paper at https://arxiv.org/abs/2103.06688.


Author information

Correspondence to Dmitry Pasechnyuk.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Pasechnyuk, D., Matyukhin, V. (2021). On the Computational Efficiency of Catalyst Accelerated Coordinate Descent. In: Pardalos, P., Khachay, M., Kazakov, A. (eds) Mathematical Optimization Theory and Operations Research. MOTOR 2021. Lecture Notes in Computer Science, vol 12755. Springer, Cham. https://doi.org/10.1007/978-3-030-77876-7_12


  • DOI: https://doi.org/10.1007/978-3-030-77876-7_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-77875-0

  • Online ISBN: 978-3-030-77876-7

  • eBook Packages: Computer Science (R0)
