
On the complexity analysis of randomized block-coordinate descent methods

  • Full Length Paper
  • Series A

Mathematical Programming

Abstract

In this paper we analyze the randomized block-coordinate descent (RBCD) methods proposed in Nesterov (SIAM J Optim 22(2):341–362, 2012) and Richtárik and Takáč (Math Program 144(1–2):1–38, 2014) for minimizing the sum of a smooth convex function and a block-separable convex function, and derive improved bounds on their convergence rates. In particular, we extend the technique Nesterov developed for analyzing the RBCD method for minimizing a smooth convex function over a block-separable closed convex set to this more general problem, and obtain a sharper expected-value convergence rate than the one implied by Richtárik and Takáč. As a result, we also obtain a better high-probability iteration complexity. In addition, for unconstrained smooth convex minimization, we develop a new technique, called randomized estimate sequence, to analyze the accelerated RBCD method proposed by Nesterov, and establish a sharper expected-value convergence rate than the one given in his original analysis.
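To fix ideas, the following is a minimal sketch of the kind of RBCD iteration being analyzed: at each step one block of coordinates is drawn uniformly at random and updated by a proximal gradient step with a block-specific step size 1/L_i. The toy problem (a small group-lasso least-squares instance), the two-block partition, and all names (`rbcd`, `group_prox`, `L_blocks`) are assumptions made for illustration only, not artifacts of the paper.

```python
import numpy as np

# Minimal RBCD sketch for min_x f(x) + Psi(x), with f smooth convex and
# Psi block-separable convex. The problem instance, block partition, and
# all names below are illustrative assumptions, not taken from the paper.

def rbcd(grad_f, prox_blocks, L_blocks, blocks, x0, n_iters=1000, seed=0):
    """Randomized block-coordinate descent: at each iteration, draw a
    block i uniformly at random and apply a proximal gradient step to
    that block with step size 1/L_i, where L_i is a Lipschitz constant
    of the i-th block of the gradient of f."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_iters):
        i = rng.integers(len(blocks))            # uniform block sampling
        idx = blocks[i]
        g = grad_f(x)[idx]                       # partial gradient, block i
        step = 1.0 / L_blocks[i]
        x[idx] = prox_blocks[i](x[idx] - step * g, step)  # prox step on block i
    return x

# Toy composite problem: f(x) = 0.5 * ||A x - b||^2 and
# Psi(x) = lam * sum_i ||x_i||_2 (a group-lasso penalty, block-separable).
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
lam = 0.1
blocks = [np.arange(0, 5), np.arange(5, 10)]     # two coordinate blocks

grad_f = lambda x: A.T @ (A @ x - b)
# Block Lipschitz constants: top eigenvalue of each diagonal block of A^T A.
L_blocks = [np.linalg.eigvalsh(A[:, idx].T @ A[:, idx]).max() for idx in blocks]

def group_prox(v, t):
    """Prox of t * lam * ||.||_2, i.e., block soft-thresholding."""
    nv = np.linalg.norm(v)
    return np.zeros_like(v) if nv <= t * lam else (1.0 - t * lam / nv) * v

x = rbcd(grad_f, [group_prox, group_prox], L_blocks, blocks,
         x0=np.zeros(10), n_iters=5000)
```

The bounds studied in the paper control the expected objective gap E[F(x_k)] − F* of this kind of iteration in terms of the iteration count, the number of blocks, and the block Lipschitz constants L_i.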


References

  1. Beck, A., Tetruashvili, L.: On the convergence of block coordinate descent type methods. SIAM J. Optim. 23(4), 2037–2060 (2013)

  2. Chang, K.-W., Hsieh, C.-J., Lin, C.-J.: Coordinate descent method for large-scale \(l_2\)-loss linear support vector machines. J. Mach. Learn. Res. 9, 1369–1398 (2008)

  3. Hong, M., Wang, X., Razaviyayn, M., Luo, Z.-Q.: Iteration complexity analysis of block coordinate descent methods. arXiv:1310.6957 (2013)

  4. Hsieh, C.-J., Chang, K.-W., Lin, C.-J., Keerthi, S., Sundararajan, S.: A dual coordinate descent method for large-scale linear SVM. In: Proceedings of the 25th International Conference on Machine Learning (ICML), pp. 408–415 (2008)

  5. Leventhal, D., Lewis, A.S.: Randomized methods for linear constraints: convergence rates and conditioning. Math. Oper. Res. 35(3), 641–654 (2010)

  6. Li, Y., Osher, S.: Coordinate descent optimization for \(l_1\) minimization with application to compressed sensing; a greedy algorithm. Inverse Probl. Imaging 3, 487–503 (2009)

  7. Lu, Z., Xiao, L.: Randomized block coordinate non-monotone gradient method for a class of nonlinear programming. arXiv:1306.5918 (2013)

  8. Luo, Z.Q., Tseng, P.: On the convergence of the coordinate descent method for convex differentiable minimization. J. Optim. Theory Appl. 72(1), 7–35 (1992)

  9. Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers, Boston (2004)

  10. Nesterov, Y.: Gradient methods for minimizing composite objective function. Math. Program. 140(1), 125–161 (2013)

  11. Nesterov, Y.: Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM J. Optim. 22(2), 341–362 (2012)

  12. Patrascu, A., Necoara, I.: Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization. To appear in Journal of Global Optimization. arXiv:1305.4027 (2013)

  13. Qin, Z., Scheinberg, K., Goldfarb, D.: Efficient block-coordinate descent algorithms for the group lasso. Math. Program. Comput. 5(2), 143–169 (2013)

  14. Richtárik, P., Takáč, M.: Efficient serial and parallel coordinate descent methods for huge-scale truss topology design. In: Operations Research Proceedings 2011, pp. 27–32. Springer (2012)

  15. Richtárik, P., Takáč, M.: Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function. Math. Program. 144(1–2), 1–38 (2014)

  16. Richtárik, P., Takáč, M.: Parallel coordinate descent methods for big data optimization. arXiv:1212.0873 (2012)

  17. Saha, A., Tewari, A.: On the non-asymptotic convergence of cyclic coordinate descent methods. SIAM J. Optim. 23(1), 576–601 (2013)

  18. Shalev-Shwartz, S., Tewari, A.: Stochastic methods for \(l_1\) regularized loss minimization. In: Proceedings of the 26th International Conference on Machine Learning (ICML), pp. 929–936 (2009)

  19. Shalev-Shwartz, S., Zhang, T.: Proximal stochastic dual coordinate ascent. arXiv:1211.2717 (2012)

  20. Tappenden, R., Richtárik, P., Gondzio, J.: Inexact coordinate descent: complexity and preconditioning. arXiv:1304.5530 (2013)

  21. Tseng, P.: Convergence of a block coordinate descent method for nondifferentiable minimization. J. Optim. Theory Appl. 109, 475–494 (2001)

  22. Tseng, P., Yun, S.: Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization. J. Optim. Theory Appl. 140, 513–535 (2009)

  23. Tseng, P., Yun, S.: A coordinate gradient descent method for nonsmooth separable minimization. Math. Program. 117, 387–423 (2009)

  24. Wen, Z., Goldfarb, D., Scheinberg, K.: Block coordinate descent methods for semidefinite programming. In: Anjos, M.F., Lasserre, J.B. (eds.) Handbook on Semidefinite, Cone and Polynomial Optimization. Software and Applications, vol. 166, pp. 533–564. Springer, Berlin (2012)

  25. Wright, S.J.: Accelerated block-coordinate relaxation for regularized optimization. SIAM J. Optim. 22, 159–186 (2012)

  26. Wu, T., Lange, K.: Coordinate descent algorithms for lasso penalized regression. Ann. Appl. Stat. 2(1), 224–244 (2008)

  27. Yun, S., Toh, K.-C.: A coordinate gradient descent method for \(l_1\)-regularized convex minimization. Comput. Optim. Appl. 48, 273–307 (2011)

Acknowledgments

The authors would like to thank the two anonymous referees for their constructive comments, which substantially improved the presentation of the paper.

Author information

Correspondence to Zhaosong Lu.

Additional information

Zhaosong Lu: This author was supported in part by an NSERC Discovery Grant.

About this article

Cite this article

Lu, Z., Xiao, L. On the complexity analysis of randomized block-coordinate descent methods. Math. Program. 152, 615–642 (2015). https://doi.org/10.1007/s10107-014-0800-2
