
Simple bounds for recovering low-complexity models

  • Short Communication
  • Mathematical Programming, Series A

Abstract

This note presents a unified analysis of the recovery of simple objects from random linear measurements. When the linear functionals are Gaussian, we show that an s-sparse vector in \({\mathbb{R}^n}\) can be efficiently recovered from 2s log n measurements with high probability, and a rank-r, n × n matrix can be efficiently recovered from r(6n − 5r) measurements with high probability. For sparse vectors, this is within an additive factor of the best known nonasymptotic bounds; for low-rank matrices, this matches the best known bounds. We present a parallel analysis for block-sparse vectors, obtaining similarly tight bounds. In the case of sparse and block-sparse signals, we additionally demonstrate that our bounds are only slightly weakened when the measurement map is a random sign matrix. Our results are based on analyzing a particular dual point which certifies the optimality conditions of the respective convex programming problem. Our calculations rely only on standard large deviation inequalities, and our analysis is self-contained.
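For sparse vectors, the convex program in question is the standard ℓ1-minimization problem. The following sketch (not code from the paper; the dimensions n = 50, s = 3, m = 30 and the use of SciPy's HiGHS solver are illustrative choices) recovers a 3-sparse vector from Gaussian measurements by casting ℓ1 minimization as a linear program:

```python
# Sketch: recover an s-sparse vector from m random Gaussian measurements
# by l1 minimization, cast as a linear program via the split x = u - v.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s, m = 50, 3, 30  # ambient dimension, sparsity, number of measurements
# Note m = 30 comfortably exceeds 2 s log n ~ 23.5.

# Ground-truth s-sparse signal.
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)

# Gaussian measurement map and observations b = A x_true.
A = rng.standard_normal((m, n))
b = A @ x_true

# min ||x||_1  s.t.  A x = b, rewritten with x = u - v, u, v >= 0:
#   min 1^T (u + v)  s.t.  [A, -A] [u; v] = b.
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b, method="highs")
x_hat = res.x[:n] - res.x[n:]

recovery_error = np.linalg.norm(x_hat - x_true)
print(recovery_error)  # near zero when exact recovery succeeds
```

With m above the 2s log n threshold, exact recovery holds with high probability over the draw of A; below it, the LP typically returns a denser minimizer that does not match x_true.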



Author information


Correspondence to Benjamin Recht.


Cite this article

Candès, E., Recht, B. Simple bounds for recovering low-complexity models. Math. Program. 141, 577–589 (2013). https://doi.org/10.1007/s10107-012-0540-0
