DOI: 10.1145/1390156.1390168
Research article, ICML Conference Proceedings

Sparse Bayesian nonparametric regression

Published: 05 July 2008

ABSTRACT

One of the most common problems in machine learning and statistics is estimating the mean response from a vector of observations y, assuming y = Xβ + ε, where X is a known matrix, β is a vector of parameters of interest, and ε is a vector of stochastic errors. We are particularly interested here in the case where the dimension K of β is much larger than the dimension of y. We propose flexible Bayesian models which can yield sparse estimates of β. We show that as K → ∞ these models are closely related to a class of Lévy processes. Simulations demonstrate that our models significantly outperform a range of popular alternatives.
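To make the setup concrete, here is a minimal NumPy sketch of the overcomplete linear model y = Xβ + ε with K much larger than the number of observations, fitted with a plain coordinate-descent lasso — one of the classical sparse estimators the paper compares against, not the authors' Bayesian method. All dimensions and values are illustrative assumptions.

```python
import numpy as np

# Simulate y = X @ beta + eps with K >> n and a sparse true beta.
# (Illustrative sizes, not from the paper.)
rng = np.random.default_rng(0)
n, K = 30, 200                                 # n observations, K coefficients
X = rng.standard_normal((n, K))
beta_true = np.zeros(K)
beta_true[:5] = [3.0, -2.0, 1.5, -1.0, 2.5]    # only 5 nonzero coefficients
y = X @ beta_true + 0.1 * rng.standard_normal(n)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: minimize 0.5*||y - X b||^2 + lam*||b||_1."""
    n, K = X.shape
    b = np.zeros(K)
    col_sq = (X ** 2).sum(axis=0)              # per-column squared norms
    r = y - X @ b                              # current residual
    for _ in range(n_iter):
        for j in range(K):
            r += X[:, j] * b[j]                # remove coordinate j from the fit
            rho = X[:, j] @ r
            # soft-thresholding update for coordinate j
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

beta_hat = lasso_cd(X, y, lam=5.0)
print("nonzero coefficients:", int(np.sum(np.abs(beta_hat) > 1e-6)))
```

Even though K = 200 coefficients are fit from only n = 30 observations, the ℓ1 penalty drives most entries of the estimate to exactly zero, which is the kind of sparsity the paper's Bayesian models aim to achieve with adaptive, non-convex penalization.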



Published in:
ICML '08: Proceedings of the 25th International Conference on Machine Learning
July 2008, 1310 pages
ISBN: 9781605582054
DOI: 10.1145/1390156
Copyright © 2008 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Overall acceptance rate: 140 of 548 submissions, 26%
