
Selecting Ridge Parameters in Infinite Dimensional Hypothesis Spaces

  • Conference paper in Artificial Neural Networks — ICANN 2002 (ICANN 2002)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2415)


Abstract

Previously, an unbiased estimator of the generalization error, called the subspace information criterion (SIC), was proposed for a finite dimensional reproducing kernel Hilbert space (RKHS). In this paper, we extend SIC so that it can be applied to any RKHS, including infinite dimensional ones. Computer simulations show that the extended SIC works well in ridge parameter selection.
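
As a rough, hypothetical sketch of the problem the paper addresses (not the SIC estimator itself), the following Python snippet fits kernel ridge regression with a Gaussian kernel and selects the ridge parameter from a log-spaced grid by closed-form leave-one-out error, a deliberate stand-in for the generalization-error estimate that SIC provides. The data, kernel width, and candidate grid are all invented for illustration.

```python
# Illustrative sketch only: not the SIC estimator derived in the paper.
# Kernel ridge regression in a Gaussian RKHS; the ridge parameter is
# chosen from a grid by closed-form leave-one-out (LOO) error, which
# stands in for SIC's estimate of the generalization error. The data,
# kernel width, and grid below are all hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: noisy samples of a smooth target function.
n = 50
x = np.sort(rng.uniform(-3.0, 3.0, n))
y = np.sinc(x) + 0.1 * rng.standard_normal(n)

def gaussian_kernel(a, b, width=1.0):
    # k(a, b) = exp(-(a - b)^2 / (2 * width^2)), the Gaussian RKHS kernel.
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * width**2))

K = gaussian_kernel(x, x)

def loo_error(lam):
    # The kernel ridge fit is y_hat = H y with hat matrix
    # H = K (K + lam I)^{-1}. For such linear smoothers the LOO
    # residuals have the closed form (y - y_hat) / (1 - diag(H)),
    # so no refitting per left-out point is needed.
    H = K @ np.linalg.solve(K + lam * np.eye(n), np.eye(n))
    r = (y - H @ y) / (1.0 - np.diag(H))
    return float(np.mean(r**2))

lambdas = np.logspace(-6, 2, 30)    # candidate ridge parameters
best = min(lambdas, key=loo_error)  # smallest estimated error wins
print(f"selected ridge parameter: {best:.3g}")
```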


References

  1. H. Akaike. A new look at the statistical model identification. IEEE Transactions on Automatic Control, AC-19(6):716–723, 1974.

  2. H. Akaike. Likelihood and the Bayes procedure. In J. M. Bernardo, et al., editors, Bayesian Statistics, pages 141–166. University Press, Valencia, 1980.

  3. D. L. Donoho and I. M. Johnstone. Ideal spatial adaptation via wavelet shrinkage. Biometrika, 81:425–455, 1994.

  4. N. Murata, S. Yoshizawa, and S. Amari. Network information criterion — Determining the number of hidden units for an artificial neural network model. IEEE Transactions on Neural Networks, 5(6):865–872, 1994.

  5. M. Sugiyama and H. Ogawa. Subspace information criterion for model selection. Neural Computation, 13(8):1863–1889, 2001.

  6. A. Tanaka, H. Imai, and M. Miyakoshi. Choosing the parameter of image restoration filters by modified subspace information criterion. IEICE Transactions on Fundamentals, 2002. To appear.

  7. K. Tsuda, M. Sugiyama, and K.-R. Müller. Subspace information criterion for non-quadratic regularizers — Model selection for sparse regressors. IEEE Transactions on Neural Networks, 13(1):70–80, 2002.

  8. V. N. Vapnik. Statistical Learning Theory. John Wiley & Sons, New York, 1998.

  9. G. Wahba. Spline Models for Observational Data. Society for Industrial and Applied Mathematics, Philadelphia, Pennsylvania, 1990.


Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Sugiyama, M., Müller, K.-R. (2002). Selecting Ridge Parameters in Infinite Dimensional Hypothesis Spaces. In: Dorronsoro, J.R. (eds) Artificial Neural Networks — ICANN 2002. ICANN 2002. Lecture Notes in Computer Science, vol 2415. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46084-5_86

  • DOI: https://doi.org/10.1007/3-540-46084-5_86

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44074-1

  • Online ISBN: 978-3-540-46084-8
