Abstract
Feedforward networks are a class of approximation techniques that can learn to perform tasks from a finite set of examples. The question of a network's capability to generalize from a finite training set to unseen data is clearly of crucial importance. In this chapter, we bound the generalization error of a class of Radial Basis Function networks, for certain well-defined function learning tasks, in terms of the number of parameters and the number of examples. We show that the total generalization error is partly due to the insufficient representational capacity of the network (because of the finite size of the network being used) and partly due to insufficient information about the target function (because of the finite number of samples). Prior research has looked at representational capacity or sample complexity in isolation. In the spirit of A. Barron, H. White and S. Geman, we develop a framework to look at both. While the bound we derive is specific to Radial Basis Functions, a number of observations deriving from it apply to any approximation technique. Our result also sheds light on how to choose an appropriate network architecture for a particular problem and on the kinds of problems that can be effectively solved with finite resources, i.e., with a finite number of parameters and a finite amount of data.
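The two error sources described above can be made concrete with a small numerical experiment. The sketch below is purely illustrative and not taken from the chapter: it fits a one-dimensional Gaussian RBF network to noisy samples of an assumed target function (a sine wave) by linear least squares, and reports the generalization error for several network sizes. With too few basis functions the error is dominated by limited representational capacity; with more basis functions (and enough samples) it shrinks. The width, center placement, and target function here are arbitrary choices for the demonstration.

```python
# Illustrative sketch (not the chapter's construction): Gaussian RBF network
# fit by linear least squares, showing how generalization error depends on
# the number of basis functions (parameters) for a fixed sample size.
import numpy as np

def rbf_features(x, centers, width=0.1):
    """Gaussian radial basis features for 1-D inputs x."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def fit_rbf(x_train, y_train, n_centers):
    """Place centers uniformly on [0, 1]; fit output weights by least squares."""
    centers = np.linspace(0.0, 1.0, n_centers)
    Phi = rbf_features(x_train, centers)
    w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)
    return centers, w

def predict(x, centers, w):
    return rbf_features(x, centers) @ w

rng = np.random.default_rng(0)
target = lambda x: np.sin(2 * np.pi * x)   # assumed target function

N = 200                                    # finite number of examples
x_tr = rng.uniform(0, 1, N)
y_tr = target(x_tr) + 0.1 * rng.normal(size=N)  # noisy samples
x_te = np.linspace(0, 1, 500)              # dense grid to estimate true error

for n in (2, 5, 20):                       # finite number of parameters
    c, w = fit_rbf(x_tr, y_tr, n)
    err = np.mean((predict(x_te, c, w) - target(x_te)) ** 2)
    print(f"n = {n:2d} centers: generalization MSE = {err:.4f}")
```

Repeating the experiment while shrinking N instead of n would exhibit the second error source, insufficient information about the target function from too few samples.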
Copyright information
© 1998 Springer Science+Business Media New York
Cite this chapter
Niyogi, P. (1998). On the Relationship between Hypothesis Complexity, Sample Complexity and Generalization Error for Neural Networks. In: The Informational Complexity of Learning. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-5459-2_2
DOI: https://doi.org/10.1007/978-1-4615-5459-2_2
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4613-7493-0
Online ISBN: 978-1-4615-5459-2