On the Relationship between Generalization Error and Sample Complexity for Radial Basis Functions and Incremental Training Algorithm

(EE 329 Final Presentation, a Team Project)

Key Point: The problem of learning a mapping between an input and an output space is essentially equivalent to the problem of synthesizing an associative memory that retrieves the appropriate output when presented with an input and generalizes when presented with new inputs. The mapping usually takes the form of some unknown function between the two spaces, and the evidence is typically a set of noisy examples, i.e., (x, y) pairs consistent with this function. On the basis of this data set, the learner tries to infer the true function and make predictions at previously unseen inputs.
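As a concrete illustration of this setting, the minimal sketch below draws noisy (x, y) examples from a hypothetical target function. The particular target (sin(2*pi*x)), the Gaussian noise level, and the uniform sampling distribution are illustrative assumptions, not part of the original formulation.

import numpy as np

# Hypothetical target function f: X -> Y (unknown to the learner in practice).
def target(x):
    return np.sin(2.0 * np.pi * x)

def draw_samples(n, noise_std=0.1, seed=0):
    # Sample inputs uniformly from X and corrupt the target outputs
    # with Gaussian noise to obtain noisy (x, y) examples.
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, size=n)
    y = target(x) + rng.normal(0.0, noise_std, size=n)
    return x, y

x_train, y_train = draw_samples(50)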

The unknown function is assumed to belong to some class F, often called the concept class in the terminology of computational learning theory. The learner is provided with a finite data set. One can make many different assumptions about how this data set is collected, but the general setting we discuss is that the data are drawn by sampling the input-output space (X, Y) according to some unknown probability distribution. On the basis of this data set, the learner then develops a hypothesis about the identity of the target function: it selects the function from some class H (the hypothesis class) that best fits the data and postulates this to be the target. The hypothesis class can take many forms; for example, it could be a class of Boolean functions, spline functions, radial basis functions, and so on. One such class increasingly used for learning and prediction is the class of feedforward neural networks. The purpose of the project paper is twofold. First, we formalize the problem of learning from examples so as to make precise the relationship between hypothesis complexity, sample complexity, and total error. Second, we explore this relationship for a specific hypothesis class, the class of radial basis function networks, which can be regarded as a member of the broad class of feedforward networks...
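To make the hypothesis class concrete, the sketch below fits a small radial basis function network, i.e. a weighted sum of Gaussian bumps, to noisy samples by least squares. The grid placement of the centers, the fixed width, and the example data are simplifying assumptions for illustration only, not the training procedure developed in the paper.

import numpy as np

def rbf_design_matrix(x, centers, width):
    # Gaussian RBF features: one column per center.
    d = x[:, None] - centers[None, :]
    return np.exp(-(d ** 2) / (2.0 * width ** 2))

def fit_rbf_network(x, y, n_centers=10, width=0.1):
    # Fix the centers on a uniform grid (a simple choice; centers could
    # also be placed on data points or learned), then solve for the
    # output weights by linear least squares.
    centers = np.linspace(x.min(), x.max(), n_centers)
    Phi = rbf_design_matrix(x, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, width, w

def predict(x, centers, width, w):
    return rbf_design_matrix(x, centers, width) @ w

# Example: noisy samples of an assumed target sin(2*pi*x).
rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, size=50)
y_train = np.sin(2.0 * np.pi * x_train) + rng.normal(0.0, 0.1, size=50)

centers, width, w = fit_rbf_network(x_train, y_train, n_centers=10)
y_hat = predict(x_train, centers, width, w)

In this sketch the number of centers plays the role of hypothesis complexity: enlarging it reduces approximation error but, for a fixed sample size, tends to increase estimation error, which is the trade-off between hypothesis complexity, sample complexity, and total error that the paper studies.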

Click here to download the full document for the EE 329 project

Click here to view the presentation slides and learn more

Any comments?