Abstract
Radial Basis Function (RBF) networks are a versatile artificial neural network paradigm that lends itself equally well to problems of classification and regression. The networks can be trained by a number of textbook techniques. The objective of the current paper is to explore how uncertainty propagates through such networks. In this, the first of two papers, the regression problem is addressed. The RBF networks are trained with crisp data but interval output weights, so that the regression model predicts an interval rather than a crisp value. This technique, as developed for the more common Multi-Layer Perceptron (MLP) network, allows the user to investigate Ben-Haim’s concept of opportunity.
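To make the idea concrete, the sketch below shows one plausible reading of the approach: a standard RBF regression model whose crisp output weights are widened into intervals, so every prediction becomes a lower and upper bound. This is a minimal illustration, not the paper's actual procedure; the choice of centres, the residual-based widening rule, and names such as `predict_interval` are assumptions made for the example.

```python
import numpy as np

def rbf_design_matrix(x, centres, width):
    """Gaussian basis activations: one column per centre (all non-negative)."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

# --- Illustrative 1-D training data (hypothetical, not from the paper) ---
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 50)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(50)

# Fixed centres and width chosen by hand for the sketch
centres = np.linspace(0.0, 1.0, 10)
width = 0.1
Phi = rbf_design_matrix(x_train, centres, width)

# Crisp output weights from ordinary least squares (a textbook training step)
w_crisp, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Assumed rule for obtaining interval weights: widen each crisp weight by a
# margin derived from the training residuals.  The paper's own scheme may differ.
residual_spread = np.std(y_train - Phi @ w_crisp)
w_lo = w_crisp - residual_spread
w_hi = w_crisp + residual_spread

def predict_interval(x_new):
    """Interval prediction: because Gaussian activations are non-negative,
    the lower/upper output bounds follow directly from the lower/upper weights."""
    phi = rbf_design_matrix(np.atleast_1d(x_new), centres, width)
    return phi @ w_lo, phi @ w_hi

lo, hi = predict_interval(np.array([0.25, 0.5, 0.75]))
print(np.c_[lo, hi])  # each row is a predicted [lower, upper] interval
```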
| Original language | English |
|---|---|
| Pages | 923-928 |
| Number of pages | 5 |
| Publication status | Published - 2005 |
| Event | Eurodyn 2005: 6th International Conference on Structural Dynamics, Paris, France. Duration: 4 Sept 2005 → 7 Sept 2005 |
Conference
| Conference | Eurodyn 2005: 6th International Conference on Structural Dynamics |
|---|---|
| Country/Territory | France |
| City | Paris |
| Period | 4/09/05 → 7/09/05 |
Keywords
- multi-layer perceptron network
- Radial Basis Function
- Ben-Haim’s concept of opportunity
- artificial neural network