Structure minimization using impact factor in neural network

Despite many advances, determining the proper size of a neural network remains an important problem, especially for its practical implications for learning and generalization. Unfortunately, the best size is not usually obvious: a network that is too small cannot learn the data, while one that is just big enough may learn very slowly and be very sensitive to initial conditions and learning parameters. There are two types of approach to determining the network size: pruning and growing. Pruning consists of training a network that is larger than necessary and then removing unnecessary weights and nodes. Here, a new pruning method is developed, based on the penalty-term method. The method yields networks that generalize well and reduces the retraining time needed after weights or nodes are pruned.
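The general penalty-term pruning scheme the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's specific impact-factor formulation: a weight-decay penalty added to the training gradient drives unimportant weights toward zero, after which weights whose magnitude falls below a threshold are removed. The network size, penalty coefficient, and threshold here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 network trained on XOR; sizes and hyperparameters are
# illustrative assumptions, not values from the paper.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))
W2 = rng.normal(size=(4, 1))
lr, penalty = 0.5, 1e-3  # penalty = weight-decay coefficient (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # The penalty term adds `penalty * W` to each gradient, so weights
    # that do not contribute to reducing the error shrink toward zero.
    W2 -= lr * (h.T @ d_out + penalty * W2)
    W1 -= lr * (X.T @ d_h + penalty * W1)

# Pruning step: zero out weights whose magnitude stayed below a threshold.
threshold = 0.1
mask1, mask2 = np.abs(W1) >= threshold, np.abs(W2) >= threshold
W1, W2 = W1 * mask1, W2 * mask2

pruned = int((~mask1).sum() + (~mask2).sum())
print("weights pruned:", pruned)
```

Because the penalty has already pushed the removable weights near zero during training, the pruned network typically needs little retraining, which is the practical benefit the abstract highlights.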
Publisher
Springer Verlag
Issue Date
2002
Language
ENG
Citation

ARTIFICIAL LIFE AND ROBOTICS, v.6, no.3, pp.149 - 154

ISSN
1433-5298
URI
http://hdl.handle.net/10203/8320
Appears in Collection
EE-Journal Papers (Journal Papers)