Neural networks based on hidden-layer error for improved performance and hardware implementation = 은닉층 오차에 의한 성능향상 및 하드웨어 구현을 위한 신경회로망

In this thesis we develop two neural network models that not only provide robust hetero-association and classification performance but are also suitable for hardware implementation.

The multi-layer bidirectional associative memory (MBAM) extends Kosko's bidirectional associative memory (BAM) to a multi-layer architecture whose synapses are defined as a sum of vector outer products. By storing binary vectors, one can emulate multi-valued synapses with binary storage only, which eases hardware implementation. To train the MBAM, the binary vectors are adjusted by a genetic algorithm or by simulated annealing. A new error term is added to push the hidden-layer activations into their saturation regions; this reduces the effect of substituting a hard-limiting nonlinearity for the sigmoid nonlinearity in practical hardware implementations. Computer simulation and a VLSI hardware implementation demonstrate much better performance than the single-layer BAM, and the bidirectional recall process provides much better error-correction capability than the multi-layer perceptron.

The additional hidden-layer error term is also introduced to train the multi-layer perceptron for good generalization and robust classification. Good generalization is achievable only with a proper combination of the training-data size, the complexity of the underlying problem, and the network complexity. Unlike other methods, which improve generalization by reducing network complexity through restrictions on the synaptic weights, this algorithm increases the complexity of the underlying problem by imposing an appropriate additional requirement on the hidden-layer neurons: low output sensitivity to the input values or, equivalently, saturation of the hidden-layer activations. Learning is based on gradient-descent error minimization, and the additional gradient-descent term turns out to be Hebbian. Hence the new algorithm incorporates both the error back-propagation and Hebbian learning rules. The developed hybrid learning ...
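The outer-product storage rule and bidirectional recall described in the abstract can be illustrated with a short sketch. The Python/NumPy code below is a minimal single-layer BAM in the spirit of Kosko's original model, not the thesis's multi-layer MBAM, and it omits the genetic-algorithm or simulated-annealing training step; the bipolar (+1/-1) encoding, the function names, and the iteration cap are illustrative assumptions.

    # Minimal sketch of outer-product storage and bidirectional recall for a
    # single-layer BAM, assuming bipolar (+1/-1) pattern pairs.
    import numpy as np

    def store(pairs):
        # The synaptic matrix is the sum of one outer product per stored pair.
        return sum(np.outer(x, y) for x, y in pairs)

    def recall(W, x, max_steps=20):
        # Bidirectional recall: bounce activations x -> y -> x through the
        # hard-limiting nonlinearity until a stable pair is reached.
        sign = lambda v: np.where(v >= 0, 1, -1)
        y = sign(W.T @ x)
        for _ in range(max_steps):
            x_new = sign(W @ y)
            y_new = sign(W.T @ x_new)
            if np.array_equal(x_new, x) and np.array_equal(y_new, y):
                break
            x, y = x_new, y_new
        return x, y

    # Example: store two pairs, then recall from a one-bit-corrupted input.
    x1, y1 = np.array([ 1, -1,  1, -1]), np.array([ 1,  1, -1])
    x2, y2 = np.array([-1, -1,  1,  1]), np.array([-1,  1,  1])
    W = store([(x1, y1), (x2, y2)])
    print(recall(W, np.array([1, 1, 1, -1])))   # error-corrects toward (x1, y1)

The one-flipped-bit example shows the error-correction property the abstract attributes to bidirectional recall: the corrupted input still settles onto the stored pair (x1, y1).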
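The hybrid learning rule for the multi-layer perceptron can be sketched similarly. Assuming tanh hidden units and a saturation penalty of the form E_h = lam * sum_j (1 - h_j^2) (the exact penalty used in the thesis may differ), the gradient of the extra term is proportional to the product of pre- and post-synaptic activity, i.e. it is Hebbian, as the abstract states. All names and learning-rate values below are illustrative.

    # Hedged sketch of one hybrid update: back-propagation of the squared error
    # plus a hidden-layer saturation penalty E_h = lam * sum(1 - h**2).
    # Only the backprop-plus-Hebbian structure follows the abstract; the
    # penalty form and hyperparameters are assumptions for illustration.
    import numpy as np

    def hybrid_step(W1, W2, x, t, eta=0.1, lam=0.05):
        h = np.tanh(W1 @ x)                    # hidden activations
        o = np.tanh(W2 @ h)                    # output activations
        # Standard error back-propagation for the squared-error term.
        delta_o = (o - t) * (1 - o**2)
        delta_h = (W2.T @ delta_o) * (1 - h**2)
        W2 -= eta * np.outer(delta_o, h)
        W1 -= eta * np.outer(delta_h, x)
        # Extra term: d/dW1 of lam*(1 - h_j**2) is -2*lam*h_j*(1 - h_j**2)*x_i,
        # so gradient descent ADDS 2*lam*h_j*(1 - h_j**2)*x_i, a Hebbian
        # product of post-synaptic (h) and pre-synaptic (x) activity that
        # drives each |h_j| toward saturation.
        W1 += eta * lam * 2 * np.outer(h * (1 - h**2), x)
        return W1, W2

Note that the extra update touches only the input-to-hidden weights W1: it pushes hidden activations toward their saturation regions (low output sensitivity) without directly altering the output layer, which is still trained by ordinary back-propagation.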
Advisors
Lee, Soo-Young (이수영)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
1994
Identifier
69655/325007 / 000845298
Language
eng
Description
Thesis (Ph.D.) - KAIST : Department of Electrical and Electronic Engineering, 1994.8, [ iv, 97 p. ]
URI
http://hdl.handle.net/10203/36227
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=69655&flag=dissertation
Appears in Collection
EE-Theses_Ph.D.(박사논문)