Elastic learning rate on error backpropagation of online update

The error-backpropagation (EBP) algorithm for training multilayer perceptrons (MLPs) is known for its robustness and computational efficiency. However, the algorithm makes it difficult to select an optimal constant learning rate, which leads to non-optimal learning speed and inflexible behavior on working data. This paper introduces into the original EBP algorithm an elastic learning rate that guarantees convergence of learning, realized locally through online updates of the MLP parameters, in order to remedy this non-optimality. Results of experiments on a speaker verification system with a Korean speech database are presented and discussed, demonstrating that the proposed method improves on the original EBP algorithm in learning speed and in flexibility for working data.
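The abstract does not give the paper's exact elastic-learning-rate rule, so the following is only a minimal sketch of the general idea: online (sample-by-sample) EBP on a tiny MLP, with a step size that expands while the error keeps falling and contracts when it rises. The network size, the GROW/SHRINK factors, the cap on the rate, and the per-epoch adjustment schedule are all illustrative assumptions, not the method of the paper.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Clamped to avoid math.exp overflow for extreme inputs.
    if x < -60: return 0.0
    if x > 60: return 1.0
    return 1.0 / (1.0 + math.exp(-x))

# Tiny 2-2-1 MLP trained online on XOR (illustrative task, not from the paper).
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

eta = 0.5                 # current "elastic" learning rate (assumed initial value)
GROW, SHRINK = 1.05, 0.5  # assumed expansion/contraction factors
prev_loss = float("inf")

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j]) for j in range(2)]
    y = sigmoid(sum(w * hj for w, hj in zip(W2, h)) + b2)
    return h, y

def epoch_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

losses = [epoch_loss()]
for epoch in range(2000):
    for x, t in data:
        h, y = forward(x)
        # Standard EBP deltas for squared error with sigmoid units.
        d_out = (y - t) * y * (1 - y)
        d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Online weight update with the current elastic rate.
        for j in range(2):
            W2[j] -= eta * d_out * h[j]
            for i in range(2):
                W1[j][i] -= eta * d_hid[j] * x[i]
            b1[j] -= eta * d_hid[j]
        b2 -= eta * d_out
    loss = epoch_loss()
    # Elastic adjustment: expand while improving (capped), contract on a setback.
    eta = min(eta * GROW, 2.0) if loss < prev_loss else eta * SHRINK
    prev_loss = loss
    losses.append(loss)

print(f"initial loss {losses[0]:.4f} -> final loss {losses[-1]:.4f}")
```

The cap on `eta` and the per-epoch (rather than per-sample) adjustment are stability choices for this sketch; the paper's scheme adapts the rate so as to guarantee convergence, which this heuristic does not.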
Publisher
SPRINGER-VERLAG BERLIN
Issue Date
2004
Language
English
Article Type
Article; Proceedings Paper
Citation

PRICAI 2004: Trends in Artificial Intelligence, Proceedings. Lecture Notes in Artificial Intelligence, vol. 3157, pp. 272-281

ISSN
0302-9743
URI
http://hdl.handle.net/10203/85229
Appears in Collection
CS-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
