Training two-layered feedforward networks with variable projection method

The variable projection (VP) method for separable nonlinear least squares (SNLLS) problems is presented and incorporated into the Levenberg-Marquardt optimization algorithm for training two-layered feedforward neural networks. It is shown that the Jacobian of variable-projected networks can be computed by a simple modification of the back-propagation algorithm. The suggested algorithm is efficient compared with conventional techniques such as the Levenberg-Marquardt algorithm (LMA), the hybrid gradient algorithm (HGA), and the extreme learning machine (ELM).
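
The abstract describes the core idea: in a two-layer network the output-layer weights enter linearly, so for any fixed hidden-layer weights they can be eliminated by a linear least-squares solve, and the nonlinear optimization then runs only over the hidden-layer parameters. Below is a minimal sketch of that variable-projection idea in Python/NumPy. It is not the article's exact algorithm: the paper derives the VP Jacobian via a modified back-propagation pass, whereas this sketch lets SciPy's Levenberg-Marquardt solver approximate the Jacobian numerically, and the network size and toy data are assumptions made purely for illustration.

# Minimal variable-projection sketch for a two-layer network y ~ C @ tanh(W x + b).
# Not the paper's algorithm: the VP Jacobian is approximated by finite differences
# here rather than computed via modified back-propagation.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration): N samples, 1 input, 1 output.
N, n_in, n_hid, n_out = 200, 1, 5, 1
X = rng.uniform(-1.0, 1.0, (N, n_in))
Y = np.sin(3.0 * X) + 0.05 * rng.standard_normal((N, n_out))

def hidden(Wb, X):
    # Hidden-layer activations from the flattened hidden weights and biases.
    W = Wb[:n_hid * n_in].reshape(n_hid, n_in)
    b = Wb[n_hid * n_in:]
    return np.tanh(X @ W.T + b)                      # shape (N, n_hid)

def vp_residual(Wb):
    # Variable projection: for fixed hidden parameters the optimal output
    # weights solve a linear least-squares problem, so they are eliminated
    # and the residual depends only on the nonlinear (hidden) parameters.
    H = hidden(Wb, X)
    C, *_ = np.linalg.lstsq(H, Y, rcond=None)        # linear solve for output layer
    return (Y - H @ C).ravel()

# Levenberg-Marquardt on the reduced (projected) problem over hidden parameters only.
Wb0 = rng.standard_normal(n_hid * n_in + n_hid)
sol = least_squares(vp_residual, Wb0, method="lm")
print("final sum of squared errors:", np.sum(sol.fun ** 2))
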
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Issue Date
2008-02
Language
English
Article Type
Article
Keywords

NONLINEAR LEAST-SQUARES; MARQUARDT ALGORITHM

Citation

IEEE TRANSACTIONS ON NEURAL NETWORKS, v.19, pp. 371-375

ISSN
1045-9227
DOI
10.1109/TNN.2007.911739
URI
http://hdl.handle.net/10203/90748
Appears in Collection
EE-Journal Papers (Journal Papers)