Approximate Bayesian MLP regularization for regression in the presence of noise

Cited 24 times in Web of Science; cited 0 times in Scopus
We present a novel regularization method for a multilayer perceptron (MLP) that learns a regression function in the presence of noise, regardless of how smooth that function is. Unlike general MLP regularization methods, which assume the regression function is smooth, the proposed method remains valid when the regression function has discontinuities (non-smoothness). Since the true regression function is unknown, we analyze the training set with a Bayesian approach: a Bayesian probability distribution identifies the non-smooth data, i.e., the data associated with discontinuities in the regression function. These identified data are then used in a proposed objective function that fits the MLP response to the desired regression function regardless of its smoothness and the noise. Experimental simulations show that an MLP trained with the presented method yields more accurate fits to non-smooth functions than other MLP training methods. Further, we show that the proposed training methodology can be incorporated into deep learning models. (C) 2016 Elsevier Ltd. All rights reserved.
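The abstract describes the method only at a high level; as an illustration, the sketch below reconstructs the general training recipe under stated assumptions. It is not the paper's algorithm: the simple first-difference rule standing in for the Bayesian identification of non-smooth data, the per-sample loss weights, and all parameter values are hypothetical stand-ins.

    import numpy as np
    import torch
    import torch.nn as nn

    rng = np.random.default_rng(0)

    # Synthetic 1-D regression problem: a smooth curve with one jump, plus noise.
    x = np.sort(rng.uniform(-1.0, 1.0, 400)).astype(np.float32)
    f = np.where(x < 0.0, np.sin(3.0 * x), np.sin(3.0 * x) + 2.0).astype(np.float32)
    y = f + rng.normal(0.0, 0.1, x.shape).astype(np.float32)

    # Hypothetical stand-in for the paper's Bayesian identification step:
    # flag samples whose first difference is large relative to the noise scale.
    dy = np.abs(np.diff(y, prepend=y[0]))
    nonsmooth = dy > 5.0 * np.median(dy)

    # Weighted data-fit objective: up-weight the flagged samples so the
    # discontinuity is not averaged away; ordinary weight decay still
    # regularizes the network weights.
    X = torch.from_numpy(x).unsqueeze(1)
    Y = torch.from_numpy(y).unsqueeze(1)
    w = (1.0 + 4.0 * torch.from_numpy(nonsmooth.astype(np.float32))).unsqueeze(1)

    model = nn.Sequential(nn.Linear(1, 50), nn.Tanh(), nn.Linear(50, 1))
    opt = torch.optim.Adam(model.parameters(), lr=1e-2, weight_decay=1e-4)

    for _ in range(2000):
        opt.zero_grad()
        loss = (w * (model(X) - Y) ** 2).mean()  # per-sample weighted MSE
        loss.backward()
        opt.step()

The design point this sketch is meant to convey is the decoupling the abstract hints at: the smoothness-oriented regularizer (here, plain weight decay) applies globally, while the data term treats points identified as non-smooth differently, so the fitted response can keep the jump instead of smoothing over it.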
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
Issue Date
2016-11
Language
English
Article Type
Article
Keywords

NEURAL-NETWORK; WEIGHT DECAY; ALGORITHM; MACHINE; FRAMEWORK

Citation

NEURAL NETWORKS, v.83, pp. 75-85

ISSN
0893-6080
DOI
10.1016/j.neunet.2016.07.010
URI
http://hdl.handle.net/10203/214403
Appears in Collection
CS-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.