Structural optimization of radial basis function networks for function approximation
(함수근사를 위한 Radial basis function network의 구조적 최적화)

DC Field: Value
dc.contributor.advisor: Kil, Rhee-Man
dc.contributor.advisor: 길이만
dc.contributor.author: Ku, Im-Hoi
dc.contributor.author: 구임회
dc.date.accessioned: 2011-12-14T04:53:48Z
dc.date.available: 2011-12-14T04:53:48Z
dc.date.issued: 2001
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=166240&flag=dissertation
dc.identifier.uri: http://hdl.handle.net/10203/42022
dc.description: Thesis (Master) - KAIST (한국과학기술원) : Major in Applied Mathematics, 2001.2, [30 p.]
dc.description.abstract: This thesis presents a new method of regression based on radial basis function networks (RBFNs). The RBFNs are trained from a set of training samples to solve the problem of function approximation. In this training, minimizing the true error over the whole distribution of the sample space, not just over the set of training samples, is a critical problem; this is referred to as the generalization problem. To cope with this problem, a validation set is usually extracted from the training samples, and the remaining samples are used to train the regression model. The validation set is then used to check whether the regression model is overfitting the training samples. In this thesis, a new approach to regression without a validation set is considered. Instead of using a validation set, an error confidence interval is estimated for the regression model to monitor its training. In particular, a form of error confidence interval for RBFN regression is derived from a statistical viewpoint, and the coefficients of the error confidence interval are estimated from the given training samples. We have shown that the gradients of the estimated errors, obtained by adding the error confidence intervals to the training errors, are consistent over various validation sets. These gradients can therefore serve as a stopping criterion for training regression models. We further refine the method of optimizing the regression model based on the error confidence interval.
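The model-selection idea sketched in the abstract can be illustrated in code. The snippet below is a minimal sketch, not the thesis's actual derivation: it fits Gaussian RBFNs of increasing size by least squares and selects the network minimizing training error plus a simple complexity penalty. The penalty term is a hypothetical stand-in for the derived error confidence interval, whose exact form is not given in the abstract; the function names, the evenly spaced center placement, the fixed basis width, and the assumed-known noise variance are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def design_matrix(X, centers, width):
    """Gaussian basis: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbfn(X, y, n_centers, width):
    """Least-squares fit of the linear output weights of an RBFN."""
    centers = np.linspace(X.min(), X.max(), n_centers)[:, None]
    Phi = design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    train_mse = float(np.mean((Phi @ w - y) ** 2))
    return centers, w, train_mse

# Toy function-approximation task: noisy samples of sin(x) on [-3, 3].
X = rng.uniform(-3.0, 3.0, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)

n, noise_var = len(y), 0.01  # noise variance assumed known for this sketch
best = None
for m in range(2, 16):
    _, _, train_mse = fit_rbfn(X, y, m, width=0.8)
    # Stand-in confidence term: a penalty growing with the number of basis
    # functions m, playing the role of the thesis's error confidence interval.
    estimated_error = train_mse + 2.0 * noise_var * m / n
    if best is None or estimated_error < best[1]:
        best = (m, estimated_error)

print("chosen number of centers:", best[0])
```

The point of the sketch is structural: the estimated error (training error plus confidence term) replaces a held-out validation error, so no training samples need to be set aside to choose the network size.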
dc.language: eng
dc.publisher: 한국과학기술원 (KAIST)
dc.subject: function approximation
dc.subject: Radial Basis Function Network
dc.subject: neural networks (신경회로망)
dc.subject: function approximation (함수근사)
dc.title: Structural optimization of radial basis function networks for function approximation
dc.title.alternative: 함수근사를 위한 Radial basis function network의 구조적 최적화
dc.type: Thesis (Master)
dc.identifier.CNRN: 166240/325007
dc.description.department: KAIST : Major in Applied Mathematics
dc.identifier.uid: 000993028
dc.contributor.localauthor: Kil, Rhee-Man
dc.contributor.localauthor: 길이만
Appears in Collection: MA-Theses_Master (석사논문, Master's theses)
Files in This Item
There are no files associated with this item.
