Optimization of nonlinear regression models based on the complexity measure of estimation functions

Over the last decade, machine learning has played an important role in a wide range of real-life applications, because many problems cannot be solved with classical formal mathematical algorithms. However, machine learning suffers from problems such as over-fitting and under-fitting. To address them, we seek the optimal model that makes a trade-off between over-fitting and under-fitting. This thesis investigates new methods that predict the expected error and that apply regularization with respect to a smoothness measure. First, we construct a performance prediction model on a validation set to determine the optimal structure for the whole training data. We analyze risk bounds using VC-dimension theory and suggest the form of the risk estimates used in the performance prediction model. The resulting cross-validation method, which uses this performance prediction model, is referred to as the parameterized CV (p-CV) method. Second, we suggest prediction risk bounds for nonlinear regression models based on the modulus of continuity of both the target function and the estimation function. We also present a model selection criterion, referred to as the modulus of continuity information criterion (MCIC), derived from the suggested prediction risk bounds. Through simulations of function approximation, we show that the suggested MCIC is effective in nonlinear model selection problems with limited data. Finally, we present a new method of regularization for regression problems that uses a Besov norm (or semi-norm) as the regularization operator. This norm is a more general smoothness measure, defined on more general approximation spaces, than the Sobolev norm or the RKHS norm. We also suggest a new candidate for the regularization parameter, that is, the trade-off between the data fit and the smoothness of the estimation function. Through simulations of function approximation, we have shown that the sug...
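The trade-off described in the abstract (data fit versus smoothness of the estimation function, controlled by a regularization parameter) can be illustrated with a minimal sketch. This is not the thesis's Besov-norm method: it uses a simple polynomial basis with a curvature (second-derivative) penalty as a stand-in smoothness measure, and all names (`fit`, `lam`, the synthetic data) are illustrative assumptions.

```python
import numpy as np

# Hedged sketch, not the thesis's algorithm: ridge-style regularized
# regression where a roughness penalty on the estimate stands in for a
# smoothness measure such as a Sobolev or Besov norm. The parameter
# lam controls the trade-off between data fit and smoothness.

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(50)

degree = 12
X = np.vander(x, degree + 1, increasing=True)  # polynomial basis

# D maps polynomial coefficients to second-derivative coefficients,
# so ||D w||^2 measures the curvature (roughness) of the estimate.
D = np.zeros((degree - 1, degree + 1))
for k in range(2, degree + 1):
    D[k - 2, k] = k * (k - 1)

def fit(lam):
    # Minimize ||X w - y||^2 + lam * ||D w||^2 (closed-form solution).
    A = X.T @ X + lam * (D.T @ D)
    return np.linalg.solve(A, X.T @ y)

for lam in (0.0, 1e-4, 1e2):
    w = fit(lam)
    residual = np.sum((X @ w - y) ** 2)   # data-fit term
    roughness = np.sum((D @ w) ** 2)      # smoothness term
    print(f"lam={lam:g}: fit error={residual:.4f}, roughness={roughness:.4f}")
```

Increasing `lam` pushes the estimate toward smoother functions at the cost of a larger data-fit error; choosing `lam` well is exactly the model selection problem the thesis studies.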
Advisors
Kil, Rhee-Man (researcher)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2007
Identifier
263485/325007  / 020015018
Language
eng
Description

Doctoral thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST): Applied Mathematics, 2007.2, [ vii, 96 p. ]

Keywords

smoothness measure; regularization; optimization; nonlinear regression model; generalization error; model selection; Besov space; modulus of continuity; complexity measure

URI
http://hdl.handle.net/10203/41892
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=263485&flag=t
Appears in Collection
MA-Theses_Ph.D. (doctoral theses)
Files in This Item
There are no files associated with this item.
