DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Yum, Bong-Jin | - |
dc.contributor.advisor | 염봉진 | - |
dc.contributor.author | Hwang, Sang-Heum | - |
dc.contributor.author | 황상흠 | - |
dc.date.accessioned | 2013-09-13 | - |
dc.date.available | 2013-09-13 | - |
dc.date.issued | 2012 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=511407&flag=dissertation | - |
dc.identifier.uri | http://hdl.handle.net/10203/182537 | - |
dc.description | Thesis (Ph.D.) - KAIST : Department of Industrial and Systems Engineering, 2012.8, [ viii, 136 p. ] | - |
dc.description.abstract | Regression and classification are machine learning tasks frequently employed to estimate the underlying functional relationship between the input and output variables of a data set. In many practical applications this relationship may be highly nonlinear, and kernel-based methods have emerged and assumed a central role in machine learning for exploring such nonlinearities effectively. Most existing kernel-based learning algorithms are developed under the assumption that the observations in a data set are drawn independently from the same distribution; consequently, their results are highly sensitive to outliers. To provide reliable results even when a data set contains outlying observations, this thesis develops robust learning algorithms for regression and classification in the kernel framework. For regression, a new robust kernel-based regression algorithm is developed in the first study. The proposed method uses a weighting scheme based on the hat matrix, similar to the generalized M-estimator in conventional robust linear regression: the diagonal elements of the hat matrix in the kernel-induced feature space serve as leverage measures to reduce the influence of outliers. Computational results on simulated examples and real data sets show that the proposed method is more robust than existing approaches. In the second study, another kernel-based regression method insensitive to outliers is developed, based on the relevance vector machine (RVM) combined with a weighting strategy. The proposed method has the advantages of providing statistical intervals and requiring no validation data set. A semiconductor plasma etching process serves as a case study for comparing the predictive performance of the proposed method with that of other regression methods. Experimental results demonstrate that the proposed method can be used for the purpose of predicting the qual... | eng |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Supervised learning | - |
dc.subject | Kernel method | - |
dc.subject | Outlier | - |
dc.subject | Relevance vector machine | - |
dc.subject | 지도학습 (supervised learning) | - |
dc.subject | 커널방법 (kernel method) | - |
dc.subject | 이상치 (outlier) | - |
dc.subject | 강건학습 (robust learning) | - |
dc.subject | 렐레번스 벡터 머신 (relevance vector machine) | - |
dc.subject | Variational inference | - |
dc.title | Development of robust kernel-based learning algorithms for regression and classification problems | - |
dc.title.alternative | 회귀분석과 분류 문제를 위한 강건한 커널 학습방법의 개발 | - |
dc.type | Thesis (Ph.D.) | - |
dc.identifier.CNRN | 511407/325007 | - |
dc.description.department | KAIST : Department of Industrial and Systems Engineering | - |
dc.identifier.uid | 020087101 | - |
dc.contributor.localauthor | Yum, Bong-Jin | - |
dc.contributor.localauthor | 염봉진 | - |
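The abstract's first contribution, leverage weighting via the hat-matrix diagonal in the kernel-induced feature space, can be sketched in a minimal form. The snippet below is an illustrative reconstruction, not the thesis's actual algorithm: it assumes kernel ridge regression with an RBF kernel, and the Mallows-type weight rule `min(1, c * mean(h) / h)` as well as the function names are illustrative choices.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_hat_diag(K, lam):
    # Diagonal of the hat matrix H = K (K + lam*I)^{-1} for kernel
    # ridge regression; h_ii measures the leverage of observation i
    # in the kernel-induced feature space.
    n = K.shape[0]
    H = K @ np.linalg.solve(K + lam * np.eye(n), np.eye(n))
    return np.diag(H)

def robust_krr(X, y, gamma=1.0, lam=0.1, c=2.0):
    # Weighted kernel ridge regression with leverage-based weights,
    # in the spirit of a generalized M-estimator: high-leverage
    # points are downweighted.
    K = rbf_kernel(X, X, gamma)
    h = kernel_hat_diag(K, lam)
    w = np.minimum(1.0, c * h.mean() / h)  # downweight h_ii above c * mean
    W = np.diag(w)
    # Minimizing (y - K a)' W (y - K a) + lam * a' K a gives
    # a = (W K + lam I)^{-1} W y.
    alpha = np.linalg.solve(W @ K + lam * np.eye(len(y)), W @ y)
    return alpha

# Usage: a noisy sine curve with a few injected outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
y[::10] += 5.0  # outliers
alpha = robust_krr(X, y)  # dual coefficients; predict via rbf_kernel(Xnew, X) @ alpha
```

The key design point mirrored from the abstract is that the weights come from the hat-matrix diagonal rather than from residuals alone, so influential (high-leverage) outliers are suppressed before they distort the fit.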