Multiclass support vector machines with Gaussian kernel functions
(Optimization of multiclass SVM based on Gaussian kernel functions)

DC Field: Value
dc.contributor.advisor: Kil, Rhee-Man
dc.contributor.advisor: 길이만
dc.contributor.author: Kim, Hye-Jin
dc.contributor.author: 김혜진
dc.date.accessioned: 2011-12-14T04:54:01Z
dc.date.available: 2011-12-14T04:54:01Z
dc.date.issued: 2001
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=169449&flag=dissertation
dc.identifier.uri: http://hdl.handle.net/10203/42036
dc.description: Thesis (Master) - KAIST : Major in Applied Mathematics, 2001.8, [27 p.]
dc.description.abstract: This thesis presents a new approach to designing support vector machines (SVMs) with Gaussian kernel functions for multicategory classification: given the training samples of a multicategory classification problem, it suggests how support vectors can be selected from among the training samples and how discriminant functions can be constructed to classify the data. A short background on the generalization bound, which describes the relationship between the general and empirical risks of SVMs, is also given. For a classification problem, the SVM constructs the discriminant function representing the separating hyperplane according to the structural risk minimization (SRM) principle; that is, it optimizes the structure of the SVM so as to minimize the general risk. One weak point of this approach is that the SVM is intrinsically linear. For nonlinear decision boundaries, the SVM employs a nonlinear kernel, such as the Gaussian kernel function, instead of linear units. In this case, however, the SVM algorithm provides no systematic way of determining the kernel parameters. Accordingly, a new approach is considered that determines the kernel parameters by minimizing the mean square error (MSE) after the support vectors have been selected. The suggested algorithm consists of two parts: 1) selecting support vectors from among the training samples by optimizing a quadratic function defined as the sum of the decision error and a regularization term, such as the squared norm of the weight parameters of an estimation network, and 2) estimating the parameters of the SVM with kernel functions by minimizing the mean square error. To show the effectiveness of the suggested approach, simulations are performed on the classification of various benchmark data sets from the UCI machine learning repository.
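The two-part procedure summarized in the abstract can be sketched roughly as follows. This is a minimal illustration under loose assumptions, not the thesis's actual algorithm: the regularization weight `lam`, the fraction of samples kept as support vectors, the candidate widths `sigmas`, and all helper names are choices made here for the sketch. Part 1 fits a regularized kernel network (decision error plus squared-norm penalty) and keeps the samples with the largest weight magnitudes; Part 2 picks the Gaussian width that minimizes the MSE of discriminant functions built on those support vectors.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Pairwise Gaussian kernel K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def select_support_vectors(X, Y, sigma, lam=1e-2, keep=0.3):
    """Part 1 (sketch): minimize ||K w - Y||^2 + lam ||w||^2
    (decision error + regularization term) and keep the training
    samples whose weight rows have the largest norms."""
    K = gaussian_kernel(X, X, sigma)
    n = len(X)
    W = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ Y)  # ridge solution
    scores = np.linalg.norm(W, axis=1)
    n_keep = max(1, int(keep * n))
    return np.argsort(scores)[-n_keep:]

def fit_kernel_params(X, Y, sv_idx, sigmas):
    """Part 2 (sketch): choose the Gaussian width that minimizes the MSE
    of the discriminant functions built on the selected support vectors."""
    best = None
    for sigma in sigmas:
        K = gaussian_kernel(X, X[sv_idx], sigma)
        W, *_ = np.linalg.lstsq(K, Y, rcond=None)
        mse = np.mean((K @ W - Y) ** 2)
        if best is None or mse < best[0]:
            best = (mse, sigma, W)
    return best  # (mse, sigma, weights)

# Toy multicategory problem: three Gaussian blobs with one-hot targets.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(30, 2)) for c in ((0, 0), (2, 0), (1, 2))])
labels = np.repeat(np.arange(3), 30)
Y = np.eye(3)[labels]

sv_idx = select_support_vectors(X, Y, sigma=1.0)
mse, sigma, W = fit_kernel_params(X, Y, sv_idx, sigmas=[0.3, 0.5, 1.0, 2.0])
pred = np.argmax(gaussian_kernel(X, X[sv_idx], sigma) @ W, axis=1)
print("chosen sigma:", sigma, "train accuracy:", (pred == labels).mean())
```

Classification here is one-hot kernel regression followed by an argmax over the class outputs, which mirrors the abstract's "discriminant functions" view of multicategory SVM rather than a one-vs-one or one-vs-rest decomposition.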
dc.language: eng
dc.publisher: KAIST (한국과학기술원)
dc.subject: Multiclass Support Vector Machine
dc.subject: Multiclass classification
dc.title: Multiclass support vector machines with Gaussian kernel functions
dc.title.alternative: Optimization of multiclass SVM based on Gaussian kernel functions (가우스 요소함수에 기초한 다중분류 SVM의 최적화)
dc.type: Thesis (Master)
dc.identifier.CNRN: 169449/325007
dc.description.department: KAIST : Major in Applied Mathematics
dc.identifier.uid: 000993809
dc.contributor.localauthor: Kil, Rhee-Man
dc.contributor.localauthor: 길이만
Appears in Collection: MA-Theses_Master (Master's theses)
Files in This Item: There are no files associated with this item.
