Extraction of independent discriminant features for improved classification performance = 향상된 인식 성능을 위한 변별적 독립특징의 추출

DC Field: Value
dc.contributor.advisor: Lee, Soo-Young
dc.contributor.advisor: 이수영
dc.contributor.author: Dhir, Chandra Shekhar
dc.contributor.author: Dhir, C. S.
dc.contributor.author: 디르, 찬드라
dc.date.accessioned: 2011-12-12T07:25:58Z
dc.date.available: 2011-12-12T07:25:58Z
dc.date.issued: 2010
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=455338&flag=dissertation
dc.identifier.uri: http://hdl.handle.net/10203/27087
dc.description: Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology: Department of Bio and Brain Engineering, 2010.08, [xii, 104 p.]
dc.description.abstract: In this dissertation, a single-stage linear semi-supervised method for extracting discriminative independent features is proposed. Discriminant ICA (dICA) provides a framework for linearly projecting multivariate data onto a lower-dimensional space in which the features are maximally discriminant and minimally redundant. The optimization problem is formulated as the maximization of a linear combination of negentropy and a weighted functional measure of classification. Motivated by the independence among features, the Fisher linear discriminant is used as the functional measure of classification. Under the unit-covariance constraint on the extracted dICA features, maximizing the between-class scatter matrix of the features is theoretically equivalent to maximizing the Fisher linear discriminant. This not only reduces the computational complexity of the algorithm but also avoids the singularity posed by a small within-class scatter. Experimental results show improved classification performance when dICA features are used for recognition tasks, in comparison to both unsupervised and supervised feature extraction techniques. dICA features also yield lower data reconstruction error than LDA and the ICA method based on negentropy maximization. Prior to the work on dICA, a new feature selection (filtering) method was proposed to select discriminant features from a set of unsupervised features. The proposed hybrid feature selection (HFS) method is applied to the union of the feature subspaces selected by the Fisher criterion and by univariate feature-class mutual information (MI). It scores each feature as a linear weighted sum of its MI with the multinomial class distribution and its Fisher criterion score. HFS selects features with higher class discrimination than feature-class MI, the Fisher criterion, or unsupervised selection using the variance of features, resulting in improved recognition performance.
In addition, it was highlighted that HFS is an optimal feature selector ...
dc.language: eng
dc.publisher: 한국과학기술원 (Korea Advanced Institute of Science and Technology)
dc.subject: Mutual Information
dc.subject: Negentropy
dc.subject: Fisher Linear Discriminant
dc.subject: Discriminant ICA
dc.subject: User Profile Model
dc.subject: Maximization of mutual information relevance (상호의 정보 관련도 최대화)
dc.subject: Search in Synchrony (서치 인 싱크로니)
dc.subject: Fisher discriminant analysis (피셔 변별적 분석)
dc.subject: Negentropy (음엔트로피)
dc.subject: Discriminant independent component analysis (변별적 독립 성분 분석)
dc.title: Extraction of independent discriminant features for improved classification performance = 향상된 인식 성능을 위한 변별적 독립특징의 추출
dc.type: Thesis (Ph.D.)
dc.identifier.CNRN: 455338/325007
dc.description.department: Korea Advanced Institute of Science and Technology: Department of Bio and Brain Engineering (한국과학기술원 : 바이오및뇌공학과)
dc.identifier.uid: 020064504
dc.contributor.localauthor: Lee, Soo-Young
dc.contributor.localauthor: 이수영
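The dICA formulation in the abstract — maximizing a linear combination of negentropy and a weighted between-class scatter term (which, under the unit-covariance constraint, stands in for the Fisher linear discriminant) — can be sketched as below. This is a minimal illustration, not the dissertation's implementation: the `lam` weighting, the log-cosh negentropy approximation, and all variable names are assumptions; the HFS scoring function likewise assumes an unspecified mixing weight `alpha`.

```python
import numpy as np

def negentropy_approx(y):
    # Hyvarinen-style approximation J(y) ~ (E[G(y)] - E[G(nu)])^2
    # with G(y) = log cosh(y); E[G(nu)] for a standard Gaussian nu
    # is taken as a fixed constant (an assumption of this sketch).
    G_GAUSS = 0.37457
    return (np.log(np.cosh(y)).mean() - G_GAUSS) ** 2

def dica_objective(W, X, labels, lam=1.0):
    """Sketch of the dICA objective: per-feature negentropy plus a
    weighted between-class scatter term. Assumes X is whitened (zero
    mean, unit covariance), so that under the unit-covariance
    constraint on the features, maximizing between-class scatter is
    equivalent to maximizing the Fisher linear discriminant."""
    Y = W @ X  # projected features, shape (n_features, n_samples)
    neg = sum(negentropy_approx(y) for y in Y)
    # Between-class scatter: class-size-weighted squared distance of
    # each class mean from the overall mean.
    mu = Y.mean(axis=1, keepdims=True)
    sb = 0.0
    for c in np.unique(labels):
        Yc = Y[:, labels == c]
        diff = Yc.mean(axis=1, keepdims=True) - mu
        sb += Yc.shape[1] * float(diff.T @ diff)
    sb /= Y.shape[1]
    return neg + lam * sb

def hfs_score(mi, fisher, alpha=0.5):
    # HFS scores each feature as a linear weighted sum of its
    # feature-class mutual information and its Fisher criterion
    # score; the weight alpha is an assumption of this sketch.
    return alpha * np.asarray(mi) + (1 - alpha) * np.asarray(fisher)
```

Both objective terms are non-negative, so a larger value indicates features that are simultaneously more non-Gaussian (independent, in the ICA sense) and more class-separated; features would then be ranked or optimized with respect to this score.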
Appears in Collection
BiS-Theses_Ph.D. (doctoral dissertations)
Files in This Item
There are no files associated with this item.
