Extraction of independent discriminant features for improved classification performance

In this dissertation, a single-stage linear semi-supervised method for extracting discriminative independent features is proposed. Discriminant ICA (dICA) provides a framework for linearly projecting multivariate data to a lower dimension where the features are maximally discriminant and minimally redundant. The optimization problem is formulated as maximizing a linear combination of Negentropy and a weighted functional measure of classification. Motivated by the independence among features, the Fisher linear discriminant is used as the functional measure of classification. Under the unit-covariance constraint on the extracted dICA features, maximizing the between-class scatter matrix of the features is theoretically equivalent to maximizing the Fisher linear discriminant. This not only reduces the computational complexity of the algorithm but also avoids the singularity posed by small within-class scatter. Experimental results show improved classification performance when dICA features are used for recognition tasks, in comparison to unsupervised and supervised feature extraction techniques. dICA features also yield lower data reconstruction error than LDA and an ICA method based on Negentropy maximization. Prior to the work on dICA, a new feature selection (filtering) method was proposed to select discriminant features from a set of unsupervised features. The proposed hybrid feature selection (HFS) method is applied to the union of the feature subspaces selected by the Fisher criterion and by univariate feature-class mutual information (MI). It scores each feature as a linear weighted sum of its MI with the multinomial class distribution and its Fisher criterion score. HFS selects features with higher class discrimination than feature-class MI, the Fisher criterion, or unsupervised selection by feature variance, resulting in improved recognition performance. In addition, it was also shown that HFS is an optimal feature selector ...
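The HFS scoring rule described in the abstract, a weighted sum of a feature's class MI and its Fisher criterion score, can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the histogram-based MI estimator, the univariate Fisher score, and the weighting parameter `alpha` are all assumptions made here for concreteness.

```python
import numpy as np

def mutual_information(feature, labels, bins=10):
    """Estimate univariate feature-class MI (in nats) via a 2-D histogram.
    Assumption: a simple equal-width binning of the feature values."""
    edges = np.histogram_bin_edges(feature, bins=bins)[1:-1]
    f_binned = np.digitize(feature, edges)  # bin index in 0..bins-1
    classes = {c: i for i, c in enumerate(np.unique(labels))}
    joint = np.zeros((bins, len(classes)))
    for fb, c in zip(f_binned, labels):
        joint[fb, classes[c]] += 1
    joint /= joint.sum()
    pf = joint.sum(axis=1, keepdims=True)   # marginal over feature bins
    pc = joint.sum(axis=0, keepdims=True)   # marginal over classes
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pf @ pc)[nz])).sum())

def fisher_score(feature, labels):
    """Univariate Fisher criterion: between-class over within-class scatter."""
    mu = feature.mean()
    between = within = 0.0
    for c in np.unique(labels):
        fc = feature[labels == c]
        between += len(fc) * (fc.mean() - mu) ** 2
        within += len(fc) * fc.var()
    return between / within

def hfs_scores(X, labels, alpha=0.5):
    """Score each column of X as alpha * MI + (1 - alpha) * Fisher score.
    `alpha` is a hypothetical weighting parameter, not from the source."""
    return np.array([alpha * mutual_information(X[:, j], labels)
                     + (1 - alpha) * fisher_score(X[:, j], labels)
                     for j in range(X.shape[1])])
```

On data where one feature separates the classes and another is pure noise, the discriminative feature receives the higher HFS score, which is the filtering behavior the abstract claims.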
Advisors
Lee, Soo-Young
Publisher
KAIST (Korea Advanced Institute of Science and Technology)
Issue Date
2010
Identifier
455338/325007  / 020064504
Language
eng
Description

Doctoral dissertation (Ph.D.) - KAIST: Department of Bio and Brain Engineering, 2010.08, [xii, 104 p.]

Keywords

Mutual Information; Negentropy; Fisher Linear Discriminant; Discriminant ICA; User Profile Model; Mutual Information Relevance Maximization; Search in Synchrony

URI
http://hdl.handle.net/10203/27087
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=455338&flag=dissertation
Appears in Collection
BiS-Theses_Ph.D.(박사논문)
Files in This Item
There are no files associated with this item.
