Loss-Scaled Large-Margin Gaussian Mixture Models for Speech Emotion Classification

This paper considers a learning framework for speech emotion classification using a discriminant function based on Gaussian mixture models (GMMs). The GMM parameter set is estimated by margin scaling with a loss function, reducing the risk of predicting emotions with high loss. Here, the loss function is defined in terms of a distance metric on Watson and Tellegen's emotion model. Margin scaling is known to generalize well and is therefore well suited to emotion modeling, where the parameter set is likely to over-fit a training set whose characteristics may differ from those of the test set. Our learning framework is formulated as a constrained optimization problem, which is solved using semi-definite programming. Three tasks were evaluated: acted emotion classification, natural emotion classification, and cross-database emotion classification. In each task, four loss functions were evaluated. Across all experiments, the results consistently show that margin scaling improves classification accuracy over learning frameworks based on maximum likelihood, maximum mutual information, and the max-margin framework without margin scaling. Experimental results also show that margin scaling substantially reduces the overall loss compared to the max-margin framework without margin scaling.
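The core idea described above can be illustrated with a minimal sketch: a diagonal-covariance GMM supplies the per-emotion discriminant score, and margin scaling requires the true emotion's score to beat each competitor's score by at least the loss assigned to that confusion. The function names, the three-emotion example, and the loss values below are all hypothetical illustrations (real losses would come from distances in Watson and Tellegen's emotion space, and training would solve the resulting constrained problem with semi-definite programming, which is not shown here).

```python
import numpy as np

def logsumexp(a):
    """Numerically stable log-sum-exp over a 1-D array."""
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

def gmm_log_likelihood(x, weights, means, variances):
    """Log-likelihood of feature vector x (D,) under a diagonal-covariance GMM.

    weights: (K,) mixture weights; means, variances: (K, D).
    """
    diffs = x - means
    exponents = -0.5 * np.sum(diffs ** 2 / variances, axis=1)
    log_norm = -0.5 * np.sum(np.log(2.0 * np.pi * variances), axis=1)
    return logsumexp(np.log(weights) + exponents + log_norm)

def margin_violations(scores, y_true, loss):
    """Hinge slack of the loss-scaled margin constraints.

    Margin scaling demands score[y_true] - score[y] >= loss[y_true][y]
    for every competing emotion y; the slack is zero when a constraint
    is satisfied, positive when it is violated.
    """
    return {
        y: max(0.0, loss[y_true][y] - (scores[y_true] - scores[y]))
        for y in scores if y != y_true
    }

# Hypothetical 3-emotion example: discriminant scores for one utterance,
# and made-up losses (a far-apart confusion is penalized more heavily).
scores = {"happy": 2.0, "sad": 0.5, "angry": 1.8}
loss = {"happy": {"sad": 1.0, "angry": 2.0}}
print(margin_violations(scores, "happy", loss))
```

In this example the "sad" constraint is satisfied (the score gap of 1.5 exceeds the loss of 1.0), while "angry" is violated: the gap of 0.2 falls short of the required loss-scaled margin of 2.0, leaving positive slack that the training objective would drive down.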
Publisher
IEEE (Institute of Electrical and Electronics Engineers, Inc.)
Issue Date
2012-02
Language
English
Keywords

RECOGNITION

Citation

IEEE Transactions on Audio, Speech, and Language Processing, vol. 20, no. 2, pp. 585-598

ISSN
1558-7916
DOI
10.1109/TASL.2011.2162405
URI
http://hdl.handle.net/10203/100367
Appears in Collection
EE-Journal Papers (Journal Papers)