An improvement of decision tree-based unseen model prediction (결정트리를 이용한 비관측모델 추정방법 개선)

In large-vocabulary or vocabulary-independent speech recognition, when triphone models are used as acoustic models we often encounter unseen triphone models that are not covered by the training data. These unseen triphone models can be predicted by using decision trees. Decision tree-based state tying has been proposed in recent years as the most popular approach for clustering the states of context-dependent hidden Markov models. Its aims are to reduce the number of free parameters and to predict the probability distributions of the unseen models. For more accurate tree-based unseen model prediction, the important issues are how to define the question sets and the stop criteria, and how to select the question that optimally splits the current node into child nodes.

To predict unseen models for vocabulary-independent speech recognition, we propose three methods for constructing decision trees and observe the effects of CMN (cepstral mean normalization) on unseen model prediction with these methods. Two of the methods address the stop criteria and the question selection; the third is combined with the two aforementioned methods. The first method automatically decides the threshold values for the stop criteria, and the second adds a condition to the question selection in order to overcome the drawback of the conventional question selection. The third uses the probability distributions of fairly trained states in order to predict the unseen models reliably.

In this thesis, we perform experiments on four cases that test several environments of vocabulary-independent speech recognition. In the first case, the test vocabulary is totally different from the training vocabulary. In the second case, the test vocabulary differs from the training vocabulary by 75%. In the third case, the test vocabulary differs from the training vocabulary by 50%, and finally, the fourth case is that the test vocabulary is differe...
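The prediction mechanism described above can be made concrete with a short example: an unseen triphone is pushed down a phonetic decision tree by answering yes/no questions about its left and right context phones, and the leaf it reaches supplies the tied-state distribution that the unseen model borrows. The Python sketch below illustrates only this traversal step; the phone classes, questions, tree shape, and state names are illustrative assumptions, not the question sets or trained trees used in the thesis.

# Minimal sketch of phonetic decision-tree traversal for unseen triphone
# prediction. The question set, phone classes, and toy tree below are
# illustrative assumptions, not the thesis's actual trees or thresholds.

from dataclasses import dataclass
from typing import Callable, Optional

# A phonetic question asks whether the left or right context phone of a
# triphone belongs to a given phone class (e.g. nasals, vowels).
NASALS = {"m", "n", "ng"}
VOWELS = {"a", "e", "i", "o", "u"}

@dataclass
class Node:
    question: Optional[Callable[[str, str], bool]] = None  # (left, right) -> bool
    yes: Optional["Node"] = None
    no: Optional["Node"] = None
    tied_state: Optional[str] = None  # leaf: name of the tied-state distribution

def predict_state(root: Node, triphone: str) -> str:
    """Map a (possibly unseen) triphone written as 'l-c+r' to a tied state
    by answering the phonetic questions from the root down to a leaf."""
    left, rest = triphone.split("-")
    _, right = rest.split("+")
    node = root
    while node.tied_state is None:
        node = node.yes if node.question(left, right) else node.no
    return node.tied_state

# Toy tree for the centre phone 'a': split first on a nasal left context,
# then on a vowel right context.
tree = Node(
    question=lambda l, r: l in NASALS,
    yes=Node(tied_state="a_s2_tied_1"),
    no=Node(
        question=lambda l, r: r in VOWELS,
        yes=Node(tied_state="a_s2_tied_2"),
        no=Node(tied_state="a_s2_tied_3"),
    ),
)

# An unseen triphone such as 'n-a+t' still reaches a leaf, so its state
# distribution is borrowed from the tied state found there.
print(predict_state(tree, "n-a+t"))  # -> a_s2_tied_1

In practice the tree itself is grown on the training data by greedily choosing, at each node, the question that maximizes the gain in log likelihood of the pooled state observations, and splitting stops when the gain or the node occupancy falls below a threshold; the thesis's contributions concern exactly how those thresholds and questions are chosen.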
Advisors
Kim, Hoi-Rin (김회린)
Publisher
한국정보통신대학교
Issue Date
2003
Identifier
392287/225023 / 020014096
Language
eng
Description

Thesis (Master's) - 한국정보통신대학원대학교 : School of Engineering, 2003, [ ix, 50 p. ]

Keywords

Decision tree (결정트리); Unseen model (비관측모델)

URI
http://hdl.handle.net/10203/55208
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=392287&flag=dissertation
Appears in Collection
School of Engineering-Theses_Master(공학부 석사논문)
Files in This Item
There are no files associated with this item.
