Language Model Adaptation Based on Topic Probability of Latent Dirichlet Allocation

Cited 6 times in Web of Science; cited 0 times in Scopus
  • Hits: 445
  • Downloads: 579
DC Field | Value | Language
dc.contributor.author | Jeon, Hyung Bae | ko
dc.contributor.author | Lee, Soo-Young | ko
dc.date.accessioned | 2016-07-07T05:36:15Z | -
dc.date.available | 2016-07-07T05:36:15Z | -
dc.date.created | 2016-06-14 | -
dc.date.issued | 2016-06 | -
dc.identifier.citation | ETRI JOURNAL, v.38, no.3, pp.487 - 493 | -
dc.identifier.issn | 1225-6463 | -
dc.identifier.uri | http://hdl.handle.net/10203/209828 | -
dc.description.abstract | Two new methods are proposed for unsupervised adaptation of a language model (LM) with a single sentence for automatic transcription tasks. In the training phase, training documents are clustered by latent Dirichlet allocation (LDA), and a domain-specific LM is then trained for each cluster. In the test phase, an adapted LM is obtained as a linear mixture of the trained domain-specific LMs. Unlike previous adaptation methods, the proposed methods fully utilize the trained LDA model to estimate the weight values assigned to the trained domain-specific LMs; therefore, the clustering and weight-estimation algorithms of the trained LDA model are reliable. On continuous speech recognition benchmark tests, the proposed methods outperform other unsupervised LM adaptation methods based on latent semantic analysis, non-negative matrix factorization, and LDA with n-gram counting. | -
dc.language | English | -
dc.publisher | ELECTRONICS TELECOMMUNICATIONS RESEARCH INST | -
dc.title | Language Model Adaptation Based on Topic Probability of Latent Dirichlet Allocation | -
dc.type | Article | -
dc.identifier.wosid | 000377082900009 | -
dc.identifier.scopusid | 2-s2.0-84974574917 | -
dc.type.rims | ART | -
dc.citation.volume | 38 | -
dc.citation.issue | 3 | -
dc.citation.beginningpage | 487 | -
dc.citation.endingpage | 493 | -
dc.citation.publicationname | ETRI JOURNAL | -
dc.identifier.doi | 10.4218/etrij.16.0115.0499 | -
dc.embargo.liftdate | 2017-04-02 | -
dc.embargo.terms | 2017-04-02 | -
dc.contributor.localauthor | Lee, Soo-Young | -
dc.description.isOpenAccess | Y | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Language model adaptation | -
dc.subject.keywordAuthor | topic model | -
dc.subject.keywordAuthor | Latent Dirichlet allocation | -
dc.subject.keywordAuthor | weighted mixture model | -
dc.subject.keywordAuthor | LDA | -
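The abstract describes an adapted LM formed as a linear mixture of domain-specific LMs, with mixture weights taken from the LDA topic posterior of the test sentence. The following is a minimal sketch of that weighted-mixture step only; the toy documents, the two-cluster split, and the topic posterior `theta` are invented for illustration and do not come from the paper.

```python
from collections import Counter

def train_unigram_lm(docs):
    """Maximum-likelihood unigram LM from a list of tokenized documents."""
    counts = Counter(tok for doc in docs for tok in doc)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def adapt_lm(domain_lms, topic_probs):
    """Adapted LM: linear mixture of domain-specific LMs,
    weighted by the LDA topic probabilities of the test sentence."""
    vocab = set().union(*(lm.keys() for lm in domain_lms))
    return {w: sum(p * lm.get(w, 0.0) for p, lm in zip(topic_probs, domain_lms))
            for w in vocab}

# Hypothetical clusters standing in for LDA-derived document clusters.
sports_docs = [["game", "score", "team"], ["team", "win"]]
finance_docs = [["stock", "market"], ["market", "price", "stock"]]
domain_lms = [train_unigram_lm(sports_docs), train_unigram_lm(finance_docs)]

# Assumed topic posterior for a test sentence, as LDA inference would return it.
theta = [0.8, 0.2]
adapted = adapt_lm(domain_lms, theta)
print(round(adapted["team"], 3))  # 0.8 * P_sports("team") + 0.2 * 0
```

Because each domain LM is a proper distribution and the weights sum to one, the adapted LM is again a valid probability distribution over the combined vocabulary.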
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
95655.pdf (815.07 kB)
