Language model adaptation with neural mask generator and domain knowledge graph

In this thesis, we address methods for language model adaptation, which transfer a pre-trained language model to a specific domain. First, we introduce the Neural Mask Generator (NMG), which generates adaptive word maskings for language model adaptation through further pre-training based on the Masked Language Model objective. We empirically show that the NMG achieves better adaptation performance on Natural Language Understanding (NLU) tasks across various domains than rule-based baselines. Second, we propose a framework that integrates a domain knowledge graph into the fine-tuning stage of language model adaptation. We introduce a novel layer, knowledge-guided attention, for integrating the knowledge into the language model. We experimentally show that our framework achieves comparable or better performance on NLU tasks across various domains than the existing knowledge-based method and further pre-training in the language model adaptation scenario.
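
A minimal sketch of how adaptive masking could plug into further Masked Language Model pre-training is shown below. Everything here is an assumption for illustration: the scorer architecture, the [MASK] token id (103, the BERT convention), and the 15% masking budget are placeholders, and the reinforcement-learning loop the thesis uses to train the NMG is omitted.

    import torch
    import torch.nn as nn

    MASK_ID = 103        # assumed [MASK] token id (BERT convention)
    MASK_BUDGET = 0.15   # assumed fraction of tokens to mask

    class NeuralMaskGenerator(nn.Module):
        # Scores each token; high-scoring positions are chosen for masking.
        def __init__(self, hidden_size):
            super().__init__()
            self.scorer = nn.Linear(hidden_size, 1)

        def forward(self, token_embeddings):                   # (batch, seq, hidden)
            return self.scorer(token_embeddings).squeeze(-1)   # (batch, seq)

    def adaptive_mask(input_ids, token_embeddings, generator):
        # Mask the top-scoring tokens instead of sampling positions uniformly.
        scores = generator(token_embeddings)
        k = max(1, int(MASK_BUDGET * input_ids.size(1)))
        top_idx = scores.topk(k, dim=1).indices              # positions to mask
        labels = torch.full_like(input_ids, -100)            # -100 is ignored by cross-entropy
        labels.scatter_(1, top_idx, input_ids.gather(1, top_idx))
        masked = input_ids.clone()
        masked.scatter_(1, top_idx, MASK_ID)
        return masked, labels                                # inputs and targets for any MLM head

The returned pair can be fed to any masked-language-model head; only the choice of masked positions differs from the usual uniform sampling.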
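The knowledge-guided attention layer can likewise be sketched, under assumptions: token states from the language model attend over pre-computed knowledge-graph entity embeddings, and the retrieved knowledge is fused back through a residual connection with layer normalization. The class name, projections, and fusion scheme are illustrative guesses, not the thesis's exact formulation.

    import torch
    import torch.nn as nn

    class KnowledgeGuidedAttention(nn.Module):
        def __init__(self, hidden_size, kg_dim):
            super().__init__()
            self.query = nn.Linear(hidden_size, kg_dim)  # project tokens into KG space
            self.out = nn.Linear(kg_dim, hidden_size)    # project knowledge back
            self.norm = nn.LayerNorm(hidden_size)

        def forward(self, token_states, entity_embeddings):
            # token_states: (batch, seq, hidden); entity_embeddings: (n_entities, kg_dim)
            q = self.query(token_states)                            # (batch, seq, kg_dim)
            attn = torch.softmax(q @ entity_embeddings.T, dim=-1)   # attention over entities
            knowledge = attn @ entity_embeddings                    # (batch, seq, kg_dim)
            return self.norm(token_states + self.out(knowledge))    # residual fusion

Dropping such a layer between the encoder and the task head is one way it could enter fine-tuning; the entity embeddings could stay frozen or be updated jointly.
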
Advisors
Hwang, Sung Ju (황성주)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2021
Identifier
325007
Language
eng
Description

Master's thesis - KAIST : Graduate School of AI, 2021.8, [iv, 36 p.]

Keywords

Natural Language Understanding; Pre-trained Language Model; Domain; Language Model Adaptation; Meta Learning; Reinforcement Learning; Knowledge Graph

URI
http://hdl.handle.net/10203/294852
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=963751&flag=dissertation
Appears in Collection
AI-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
