Exploiting numerical-contextual knowledge to improve numerical reasoning over text in question answering

DC Field | Value | Language
dc.contributor.advisor | Myaeng, Sung Hyon | -
dc.contributor.advisor | 맹성현 | -
dc.contributor.author | Kim, Jeonghwan | -
dc.date.accessioned | 2023-06-26T19:31:25Z | -
dc.date.available | 2023-06-26T19:31:25Z | -
dc.date.issued | 2022 | -
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=997575&flag=dissertation | en_US
dc.identifier.uri | http://hdl.handle.net/10203/309524 | -
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): School of Computing, 2022.2, [iii, 19 p.] | -
dc.description.abstract | Numerical reasoning over text is a challenging subtask of question answering (QA) that requires understanding of both text and numbers. However, the language models commonly used as encoders in numerical reasoning QA models tend to rely on their pre-existing parametric knowledge rather than on the given context when interpreting the numbers in a text. This work proposes NC-BERT, a novel attention-masked reasoning model that learns to leverage number-related contextual knowledge to enhance the numerical reasoning capabilities of the QA model. Empirical results suggest that understanding numbers in their context and refining the numerical information in number embeddings lead to improved numerical reasoning accuracy on DROP, a numerical QA dataset. | -
dc.language | eng | -
dc.publisher | Korea Advanced Institute of Science and Technology (KAIST) | -
dc.title | Exploiting numerical-contextual knowledge to improve numerical reasoning over text in question answering | -
dc.title.alternative | 수치-문맥 정보를 활용한 질의응답 환경 내에서의 수치 추론 능력 개선 | -
dc.type | Thesis (Master) | -
dc.identifier.CNRN | 325007 | -
dc.description.department | Korea Advanced Institute of Science and Technology (KAIST): School of Computing | -
dc.contributor.alternativeauthor | 김정환 | -
Appears in Collection
CS-Theses_Master(석사논문)
Files in This Item
There are no files associated with this item.
