Exploiting numerical-contextual knowledge to improve numerical reasoning over text in question answering

Numerical reasoning over text is a challenging subtask of question answering (QA) that requires understanding both text and numbers. However, the pre-trained language models commonly used as encoders in numerical-reasoning QA models tend to rely on their pre-existing parametric knowledge rather than on the given context when interpreting the numbers in a passage. This work proposes a novel attention-masked reasoning model, NC-BERT, which learns to leverage number-related contextual knowledge to enhance the numerical reasoning capabilities of the QA model. The empirical results suggest that interpreting numbers in their context and refining the numerical information in the number embeddings lead to improved numerical reasoning accuracy on DROP, a numerical QA dataset.
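The abstract does not spell out NC-BERT's exact masking scheme, but the core idea of attention masking — constraining which positions a number token may attend to, so the model must draw on the surrounding context rather than parametric priors — can be illustrated with a minimal numpy sketch. The shapes, mask layout, and function names below are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def masked_attention(Q, K, V, mask):
    """Scaled dot-product attention with a binary mask.

    mask[i, j] == 1 allows token i to attend to token j;
    blocked positions are set to a large negative score so
    their softmax weight is effectively zero.
    """
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    scores = np.where(mask.astype(bool), scores, -1e9)
    weights = softmax(scores)
    return weights @ V, weights

# Toy example: 4 tokens; suppose token 2 is a number token that we
# restrict to attend only to itself and adjacent context tokens.
rng = np.random.default_rng(0)
d = 8
Q = rng.standard_normal((4, d))
K = rng.standard_normal((4, d))
V = rng.standard_normal((4, d))

mask = np.ones((4, 4))      # all tokens attend everywhere by default
mask[2] = 0                 # ...except the number token,
mask[2, 1:4] = 1            # which sees only positions 1..3

out, weights = masked_attention(Q, K, V, mask)
# weights[2, 0] is ~0: the number token cannot use the blocked position
```

The mask is applied before the softmax (rather than zeroing weights afterwards) so that the remaining attention weights still sum to one over the allowed context positions.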
Advisors
Myaeng, Sung Hyon
Description
Korea Advanced Institute of Science and Technology (KAIST): School of Computing
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2022
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology: School of Computing, 2022.2, [iii, 19 p.]

URI
http://hdl.handle.net/10203/309524
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=997575&flag=dissertation
Appears in Collection
CS-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
