Pre-training a Neural Model to Overcome Data Scarcity in Relation Extraction from Text

Abstract

Data scarcity is a major stumbling block in relation extraction. We propose an unsupervised pre-training method that extracts relational information from a large amount of unlabeled data prior to supervised learning, targeting situations in which gold-labeled data are hard to obtain. During the pre-training phase we adopt an objective function that requires no labeled data: the model learns to predict clue words crucial for inferring the semantic relation type between two entities in a given sentence. Experimental results on public datasets show that, in a data-scarce setting, our approach achieves comparable performance using only 70% of the training data.
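The abstract describes the pre-training objective only at a high level. The following is a minimal sketch, not the authors' implementation, of one plausible reading of that objective: mask the clue word between two entity mentions and train an encoder to recover it from unlabeled sentences. All class names, hyperparameters, and the LSTM encoder choice are illustrative assumptions.

```python
# Hypothetical sketch of a clue-word prediction pre-training objective.
# Not the paper's actual code; architecture and names are assumptions.
import torch
import torch.nn as nn

class CluePredictor(nn.Module):
    def __init__(self, vocab_size, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * dim, vocab_size)  # score each vocab word as the clue

    def forward(self, token_ids, clue_pos):
        # token_ids: (batch, seq_len), with the clue position replaced by a mask id
        h, _ = self.encoder(self.embed(token_ids))       # (batch, seq_len, 2*dim)
        masked = h[torch.arange(h.size(0)), clue_pos]    # hidden state at the mask
        return self.out(masked)                          # (batch, vocab_size)

# Toy usage: one sentence; the word between the two entities is held out.
vocab_size, MASK = 1000, 0
model = CluePredictor(vocab_size)
tokens = torch.tensor([[5, 42, MASK, 77, 9]])  # entities at positions 1 and 3
clue_pos = torch.tensor([2])                   # masked clue position
target = torch.tensor([314])                   # id of the held-out clue word
loss = nn.functional.cross_entropy(model(tokens, clue_pos), target)
loss.backward()  # unsupervised pre-training step; fine-tune later on labeled relations
```

No labels are needed for this step, since the prediction target is a word already present in the raw sentence; supervised relation classification would then start from the pre-trained encoder weights.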
Publisher
IEEE
Issue Date
2019-03
Language
English
Citation

IEEE International Conference on Big Data and Smart Computing (BigComp), pp. 176-180

ISSN
2375-933X
DOI
10.1109/BIGCOMP.2019.8679242
URI
http://hdl.handle.net/10203/274680
Appears in Collection
CS-Conference Papers (학술회의논문, Conference Papers)
Files in This Item
There are no files associated with this item.
