Pre-training a Neural Model to Overcome Data Scarcity in Relation Extraction from Text

DC Field: Value (Language)
dc.contributor.author: Jung, Seokwoo (ko)
dc.contributor.author: Myaeng, Sung-Hyon (ko)
dc.date.accessioned: 2020-06-16T03:20:32Z
dc.date.available: 2020-06-16T03:20:32Z
dc.date.created: 2020-06-12
dc.date.issued: 2019-03
dc.identifier.citation: IEEE International Conference on Big Data and Smart Computing (BigComp), pp. 176-180
dc.identifier.issn: 2375-933X
dc.identifier.uri: http://hdl.handle.net/10203/274680
dc.description.abstract: Data scarcity is a major stumbling block in relation extraction. We propose an unsupervised pre-training method for extracting relational information from a large amount of unlabeled data prior to supervised learning, for situations where gold-labeled data are hard to obtain. An objective function requiring no labeled data is adopted during the pre-training phase, which attempts to predict clue words crucial for inferring the semantic relation type between two entities in a given sentence. Experimental results on public datasets show that our approach achieves comparable performance using only 70% of the data in a data-scarce setting.
dc.language: English
dc.publisher: IEEE
dc.title: Pre-training a Neural Model to Overcome Data Scarcity in Relation Extraction from Text
dc.type: Conference
dc.identifier.wosid: 000469779800028
dc.identifier.scopusid: 2-s2.0-85064616052
dc.type.rims: CONF
dc.citation.beginningpage: 176
dc.citation.endingpage: 180
dc.citation.publicationname: IEEE International Conference on Big Data and Smart Computing (BigComp)
dc.identifier.conferencecountry: US
dc.identifier.conferencelocation: Kyoto, JAPAN
dc.identifier.doi: 10.1109/BIGCOMP.2019.8679242
dc.contributor.localauthor: Myaeng, Sung-Hyon
dc.contributor.nonIdAuthor: Jung, Seokwoo
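The abstract describes a pre-training objective that predicts clue words between two entities without any relation labels. The data-preparation side of such an objective can be sketched as below; this is a minimal illustrative assumption about how such examples might be built (the function name, the `[MASK]` convention, and the entity-position interface are hypothetical, not the authors' code):

```python
# Hypothetical sketch of the self-supervised setup described in the abstract:
# every word between the two entity mentions is treated as a candidate "clue
# word", masked out, and used as a prediction target. No relation labels are
# required, so any unlabeled sentence with two marked entities yields data.

def clue_word_examples(tokens, e1_idx, e2_idx, mask_token="[MASK]"):
    """Return (masked_tokens, target_word) pairs, one for each word
    strictly between the two entity positions."""
    lo, hi = sorted((e1_idx, e2_idx))
    examples = []
    for i in range(lo + 1, hi):
        masked = list(tokens)
        masked[i] = mask_token  # hide the candidate clue word
        examples.append((masked, tokens[i]))
    return examples

# Example: entities at positions 0 and 2; the clue word "founded" in
# between becomes the prediction target for pre-training.
tokens = ["Steve_Jobs", "founded", "Apple"]
pairs = clue_word_examples(tokens, 0, 2)
```

A model pre-trained to fill in such masked positions is then fine-tuned on the small labeled relation-extraction set.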
Appears in Collection: CS-Conference Papers (학술회의논문)
Files in This Item
There are no files associated with this item.
