DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jung, Seokwoo | ko |
dc.contributor.author | Myaeng, Sung-Hyon | ko |
dc.date.accessioned | 2020-06-16T03:20:32Z | - |
dc.date.available | 2020-06-16T03:20:32Z | - |
dc.date.created | 2020-06-12 | - |
dc.date.issued | 2019-03 | - |
dc.identifier.citation | IEEE International Conference on Big Data and Smart Computing (BigComp), pp.176 - 180 | - |
dc.identifier.issn | 2375-933X | - |
dc.identifier.uri | http://hdl.handle.net/10203/274680 | - |
dc.description.abstract | Data scarcity is a major stumbling block in relation extraction. We propose an unsupervised pre-training method for extracting relational information from a large amount of unlabeled data prior to supervised learning, for situations in which gold labeled data are hard to obtain. An objective function that requires no labeled data is adopted during the pre-training phase, with the aim of predicting clue words crucial for inferring semantic relation types between two entities in a given sentence. Experimental results on public datasets show that our approach achieves comparable performance using only 70% of the data in a data-scarce setting. | - |
dc.language | English | - |
dc.publisher | IEEE | - |
dc.title | Pre-training a Neural Model to Overcome Data Scarcity in Relation Extraction from Text | - |
dc.type | Conference | - |
dc.identifier.wosid | 000469779800028 | - |
dc.identifier.scopusid | 2-s2.0-85064616052 | - |
dc.type.rims | CONF | - |
dc.citation.beginningpage | 176 | - |
dc.citation.endingpage | 180 | - |
dc.citation.publicationname | IEEE International Conference on Big Data and Smart Computing (BigComp) | - |
dc.identifier.conferencecountry | US | - |
dc.identifier.conferencelocation | Kyoto, JAPAN | - |
dc.identifier.doi | 10.1109/BIGCOMP.2019.8679242 | - |
dc.contributor.localauthor | Myaeng, Sung-Hyon | - |
dc.contributor.nonIdAuthor | Jung, Seokwoo | - |