Utilizing a transformer for link prediction

Transformers have been widely employed and have demonstrated success in many domains, such as language and vision. While recent approaches have extended them to the graph domain, e.g., for graph property prediction, their success has been limited to small-scale graphs. In practice, however, it is crucial to efficiently predict graph properties (e.g., node and edge properties) for large graph-structured data, such as data from the Internet and biology. To this end, we propose a novel Link Prediction Transformer, called LiT, that exploits specialized tokens from extracted subgraphs as embedding inputs to the Transformer for link prediction in a large-scale graph. With our proposed type, node, and position identifiers, which effectively encode graph properties, we verify that local subgraphs embed sufficient information for predicting links in a graph, and demonstrate performance comparable or superior to that of state-of-the-art Graph Neural Network (GNN)-based models. We further identify important nodes and edges from the attention score matrices, and examine the effect of each identifier, as well as of each encoding within the position identifier, in the subgraph token encoder.
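The abstract describes the core mechanism: tokens drawn from a local subgraph around the candidate edge, each embedded as the sum of a feature projection and three learned identifiers (type, node, and position), fed to a standard Transformer encoder. The PyTorch sketch below illustrates that token construction under stated assumptions; it is a minimal illustration, not the thesis implementation. The class and parameter names (SubgraphLinkTransformer, max_nodes, max_dist), the mean-pooling readout, and the use of hop distance as the position identifier are all hypothetical choices for this sketch.

```python
# A minimal sketch of subgraph tokens with type/node/position identifiers.
# Hypothetical names and encodings; not the LiT implementation from the thesis.
import torch
import torch.nn as nn


class SubgraphLinkTransformer(nn.Module):
    """Scores a candidate link (u, v) from tokens of its local subgraph."""

    def __init__(self, feat_dim, d_model=64, n_heads=4, n_layers=2,
                 max_nodes=32, max_dist=8):
        super().__init__()
        self.feat_proj = nn.Linear(feat_dim, d_model)
        # Type identifier: 0 = ordinary subgraph node, 1 = source u, 2 = target v.
        self.type_emb = nn.Embedding(3, d_model)
        # Node identifier: index of the node within the extracted subgraph.
        self.node_emb = nn.Embedding(max_nodes, d_model)
        # Position identifier: assumed here to be the hop distance to the
        # nearer endpoint of the candidate edge.
        self.pos_emb = nn.Embedding(max_dist, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.readout = nn.Linear(d_model, 1)

    def forward(self, feats, types, node_ids, dists):
        # feats: (B, N, feat_dim); types/node_ids/dists: (B, N) integer ids.
        tokens = (self.feat_proj(feats) + self.type_emb(types)
                  + self.node_emb(node_ids) + self.pos_emb(dists))
        h = self.encoder(tokens)                         # (B, N, d_model)
        # Pool over subgraph tokens and score the candidate link.
        return self.readout(h.mean(dim=1)).squeeze(-1)   # (B,) logits


if __name__ == "__main__":
    B, N, F = 2, 16, 8                       # batch, subgraph size, feature dim
    model = SubgraphLinkTransformer(feat_dim=F)
    feats = torch.randn(B, N, F)
    types = torch.zeros(B, N, dtype=torch.long)
    types[:, 0], types[:, 1] = 1, 2          # mark endpoints u and v
    node_ids = torch.arange(N).expand(B, N)
    dists = torch.randint(0, 8, (B, N))
    logits = model(feats, types, node_ids, dists)
    print(torch.sigmoid(logits))             # link probabilities
```

Summing the identifier embeddings mirrors how token, segment, and position embeddings are combined in BERT-style encoders; the type identifier is what lets the model distinguish the two endpoints of the candidate link from the rest of the subgraph.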
Advisors
Kim, Myoung Ho (김명호)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2023
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: School of Computing, 2023.2, [iii, 21 p.]

Keywords

Artificial Intelligence; Deep Learning; Graph Neural Networks; Transformers; Link Prediction

URI
http://hdl.handle.net/10203/309507
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1033101&flag=dissertation
Appears in Collection
CS-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
