AST based code summarization using context-aware graph transformer
(Korean title: 구문트리와 그래프 트랜스포머를 활용한 소스 코드 요약 연구, "A study on source code summarization using syntax trees and graph Transformers")

Abstract
When applying the Transformer architecture to source code, designing a good self-attention mechanism is critical, as it determines how node relationships are extracted from the Abstract Syntax Tree (AST) of the source code. We present the Code Structure Aware Transformer (CSA-Trans), which uses a Code Structure Embedder (CSE) to generate a dedicated Positional Encoding (PE) for each node in the AST; the CSE produces these node PEs using disentangled attention. To further extend the self-attention capability, we adopt Stochastic Block Model (SBM) attention. Our evaluation shows that our PE captures the relationships between AST nodes better than other graph-related PE techniques. We also show, through quantitative and qualitative analysis, that SBM attention generates more node-specific attention coefficients. CSA-Trans outperforms 14 baselines on code summarization tasks for both Python and Java, while being 41.92% faster than AST-Trans and 25.31% more memory efficient than SG-Trans on the Java dataset.
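The disentangled-attention idea mentioned above can be illustrated with a minimal NumPy sketch: the attention score between two nodes is decomposed into content-to-content, content-to-position, and position-to-content terms computed from separate content and positional embeddings. All shapes, weights, and embeddings below are random placeholders for illustration, not the thesis implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 6, 8                        # number of AST nodes, hidden size (hypothetical)
H = rng.normal(size=(n, d))        # node content embeddings
P = rng.normal(size=(n, d))        # node positional embeddings

# Separate query/key projections for content (c) and position (p)
Wq_c, Wk_c = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wq_p, Wk_p = rng.normal(size=(d, d)), rng.normal(size=(d, d))

Qc, Kc = H @ Wq_c, H @ Wk_c
Qp, Kp = P @ Wq_p, P @ Wk_p

# Disentangled score: content-to-content + content-to-position + position-to-content
scores = (Qc @ Kc.T + Qc @ Kp.T + Qp @ Kc.T) / np.sqrt(3 * d)

# Row-wise softmax yields per-node attention coefficients over all nodes
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)

print(attn.shape)  # one attention row per AST node
```

Because the positional terms enter the score separately from the content terms, structural position can reweight attention independently of node content, which is the property the CSE exploits when producing node PEs.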
Advisors
Shin Yoo (유신)
Description
Korea Advanced Institute of Science and Technology: School of Computing
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2024
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: School of Computing, 2024.2, [iv, 28 p.]

Keywords

Deep learning; Graph neural networks (GNN); Transformer; Code summarization

URI
http://hdl.handle.net/10203/321788
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1097313&flag=dissertation
Appears in Collection
CS-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
