(A) study for efficient spatial attention module

Spatial attention modules are widely used in deep neural networks. The monumental attention module, the Transformer [11], was proposed as a self-attention, encoder-decoder framework for the machine translation task and improved the ability to learn long-range dependencies. Later, spatial attention modules such as the Non-local block [12] and the Criss-cross attention block [5] were proposed and improved performance in vision fields including action recognition, segmentation, and object detection. Despite this great success, spatial attention can only be used in a limited way because of its expensive computation and memory costs. Several methods, such as segmentation or pooling, have been proposed to reduce this overhead, but stacking enough blocks is still impractical. Multi-head attention is likewise not used in vision and video tasks due to the memory limitation. In this paper, we analyze which factors play an important role in learning a spatial attention module from the viewpoint of a geometric definition, and propose the 'Transposed attention module', which is faster and smaller than the Non-local block [12] for the same number of blocks and heads. The biggest advantage of our module is that memory and speed are maintained as the number of heads increases. We also introduce a new approach to interpreting spatial attention modules and obtain superior performance compared to the Non-local block [12] on CIFAR-10, CIFAR-100 [9], and Tiny-ImageNet.
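The abstract does not spell out the module's mechanics, so the following is only a minimal PyTorch sketch of one plausible reading of a "transposed" attention: computing the affinity over the channel axis rather than over spatial positions, so each head holds a (C/h x C/h) map instead of the (HW x HW) map of a Non-local block. The class name TransposedAttention, the head count, and the scaling factor below are illustrative assumptions, not the thesis's code.

```python
import torch
import torch.nn as nn

class TransposedAttention(nn.Module):
    """Illustrative sketch (not the thesis code): attention over channels.

    A Non-local block forms an (HW x HW) affinity map per head, so memory
    grows quadratically with resolution and linearly with head count.
    Attending over the channel axis instead gives a (C/h x C/h) map per
    head; the total map size C^2/h is independent of resolution and
    shrinks as the number of heads h grows.
    """

    def __init__(self, channels: int, heads: int = 4):
        super().__init__()
        assert channels % heads == 0
        self.heads = heads
        self.qkv = nn.Conv2d(channels, channels * 3, kernel_size=1)
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=1)
        # reshape to (batch, heads, channels_per_head, positions)
        q = q.reshape(b, self.heads, c // self.heads, h * w)
        k = k.reshape(b, self.heads, c // self.heads, h * w)
        v = v.reshape(b, self.heads, c // self.heads, h * w)
        # channel-to-channel affinity: (C/h x C/h), never (HW x HW)
        attn = torch.softmax(q @ k.transpose(-2, -1) / (h * w) ** 0.5, dim=-1)
        out = (attn @ v).reshape(b, c, h, w)
        return x + self.proj(out)  # residual connection, as in the Non-local block

x = torch.randn(2, 64, 32, 32)
print(TransposedAttention(64, heads=8)(x).shape)  # torch.Size([2, 64, 32, 32])
```

Under this reading, the total attention-map size across heads is h * (C/h)^2 = C^2/h, which shrinks as the head count grows; that would be consistent with the abstract's claim that memory and speed are maintained as the number of heads increases.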
Advisors
Kim, Junmo (김준모)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2020
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology : School of Electrical Engineering, 2020.2, [iii, 19 p.]

Keywords

Spatial attention; geometric definition; multi-head; faster and smaller; Transposed attention

URI
http://hdl.handle.net/10203/284776
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=911406&flag=dissertation
Appears in Collection
EE-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
