Attention in deep neural networks for object recognition

It is well known that attention plays an important role in human perception [31, 57, 10]. One important property of the human visual system is that it does not attempt to process a whole scene at once. Instead, humans exploit a sequence of partial glimpses and selectively focus on salient parts in order to capture visual structure better [37]. There have been only a few attempts to incorporate an attention mechanism to improve the performance of convolutional neural networks (CNNs) on recognition tasks. In this dissertation, we focus on how to utilize the attention mechanism in the context of deep CNN design for object recognition. We make the following hypothesis: viewing a CNN as an approximator of the human visual system, adding attention mechanisms within the CNN will facilitate effective feature learning. We propose two types of attention-integrated deep CNNs: attention in the network backbone and attention in a task-specific head. Specifically, we design a simple yet effective attention module, called the Convolutional Block Attention Module (CBAM), and apply it to both the backbone and the task-specific head of a deep CNN. We conduct extensive subjective and objective evaluations and show the efficacy of the proposed method in both settings.
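The abstract describes CBAM only at a high level. As published, CBAM applies channel attention (a shared MLP over average- and max-pooled channel descriptors) followed by spatial attention (a convolution over channel-wise average- and max-pooled maps), each producing a sigmoid-gated scaling of the feature map. The following is a minimal NumPy sketch, not the thesis implementation: the MLP weights `w1`/`w2` are hypothetical placeholders, and a uniform filter stands in for the learned 7×7 spatial convolution.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    """Channel attention: shared MLP over avg- and max-pooled descriptors.

    x: feature map of shape (C, H, W); w1: (C//r, C), w2: (C, C//r)
    are placeholder weights for the shared bottleneck MLP.
    """
    avg = x.mean(axis=(1, 2))                    # (C,) average-pooled descriptor
    mx = x.max(axis=(1, 2))                      # (C,) max-pooled descriptor
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0)   # two-layer MLP with ReLU
    scale = sigmoid(mlp(avg) + mlp(mx))          # (C,) per-channel gate in (0, 1)
    return x * scale[:, None, None]

def spatial_attention(x, k=7):
    """Spatial attention: k x k filter over channel-pooled maps.

    A uniform filter is used here as a stand-in for the learned convolution.
    """
    pooled = np.stack([x.mean(axis=0), x.max(axis=0)])  # (2, H, W)
    pad = k // 2
    padded = np.pad(pooled, ((0, 0), (pad, pad), (pad, pad)))
    H, W = x.shape[1:]
    conv = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            # average over both pooled maps and the k x k window
            conv[i, j] = padded[:, i:i + k, j:j + k].sum() / (2 * k * k)
    return x * sigmoid(conv)[None, :, :]          # per-location gate in (0, 1)

def cbam(x, w1, w2, k=7):
    """Sequential channel-then-spatial attention, as in CBAM."""
    return spatial_attention(channel_attention(x, w1, w2), k)
```

Because both gates are sigmoid outputs in (0, 1), the module rescales features without changing the tensor shape, which is what lets it be dropped into a backbone block or a task-specific head without further modification.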
Advisors
Kweon, In So (권인소)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2019
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2019.2, [v, 40 p.]

Keywords

Attention mechanism; deep learning; object recognition

URI
http://hdl.handle.net/10203/266706
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=843405&flag=dissertation
Appears in Collection
EE-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
