DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Kim, Changick | - |
dc.contributor.advisor | 김창익 | - |
dc.contributor.author | Kang, HeeKwang | - |
dc.date.accessioned | 2018-06-20T06:21:11Z | - |
dc.date.available | 2018-06-20T06:21:11Z | - |
dc.date.issued | 2017 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=675338&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/243244 | - |
dc.description | Thesis (Master's) - KAIST : School of Electrical Engineering, 2017.2, [iii, 38 p.] | - |
dc.description.abstract | Comparing image patches is a core task in many areas of computer vision. A number of hand-crafted features have been used to find the position in a target image most similar to a given pattern. However, these approaches still suffer from many limitations in challenging environments. In this thesis, we propose a data-driven approach based on convolutional neural networks (CNNs) for robust matching. We design new CNN architectures that measure the similarity of two images and carry out template matching through the trained network. We demonstrate that our template matching method achieves state-of-the-art performance even in real-world environments. Moreover, we present a study that determines a suitable CNN architecture through network visualization. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Comparing image patches | - |
dc.subject | Template matching | - |
dc.subject | Deep learning | - |
dc.subject | Image correspondence | - |
dc.subject | Convolutional neural network | - |
dc.subject | 이미지 비교 | - |
dc.subject | 템플릿 매칭 | - |
dc.subject | 딥러닝 | - |
dc.subject | 이미지 대응 | - |
dc.subject | 컨볼루션 신경 회로망 | - |
dc.title | Comparing image patches using convolutional neural networks | - |
dc.title.alternative | 컨볼루션 신경망을 이용한 이미지 패치 비교 | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 전기및전자공학부 | - |
dc.contributor.alternativeauthor | 강희광 | - |
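The abstract describes measuring the similarity of two image patches with a trained CNN and using that score to perform template matching over a target image. As a rough illustration only (this is not the thesis's actual architecture; the single convolutional layer, the cosine score, and all function names below are hypothetical stand-ins for a learned network), a shared convolutional embedding followed by a sliding-window similarity search can be sketched as:

```python
import numpy as np

def conv2d_relu(img, kernel):
    """One hypothetical 'CNN layer': valid cross-correlation + ReLU."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)

def cosine_sim(a, b):
    """Similarity score between two flattened feature maps."""
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def match_template(image, template, kernel):
    """Embed image and template through the shared layer, then slide the
    template embedding over the image embedding and keep the best score."""
    f_img = conv2d_relu(image, kernel)
    f_tpl = conv2d_relu(template, kernel)
    th, tw = f_tpl.shape
    best_score, best_pos = -1.0, None
    for i in range(f_img.shape[0] - th + 1):
        for j in range(f_img.shape[1] - tw + 1):
            s = cosine_sim(f_img[i:i + th, j:j + tw], f_tpl)
            if s > best_score:
                best_score, best_pos = s, (i, j)
    return best_pos, best_score

# Toy usage: plant a 5x5 patch of the image as the template and recover
# its top-left position via the embedding-space search.
rng = np.random.default_rng(0)
image = rng.random((12, 12))
template = image[3:8, 4:9].copy()
kernel = rng.random((3, 3))     # stand-in for learned filter weights
pos, score = match_template(image, template, kernel)
```

In the thesis the fixed random `kernel` would instead be learned filters, and the cosine score would be replaced by the network's trained similarity head; the sliding-window search over the shared embedding is the part this sketch illustrates.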