DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Kim, Munchurl | - |
dc.contributor.advisor | 김문철 | - |
dc.contributor.author | Seo, Wonyong | - |
dc.date.accessioned | 2023-06-26T19:33:35Z | - |
dc.date.available | 2023-06-26T19:33:35Z | - |
dc.date.issued | 2023 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1032894&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/309823 | - |
dc.description | Master's thesis - KAIST : School of Electrical Engineering, 2023.2, [iii, 35 p.] | - |
dc.description.abstract | Optical flow estimation is the task of estimating per-pixel motion between temporally consecutive images. Beyond measuring the movement of objects in video, it also serves as a front-end for action recognition, frame interpolation, video super-resolution, and various other downstream tasks. When used as a component of such tasks, it is constrained by GPU memory and inference speed. Most recent optical flow estimation algorithms are built on all-pairs cost volumes, which compute the cosine similarity between every pair of feature pixels in the two input images (a minimal sketch of this computation appears after this record). Such algorithms achieve markedly better flow estimation accuracy, but their memory use and inference time grow quadratically with the input resolution. In this paper, we estimate optical flow with a multi-axis transformer and achieve comparable performance while using fewer resources, i.e., less GPU memory and inference time. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Optical flow estimation; light-weight network; cost volume | - |
dc.subject | 광학 흐름 예측; 저복잡도; 코스트 볼륨 | - |
dc.title | Light weight optical flow estimation using shuffle transformer | - |
dc.title.alternative | 셔플 트랜스포머를 이용한 저복잡도 광학 흐름 예측 연구 | - |
dc.type | Thesis(Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 전기및전자공학부 | - |
dc.contributor.alternativeauthor | 서원용 | - |
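
The abstract's quadratic-cost argument is easy to make concrete. The snippet below is a minimal sketch, not the thesis's implementation: it builds the all-pairs cost volume the abstract describes, i.e., the cosine similarity between every pair of feature pixels in two frames. The function name, tensor shapes, and the example feature resolution are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def all_pairs_cost_volume(feat1: torch.Tensor, feat2: torch.Tensor) -> torch.Tensor:
    """Sketch of an all-pairs cost volume (names/shapes are assumptions).

    feat1, feat2: (B, C, H, W) feature maps of two consecutive frames.
    Returns a (B, H*W, H*W) volume of cosine similarities; its size grows
    quadratically in H*W, which is the cost the abstract refers to.
    """
    b, c, h, w = feat1.shape
    # Flatten the spatial dims and L2-normalize over channels so that
    # dot products between feature pixels equal cosine similarities.
    f1 = F.normalize(feat1.flatten(2), dim=1)    # (B, C, H*W)
    f2 = F.normalize(feat2.flatten(2), dim=1)    # (B, C, H*W)
    # Correlate every feature pixel of frame 1 with every pixel of frame 2.
    return torch.einsum('bci,bcj->bij', f1, f2)  # (B, H*W, H*W)

if __name__ == "__main__":
    # Hypothetical 1/8-resolution feature maps of two frames.
    f1 = torch.randn(1, 256, 46, 62)
    f2 = torch.randn(1, 256, 46, 62)
    cv = all_pairs_cost_volume(f1, f2)
    print(cv.shape)  # torch.Size([1, 2852, 2852])
```

Even at this modest size the volume holds 2852² ≈ 8.1 million entries per batch item, and a 128×128 feature map would already need 16384² ≈ 2.7 × 10⁸. This quadratic growth is the bottleneck the thesis targets by replacing the dense all-pairs volume with a multi-axis (shuffle) transformer.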