DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Kim, Changick | - |
dc.contributor.advisor | 김창익 | - |
dc.contributor.author | Eun, Hyunjun | - |
dc.date.accessioned | 2021-05-12T19:47:20Z | - |
dc.date.available | 2021-05-12T19:47:20Z | - |
dc.date.issued | 2020 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=947913&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/284549 | - |
dc.description | 학위논문(박사) - 한국과학기술원 : 전기및전자공학부, 2020.2,[viii, 80 p. :] | - |
dc.description.abstract | Detecting actions in untrimmed videos is an important yet challenging task. In this dissertation, we propose two new approaches for offline and online action detection. First, we introduce a temporal convolutional network, called Temporal Relation Network (TRN), for offline action detection. Specifically, TRN produces a 2D relatedness score map based on a novel concept of snippet relatedness, which represents which snippets are related to a specific action instance; TRN then evaluates the action confidence scores of the temporal intervals and refines their boundaries to obtain temporal action proposals. Our relatedness score map enables the generation of various temporal intervals that reliably cover most action instances with high overlap. On two benchmark datasets, THUMOS-14 and ActivityNet-1.3, the proposed method outperforms state-of-the-art methods for temporal action proposal generation. Furthermore, TRN leads to significant improvements in temporal action detection when combined with existing action classification networks. Second, we present a novel method, named Temporal Filtering Network (TFN), for online action detection. TFN aims to distinguish between relevant and irrelevant information in a streaming, untrimmed video. To this end, we introduce a filtering module that learns relevance scores indicating how relevant the information is to the current action. Our filtering module emphasizes information relevant to the current action while filtering out information from background and unrelated actions. We conduct extensive experiments on the THUMOS-14 and TVSeries datasets, where the proposed method outperforms state-of-the-art methods by a large margin. We also show the effectiveness of the filtering module through comprehensive ablation studies. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | temporal action detection | - |
dc.subject | offline action detection | - |
dc.subject | online action detection | - |
dc.subject | temporal relation network | - |
dc.subject | temporal filtering network | - |
dc.subject | 시간적 행동 검출 | - |
dc.subject | 오프라인 행동 검출 | - |
dc.subject | 온라인 행동 검출 | - |
dc.subject | 시간적 관계 네트워크 | - |
dc.subject | 시간적 필터링 네트워크 | - |
dc.title | Temporal Convolutional Networks for Offline and Online Action Detection | - |
dc.title.alternative | 오프라인 및 온라인 행동 검출을 위한 시간적 컨볼루션 네트워크 | - |
dc.type | Thesis(Ph.D) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 전기및전자공학부 | - |
dc.contributor.alternativeauthor | 은현준 | - |