Automated augmentation for knowledge distillation

Abstract
In knowledge distillation, data augmentation serves to augment the input data, the medium through which knowledge is distilled from teacher to student. In this setting, the teacher network and the augmentation used for teacher pre-training affect how well an augmentation performs. We therefore propose a novel data augmentation search method that takes both the teacher network and the teacher's augmentation into account. Building on automated augmentation, we show how the KD loss can be used in the search objective so that the teacher network is taken into consideration. Moreover, we propose a policy distance that measures the difference between two augmentation policies; our objective uses it to maximize the distance from the teacher's augmentation policy. We demonstrate the effect of the proposed method by analyzing how augmentations change the data distribution. Through these analyses, we show that the proposed method searches for an improved data augmentation policy for knowledge distillation.
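The thesis files are not available from this record, so purely as illustration, the sketch below shows how a search objective of the shape described in the abstract could combine a standard KD loss with a distance term that pushes a candidate augmentation policy away from the teacher's pre-training policy. The (probability, magnitude) policy representation, the `policy_distance` function, and the trade-off weight `lam` are assumptions for illustration, not the thesis's actual formulation.

# Minimal sketch (not the thesis implementation): a Hinton-style KD loss plus
# one plausible definition of a distance between two augmentation policies.
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """KD loss: KL divergence between temperature-softened distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature**2


def policy_distance(policy_a, policy_b):
    """Hypothetical policy distance: mean L1 gap between per-operation
    (probability, magnitude) parameters of two augmentation policies."""
    ops = sorted(set(policy_a) | set(policy_b))
    dist = 0.0
    for op in ops:
        pa, ma = policy_a.get(op, (0.0, 0.0))
        pb, mb = policy_b.get(op, (0.0, 0.0))
        dist += abs(pa - pb) + abs(ma - mb)
    return dist / len(ops)


if __name__ == "__main__":
    # Toy usage: a candidate policy is scored by the student's KD loss while the
    # distance term rewards policies that differ from the teacher's augmentation.
    teacher_policy = {"ShearX": (0.6, 0.3), "Color": (0.4, 0.7)}
    candidate_policy = {"Rotate": (0.8, 0.5), "Color": (0.2, 0.9)}

    student_logits = torch.randn(8, 10)
    teacher_logits = torch.randn(8, 10)

    lam = 0.1  # illustrative trade-off weight
    score = kd_loss(student_logits, teacher_logits) - lam * policy_distance(
        candidate_policy, teacher_policy
    )
    print(f"search objective (lower is better): {score.item():.4f}")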
Advisors
Yun, Se-Young (윤세영)
Description
Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2022
Identifier
325007
Language
eng
Description
Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI, 2022.2, [iii, 34 p.]

URI
http://hdl.handle.net/10203/308192
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=997687&flag=dissertation
Appears in Collection
AI-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.