DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Kim, Dae-Shik | - |
dc.contributor.advisor | 김대식 | - |
dc.contributor.author | Kim, Gyeongman | - |
dc.date.accessioned | 2022-04-27T19:30:51Z | - |
dc.date.available | 2022-04-27T19:30:51Z | - |
dc.date.issued | 2021 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=948683&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/295927 | - |
dc.description | Thesis (Master's) - KAIST : School of Electrical Engineering, 2021.2, [iii, 18 p.] | - |
dc.description.abstract | Knowledge distillation is a popular network compression method that improves the performance of a small network (student) by employing the output logits of a pre-trained large network (teacher). However, previous studies implicitly assume that the teacher network always provides beneficial knowledge through its logits. In this study, we identify the problem that distilling unreliable knowledge from the teacher's predictions degrades the student. To tackle this problem, we propose a balancing knowledge distillation method that regulates the degree of knowledge distillation by utilizing the prior data distribution obtained from the trained teacher. The proposed method can reflect various data distributions that encode the reliability of the knowledge. Our results show that the balancing method based on the prior data distribution improves knowledge distillation regardless of the dataset. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Deep learning▼aNetwork compression▼aKnowledge distillation▼aReliability▼adata distribution | - |
dc.subject | 딥러닝▼a네트워크 압축▼a지식 증류▼a지식 전달▼a지식 신뢰도▼a데이터 분포 | - |
dc.title | Balancing knowledge distillation via reliability of knowledge | - |
dc.title.alternative | 지식의 신뢰도에 따른 균형 지식 증류 기법 | - |
dc.type | Thesis(Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 전기및전자공학부 | - |
dc.contributor.alternativeauthor | 김경만 | - |
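The abstract above describes regulating the degree of distillation per sample according to the reliability of the teacher's knowledge. A minimal sketch of such a balanced distillation loss, assuming the teacher's softmax confidence as a stand-in reliability weight (the thesis derives its weight from the prior data distribution; the function names, the confidence-based weighting rule, and the hyperparameter values here are illustrative assumptions, not the thesis's exact method):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss_balanced(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Knowledge distillation with a per-sample balancing weight.

    Illustrative only: the weight w is the teacher's max class
    probability (a confidence proxy), standing in for the thesis's
    prior-data-distribution term. Unreliable teacher predictions
    (low w) shift the loss toward the hard-label cross-entropy.
    """
    n = len(labels)
    # cross-entropy with hard labels (T=1), per sample
    p_hard = softmax(student_logits)
    ce = -np.log(p_hard[np.arange(n), labels] + 1e-12)
    # KL(teacher || student) at temperature T, per sample
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # balancing weight: teacher confidence at T=1
    w = softmax(teacher_logits).max(axis=-1)
    # T^2 rescales the soft-target gradient, as in standard distillation
    return np.mean((1 - alpha * w) * ce + alpha * w * (T ** 2) * kl)
```

With identical student and teacher logits the KL term vanishes and only the (down-weighted) hard-label term remains; a highly confident but disagreeing teacher pushes the student strongly toward its soft targets.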