DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Kang, Joonhyuk | - |
dc.contributor.advisor | 강준혁 | - |
dc.contributor.author | Ahn, Jin-Hyun | - |
dc.date.accessioned | 2021-05-12T19:45:30Z | - |
dc.date.available | 2021-05-12T19:45:30Z | - |
dc.date.issued | 2020 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=924534&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/284446 | - |
dc.description | Doctoral thesis - Korea Advanced Institute of Science and Technology (KAIST) : School of Electrical Engineering, 2020.8, [iv, 102 p.] | - |
dc.description.abstract | In this thesis, we investigate the performance of Federated Learning (FL), Federated Distillation (FD), and Hybrid Federated Distillation (HFD), the latter of which is proposed in this thesis, under wireless implementations. First, we propose analog and digital transmission schemes for implementing FD and HFD over wireless communication environments, and we compare the performance of FL, FD, and HFD under wireless implementations, considering both the uplink and the downlink. Furthermore, we modify the analog transmission scheme of FL for the case in which distributed stochastic gradient descent (DSGD) is adopted for the weight update. The proposed schemes show considerable improvements, especially for non-i.i.d. data allocation with a small number of allowed channel uses, a regime that the previous analog transmission scheme does not support well. We also propose an update strategy for the digital transmission scheme for DSGD that adopts element-wise averaging; the performance improves considerably when this strategy is applied to the state-of-the-art digital transmission scheme for DSGD with error accumulation. Finally, we propose a local weight update scheme for the introduced DSGD implementations, with which both the digital and analog transmission schemes show considerable improvements over the previous DSGD implementations. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | distributed training; federated learning; federated distillation; joint source-channel coding; over-the-air computation; distributed stochastic gradient descent; error accumulation | - |
dc.subject | 분산 훈련; 연합 학습; 연합 증류; 동시 소스 채널 코딩; 공중 계산; 분산 스토캐스틱 그래디언트 하강; 오류 축적 | - |
dc.title | Research on federated learning and distillation with implementation in wireless communication environment | - |
dc.title.alternative | 연합 학습 및 연합 증류 기법 연구 및 무선 통신 환경에서의 구현을 위한 통신 기법 연구 | - |
dc.type | Thesis(Ph.D) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 전기및전자공학부 | - |
dc.contributor.alternativeauthor | 안진현 | - |
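The abstract mentions digital transmission for DSGD with error accumulation. As an illustration only (not code from the thesis), the sketch below shows the standard error-accumulation idea behind such schemes: each worker transmits only the top-k largest-magnitude gradient entries, keeps the untransmitted residual in a local memory, and folds that residual back into the next gradient so no information is permanently dropped. Function and variable names here are hypothetical.

```python
# Illustrative sketch of top-k gradient sparsification with error
# accumulation (error feedback), as used in digital DSGD transmission.
import numpy as np

def sparsify_with_error_feedback(grad, memory, k):
    """Compress a gradient to its k largest-magnitude entries.

    The residual (the part not transmitted) is returned as the new
    memory and is added back before the next compression step.
    """
    corrected = grad + memory                  # fold in previously dropped mass
    idx = np.argsort(np.abs(corrected))[-k:]   # indices of the top-k entries
    sparse = np.zeros_like(corrected)
    sparse[idx] = corrected[idx]               # transmitted (sparse) part
    new_memory = corrected - sparse            # residual kept locally
    return sparse, new_memory

# Toy usage: one worker compressing a 6-dimensional gradient to 2 entries.
memory = np.zeros(6)
g = np.array([0.1, -2.0, 0.3, 1.5, -0.2, 0.05])
sparse, memory = sparsify_with_error_feedback(g, memory, k=2)
```

By construction, `sparse + memory` always equals the error-corrected gradient, which is what makes the compression lossless over time rather than per step.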