Research on federated learning and distillation with implementation in wireless communication environment

DC Field: Value (Language)
dc.contributor.advisor: Kang, Joonhyuk
dc.contributor.advisor: 강준혁 (Kang, Joonhyuk)
dc.contributor.author: Ahn, Jin-Hyun
dc.date.accessioned: 2021-05-12T19:45:30Z
dc.date.available: 2021-05-12T19:45:30Z
dc.date.issued: 2020
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=924534&flag=dissertation (en_US)
dc.identifier.uri: http://hdl.handle.net/10203/284446
dc.description: Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST), School of Electrical Engineering, 2020.8, [iv, 102 p.]
dc.description.abstract: In this thesis, we investigate the performance of Federated Learning (FL), Federated Distillation (FD), and Hybrid Federated Distillation (HFD), the last of which is proposed in this thesis, under wireless implementations. First, we propose analog and digital transmission schemes for implementing FD and HFD over wireless channels, and we compare the performance of FL, FD, and HFD under these wireless implementations, accounting for both the uplink and the downlink. Furthermore, we modify the analog transmission scheme of FL for the case in which distributed stochastic gradient descent (DSGD) is adopted for the weight update. The proposed schemes show considerable improvements, especially under non-i.i.d. data allocation with a small number of allowed channel uses, a regime that the previous analog transmission scheme supports poorly. We also propose an update strategy for the digital transmission scheme for DSGD based on element-wise averaging; it considerably improves performance when combined with the state-of-the-art digital transmission scheme for DSGD with error accumulation. Finally, we propose a local weight update scheme for the introduced DSGD implementations, with which both the digital and the analog transmission schemes improve considerably over the previous DSGD implementations. (A minimal sketch of the error-accumulation and element-wise-averaging ideas follows this record.)
dc.language: eng
dc.publisher: Korea Advanced Institute of Science and Technology (KAIST)
dc.subject: distributed training; federated learning; federated distillation; joint source-channel coding; over-the-air computation; distributed stochastic gradient descent; error accumulation
dc.title: Research on federated learning and distillation with implementation in wireless communication environment
dc.title.alternative: Research on federated learning and federated distillation, and on communication techniques for their implementation in wireless environments (translated from Korean)
dc.type: Thesis (Ph.D.)
dc.identifier.CNRN: 325007
dc.description.department: Korea Advanced Institute of Science and Technology (KAIST), School of Electrical Engineering
dc.contributor.alternativeauthor: 안진현 (Ahn, Jin-Hyun)
Appears in Collection: EE-Theses_Ph.D. (Doctoral theses)
Files in This Item
There are no files associated with this item.
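
The abstract's digital DSGD scheme rests on two mechanisms worth making concrete: error accumulation (the unsent part of a compressed update is carried into the next round) and an element-wise averaging update at the server. Below is a minimal, self-contained Python sketch of these two ideas under stated assumptions: the compressor is top-k sparsification as a stand-in for whatever quantizer the thesis actually uses, all function names are hypothetical, and the element-wise rule (average each coordinate only over the devices that transmitted it) is one plausible reading of the abstract, not the thesis's exact algorithm.

```python
import numpy as np

def top_k_sparsify(vec, k):
    """Keep the k largest-magnitude entries of vec, zero the rest
    (a stand-in for the rate-limited digital uplink)."""
    out = np.zeros_like(vec)
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out

def elementwise_average(msgs):
    """Average each coordinate only over the devices that actually sent a
    nonzero value for it (one plausible reading of the abstract's
    'element-wise averaging update scheme')."""
    stacked = np.stack(msgs)
    counts = np.maximum((stacked != 0).sum(axis=0), 1)  # avoid divide-by-zero
    return stacked.sum(axis=0) / counts

def dsgd_round(grads, residuals, k):
    """One communication round of DSGD with error accumulation: each device
    compresses (gradient + carried-over residual), transmits the compressed
    message, and keeps the unsent part as the next round's residual."""
    sent = []
    for i, g in enumerate(grads):
        corrected = g + residuals[i]        # error accumulation / feedback
        msg = top_k_sparsify(corrected, k)  # what the channel actually carries
        residuals[i] = corrected - msg      # unsent mass carried forward
        sent.append(msg)
    return elementwise_average(sent), residuals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    num_devices, dim, k = 10, 1000, 50
    grads = [rng.normal(size=dim) for _ in range(num_devices)]
    residuals = [np.zeros(dim) for _ in range(num_devices)]
    update, residuals = dsgd_round(grads, residuals, k)
    print(update.shape, float(np.abs(residuals[0]).sum()))
```

The residual is what separates error accumulation from plain lossy compression: whatever the per-round budget k drops is re-offered in later rounds, which is the usual reason such schemes remain stable under aggressive rate limits.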
