FEWER: Federated Weight Recovery

DC Field | Value | Language
dc.contributor.author | Shin, Y | ko
dc.contributor.author | Lee, G | ko
dc.contributor.author | Shin, S | ko
dc.contributor.author | Yun, Seyoung | ko
dc.contributor.author | Moon, Il-Chul | ko
dc.date.accessioned | 2021-01-28T06:06:16Z | -
dc.date.available | 2021-01-28T06:06:16Z | -
dc.date.created | 2020-11-25 | -
dc.date.issued | 2020-12-01 | -
dc.identifier.citation | 1st Workshop on Distributed Machine Learning, DistributedML 2020, co-located with the 16th International Conference on emerging Networking EXperiments and Technologies, CoNEXT 2020 | -
dc.identifier.uri | http://hdl.handle.net/10203/280114 | -
dc.description.abstract | In federated learning, local devices train the model independently on their local data, and the server gathers the locally trained models and aggregates them into a shared global model. Federated learning is therefore an approach that decouples model training from direct access to the local data. However, the requirement of periodic communication of model parameters is a primary bottleneck for the efficiency of federated learning. This work proposes a novel federated learning algorithm, Federated Weight Recovery (FEWER), which trains a sparsely pruned model during the training phase. FEWER starts from an extremely sparse model and gradually grows the model capacity until the model becomes dense at the end of training. The level of sparsity becomes a lever for either increasing accuracy or decreasing communication cost, and this sparsification can be beneficial to practitioners. Our experimental results show that FEWER achieves higher test accuracies with lower communication costs for most of the test cases. © 2020 ACM. | -
dc.language | English | -
dc.publisher | ACM | -
dc.title | FEWER: Federated Weight Recovery | -
dc.type | Conference | -
dc.identifier.scopusid | 2-s2.0-85097722853 | -
dc.type.rims | CONF | -
dc.citation.publicationname | 1st Workshop on Distributed Machine Learning, DistributedML 2020, co-located with the 16th International Conference on emerging Networking EXperiments and Technologies, CoNEXT 2020 | -
dc.identifier.conferencecountry | SP | -
dc.identifier.conferencelocation | Virtual | -
dc.identifier.doi | 10.1145/3426745.3431335 | -
dc.contributor.localauthor | Yun, Seyoung | -
dc.contributor.localauthor | Moon, Il-Chul | -
dc.contributor.nonIdAuthor | Shin, Y | -
dc.contributor.nonIdAuthor | Lee, G | -
dc.contributor.nonIdAuthor | Shin, S | -
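
The abstract above describes FEWER's central mechanism: training begins from an extremely sparse model, and the density is grown each round until the model is dense, so only unpruned weights need to be communicated. The sketch below illustrates that grow-to-dense idea; it is not the authors' implementation, and the linear density schedule, the magnitude-based masking, and all names (density_at_round, magnitude_mask, aggregate) are illustrative assumptions.

    # Minimal sketch of a grow-to-dense federated round (not the paper's code).
    # Assumptions: a linear density schedule from d0 to 1.0 over T rounds, and
    # magnitude-based masks deciding which weights are communicated/aggregated.
    import numpy as np

    def density_at_round(t, T, d0=0.05):
        """Fraction of weights kept at round t; grows linearly toward dense."""
        return min(1.0, d0 + (1.0 - d0) * t / T)

    def magnitude_mask(weights, density):
        """Keep the top-`density` fraction of weights by magnitude."""
        k = max(1, int(density * weights.size))
        threshold = np.sort(np.abs(weights), axis=None)[-k]
        return np.abs(weights) >= threshold

    def aggregate(client_weights, mask):
        """FedAvg over the masked (communicated) entries only."""
        avg = np.mean(client_weights, axis=0)
        return avg * mask  # pruned entries stay zero until the mask grows

    # Toy usage: 3 clients, one flat parameter vector, 10 rounds.
    rng = np.random.default_rng(0)
    global_w = rng.normal(size=100)
    for t in range(10):
        mask = magnitude_mask(global_w, density_at_round(t, T=10))
        # Each client trains locally; local updates stand in as small noise here.
        clients = [global_w * mask + 0.01 * rng.normal(size=100) for _ in range(3)]
        global_w = aggregate(np.stack(clients), mask)

In a real federated setting, only the surviving (masked) weights would be transmitted each round, which is where the communication savings at high sparsity would come from under this sketch's assumptions.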
Appears in Collection
RIMS Conference Papers; IE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
