Sparsification on Different Federated Learning Schemes: Comparative Analysis

dc.contributor.author: Lee, Jongmyeong [ko]
dc.contributor.author: Jang, Youngsu [ko]
dc.contributor.author: Lee, Jangho [ko]
dc.contributor.author: Kang, Joonhyuk [ko]
dc.date.accessioned: 2022-12-05T01:01:05Z
dc.date.available: 2022-12-05T01:01:05Z
dc.date.created: 2022-12-02
dc.date.issued: 2022-10-20
dc.identifier.citation: The 13th International Conference on ICT Convergence, ICTC 2022, pp. 2044-2047
dc.identifier.uri: http://hdl.handle.net/10203/301578
dc.description.abstract: High communication overhead is a major bottleneck in federated learning (FL). To mitigate it, sparsification is employed in various compression frameworks. In standard FL, local clients upload their updated weights to the server; under sparsification, however, clients instead upload the difference between the updated weights and the original weights. This study confirms the importance of uploading weight differences in sparsification and measures how the accuracy of the two schemes differs.
dc.language: English
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.title: Sparsification on Different Federated Learning Schemes: Comparative Analysis
dc.type: Conference
dc.identifier.scopusid: 2-s2.0-85143256431
dc.type.rims: CONF
dc.citation.beginningpage: 2044
dc.citation.endingpage: 2047
dc.citation.publicationname: The 13th International Conference on ICT Convergence, ICTC 2022
dc.identifier.conferencecountry: KO
dc.identifier.conferencelocation: Ramada Plaza Hotel Jeju & Online
dc.identifier.doi: 10.1109/ICTC55196.2022.9952431
dc.contributor.localauthor: Kang, Joonhyuk
dc.contributor.nonIdAuthor: Lee, Jongmyeong
dc.contributor.nonIdAuthor: Lee, Jangho
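The comparison the abstract describes — uploading sparsified weights directly versus uploading sparsified weight differences — can be sketched as follows. This is a minimal NumPy toy example, not the paper's actual experimental setup; the top-k sparsifier, the random weights, and all variable names are illustrative assumptions.

```python
import numpy as np

def top_k_sparsify(vec, k):
    """Keep the k largest-magnitude entries of vec; zero out the rest."""
    out = np.zeros_like(vec)
    if k > 0:
        idx = np.argpartition(np.abs(vec), -k)[-k:]
        out[idx] = vec[idx]
    return out

# Hypothetical toy setup: global weights and one client's locally updated weights.
rng = np.random.default_rng(0)
w_global = rng.normal(size=10)
w_local = w_global + 0.01 * rng.normal(size=10)  # small local update

k = 3  # communication budget: upload only 3 of 10 values

# Scheme A: sparsify and upload the updated weights themselves.
upload_weights = top_k_sparsify(w_local, k)
recon_a = upload_weights  # server's view of the client's weights

# Scheme B: sparsify and upload the weight difference, applied on top of
# the global weights the server already holds.
delta = w_local - w_global
upload_delta = top_k_sparsify(delta, k)
recon_b = w_global + upload_delta

# At the same budget, the difference-based scheme reconstructs the local
# weights far more accurately, since the dropped entries are small deltas
# rather than full weight values.
err_a = np.linalg.norm(recon_a - w_local)
err_b = np.linalg.norm(recon_b - w_local)
assert err_b < err_a
```

In this sketch the reconstruction error of the difference-based upload is orders of magnitude smaller, which is consistent with the abstract's claim that uploading weight differences matters for sparsification in FL.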
Appears in Collection
EE-Conference Papers(학술회의논문)
Files in This Item
There are no files associated with this item.
