Model Compression via Pattern Shared Sparsification in Analog Federated Learning Under Communication Constraints

Cited 1 time in Web of Science; cited 0 times in Scopus
DC Field | Value | Language
dc.contributor.author | Ahn, Jin-Hyun | ko
dc.contributor.author | Bennis, Mehdi | ko
dc.contributor.author | Kang, Joonhyuk | ko
dc.date.accessioned | 2023-04-03T05:00:49Z | -
dc.date.available | 2023-04-03T05:00:49Z | -
dc.date.created | 2023-04-03 | -
dc.date.issued | 2023-03 | -
dc.identifier.citation | IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, v.7, no.1, pp.298 - 312 | -
dc.identifier.issn | 2473-2400 | -
dc.identifier.uri | http://hdl.handle.net/10203/305954 | -
dc.description.abstract | Recently, it has been shown that analog-transmission-based federated learning (FL) enables more efficient use of communication resources than conventional digital transmission. In this paper, we propose an effective model compression strategy that enables analog FL under constrained communication bandwidth. To this end, the proposed approach is based on pattern-shared sparsification: all edge devices apply the same sparsification pattern to the parameter vectors they upload, as opposed to each edge device applying sparsification independently. In particular, we propose specific schemes for determining the sparsification pattern and characterize the convergence of analog FL under these sparsification strategies by deriving a closed-form upper bound on the convergence rate and residual error. The closed-form expression captures the effect of the communication bandwidth and power budget on the performance of analog FL. In terms of convergence analysis, the model parameter obtained with the proposed schemes is proven to converge to the optimal model parameter. Numerical results show that the proposed pattern-shared sparsification consistently improves the performance of analog FL across various system-parameter settings; the improvement is more significant under scarce communication bandwidth and a limited transmit power budget. | -
dc.language | English | -
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | -
dc.title | Model Compression via Pattern Shared Sparsification in Analog Federated Learning Under Communication Constraints | -
dc.type | Article | -
dc.identifier.wosid | 000965990300001 | -
dc.identifier.scopusid | 2-s2.0-85134264743 | -
dc.type.rims | ART | -
dc.citation.volume | 7 | -
dc.citation.issue | 1 | -
dc.citation.beginningpage | 298 | -
dc.citation.endingpage | 312 | -
dc.citation.publicationname | IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING | -
dc.identifier.doi | 10.1109/TGCN.2022.3186538 | -
dc.contributor.localauthor | Kang, Joonhyuk | -
dc.contributor.nonIdAuthor | Ahn, Jin-Hyun | -
dc.contributor.nonIdAuthor | Bennis, Mehdi | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Distributed learning | -
dc.subject.keywordAuthor | federated learning | -
dc.subject.keywordAuthor | over-the-air computation | -
dc.subject.keywordAuthor | compression | -
dc.subject.keywordAuthor | local gradient accumulation | -
dc.subject.keywordPlus | STOCHASTIC GRADIENT DESCENT | -
dc.subject.keywordPlus | OVER-THE-AIR COMPUTATION | -
dc.subject.keywordPlus | OPTIMIZATION | -
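
A minimal sketch of the pattern-shared sparsification idea described in the abstract, for readers who want a concrete picture. This is not the authors' implementation: the paper proposes specific schemes for choosing the pattern, whereas the sketch below simply draws a random pattern from a seed shared by all devices and the server. The device count K, dimension d, bandwidth budget k, noise level, and all variable names are illustrative assumptions.

```python
# Sketch: pattern-shared sparsification with over-the-air aggregation and
# local gradient accumulation. Illustrative only; not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

K = 10      # edge devices
d = 1000    # model dimension
k = 100     # bandwidth budget: channel uses (entries) per round

grads = [rng.normal(size=d) for _ in range(K)]   # local updates this round
residuals = [np.zeros(d) for _ in range(K)]      # local gradient accumulation

# Shared pattern: every device keeps the SAME k indices, drawn here from a
# seed known to all devices and the server (an assumed selection rule).
pattern = np.random.default_rng(42).choice(d, size=k, replace=False)

rx = np.zeros(k)
for i in range(K):
    update = grads[i] + residuals[i]   # fold in previously unsent entries
    rx += update[pattern]              # analog superposition over the air
    residuals[i] = update
    residuals[i][pattern] = 0.0        # accumulate the entries not sent

rx += rng.normal(scale=0.01, size=k)   # receiver noise on the analog channel

aggregate = np.zeros(d)
aggregate[pattern] = rx / K            # server reads off the averaged update
print(f"non-zero entries in aggregate: {np.count_nonzero(aggregate)} of {d}")
```

Design note: because all devices use one pattern, the K sparse vectors align entry by entry and their over-the-air sum still fits in k channel uses. With independently chosen patterns, the superposition would spread over up to K·k distinct indices, so the server could not recover an aligned aggregate within the same bandwidth budget.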
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.