Hierarchical Broadcast Coding: Expediting Distributed Learning at the Wireless Edge

Cited 6 times in Web of Science; cited 0 times in Scopus
DC Field / Value / Language
dc.contributor.author: Han, Dongjun (ko)
dc.contributor.author: Sohn, Jy Yong (ko)
dc.contributor.author: Moon, Jaekyun (ko)
dc.date.accessioned: 2021-04-22T06:30:07Z
dc.date.available: 2021-04-22T06:30:07Z
dc.date.created: 2020-12-01
dc.date.issued: 2021-04
dc.identifier.citation: IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, v.20, no.4, pp.2266-2281
dc.identifier.issn: 1536-1276
dc.identifier.uri: http://hdl.handle.net/10203/282522
dc.description.abstract: Distributed learning plays a key role in reducing the training time of modern deep neural networks with massive datasets. In this paper, we consider a distributed learning problem where gradient computation is carried out over a number of computing devices at the wireless edge. We propose hierarchical broadcast coding, a provable coding-theoretic framework to speed up distributed learning at the wireless edge. Our contributions are threefold. First, motivated by the hierarchical nature of real-world edge computing systems, we propose a layered code which mitigates the effects of not only packet losses at the wireless computing nodes but also straggling access points (APs or small base stations). Second, by strategically allocating data partitions to nodes in the overlapping areas between cells, our technique achieves the fundamental lower bound on the computational load needed to combat stragglers. Finally, we take advantage of the broadcast nature of wireless networks, by which wireless devices in overlapping cell coverage broadcast to more than one AP. This further reduces the overall training time in the presence of straggling APs. Experimental results on Amazon EC2 confirm the advantage of the proposed methods in speeding up learning. Our design targets any gradient-descent-based learning algorithm, including linear/logistic regression and deep learning.
dc.language: English
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.title: Hierarchical Broadcast Coding: Expediting Distributed Learning at the Wireless Edge
dc.type: Article
dc.identifier.wosid: 000639747400008
dc.identifier.scopusid: 2-s2.0-85097953091
dc.type.rims: ART
dc.citation.volume: 20
dc.citation.issue: 4
dc.citation.beginningpage: 2266
dc.citation.endingpage: 2281
dc.citation.publicationname: IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS
dc.identifier.doi: 10.1109/TWC.2020.3040792
dc.contributor.localauthor: Moon, Jaekyun
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: Distributed learning
dc.subject.keywordAuthor: Gradient descent
dc.subject.keywordAuthor: Stragglers
dc.subject.keywordAuthor: Wireless edge
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
