Cooperative Learning via Federated Distillation over Fading Channels

Cited 15 times in Web of Science · Cited 9 times in Scopus
Cooperative training methods for distributed machine learning are typically based on the exchange of local gradients or local model parameters. The latter approach is known as Federated Learning (FL). An alternative solution with reduced communication overhead, referred to as Federated Distillation (FD), has recently been proposed; it exchanges only averaged model outputs. While prior work studied implementations of FL over wireless fading channels, here we propose wireless protocols for FD and for an enhanced version thereof that leverages an offline communication phase to communicate "mixed-up" covariate vectors. The proposed implementations combine digital schemes based on separate source-channel coding with over-the-air computing strategies based on analog joint source-channel coding. It is shown that the enhanced version of FD has the potential to significantly outperform FL in the presence of limited spectral resources.
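As a rough illustration of the ideas in the abstract (not code from the paper), the following Python sketch shows how an FD payload of per-label averaged logits could be aggregated by analog over-the-air computing across a fading multiple-access channel. All names, dimensions, the Rayleigh fading model, and the truncated channel-inversion precoder are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, NUM_CLASSES = 4, 10

def local_avg_logits(logits, labels, num_classes):
    # FD payload: the average model output (logit vector) per ground-truth label.
    table = np.zeros((num_classes, num_classes))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            table[c] = logits[mask].mean(axis=0)
    return table

# Hypothetical local data: each client holds model outputs and labels.
payloads = []
for _ in range(NUM_CLIENTS):
    labels = rng.integers(0, NUM_CLASSES, size=256)
    logits = rng.normal(size=(256, NUM_CLASSES))
    payloads.append(local_avg_logits(logits, labels, NUM_CLASSES))

# Analog over-the-air computing: clients transmit simultaneously with
# truncated channel-inversion precoding, so the fading channel itself
# sums the payloads; the server observes only the noisy superposition.
noise_std, h_min = 0.05, 0.1
rx = np.zeros((NUM_CLASSES, NUM_CLASSES))
for x in payloads:
    h = max(rng.rayleigh(scale=1.0), h_min)  # fading gain, assumed known at the tx
    tx = x / h                               # pre-invert the channel
    rx += h * tx                             # channel scales by h; signals add on air
rx += rng.normal(scale=noise_std, size=rx.shape)

teacher_logits = rx / NUM_CLIENTS            # server estimate of the global average
print(teacher_logits.shape)                  # (10, 10): per-label distillation targets

The averaged table would then be broadcast back to the clients as distillation targets. A digital scheme based on separate source-channel coding would instead quantize and encode each payload on orthogonal channel uses, which is the trade-off the paper studies under limited spectral resources.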
Publisher
Institute of Electrical and Electronics Engineers Inc.
Issue Date
2020-05-04
Language
English
Citation

45th IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2020), pp. 8856-8860

ISSN
1520-6149
DOI
10.1109/ICASSP40776.2020.9053448
URI
http://hdl.handle.net/10203/274416
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.