Bayesian learning and unlearning in distributed wireless network

Bayesian federated learning (FL) offers a principled framework for designing collaborative training algorithms that can quantify epistemic uncertainty and produce trustworthy decisions. Upon the completion of collaborative training, an agent may decide to exercise her legal "right to be forgotten", which calls for her contribution to the jointly trained model to be deleted and discarded. This thesis studies FL and unlearning in a decentralized network within a Bayesian framework. It first develops federated variational inference solutions based on the decentralized solution of local free energy minimization problems within exponential-family models and on local gossip-driven communication.

The thesis then leverages the flexibility of non-parametric Bayesian approximate inference to develop a novel Bayesian federated unlearning method, referred to as Forget-Stein Variational Gradient Descent (Forget-SVGD). Variational particle-based Bayesian learning methods have the advantage of not being limited by the bias that affects more conventional parametric techniques. Upon the completion of FL, Forget-SVGD carries out local SVGD updates at the agents whose data need to be "unlearned". The proposed method is validated via performance comparisons with non-parametric schemes that train from scratch while excluding the data to be forgotten, as well as with existing parametric Bayesian unlearning methods.

Finally, conventional frequentist FL schemes are known to yield overconfident decisions. Bayesian FL addresses this issue by allowing agents to process and exchange uncertainty information encoded in distributions over the model parameters, but it does so at the cost of a larger per-iteration communication overhead. The thesis investigates whether Bayesian FL can still provide calibration advantages when the communication bandwidth is constrained, and presents compressed particle-based Bayesian protocols for FL and federated "unlearning" that apply quantization and sparsification across multiple particles. The experimental results confirm that the benefits of Bayesian FL are robust to bandwidth constraints.
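
To make the particle-based unlearning step concrete, the sketch below shows a single SVGD-style update in which the score of the target distribution is taken as the learned posterior score minus the score of the likelihood of the data to be forgotten. This is only a minimal toy illustration of the mechanism described above, assuming Gaussian scores and an RBF kernel; the function names and constants are illustrative choices, not the Forget-SVGD implementation from the thesis.

    import numpy as np

    def rbf_kernel(particles, bandwidth=1.0):
        """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradients with respect to x_j."""
        diffs = particles[:, None, :] - particles[None, :, :]   # diffs[j, i] = x_j - x_i
        sq_dists = np.sum(diffs ** 2, axis=-1)
        K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))
        grad_K = -diffs * K[..., None] / bandwidth ** 2          # grad_K[j, i] = d k(x_j, x_i) / d x_j
        return K, grad_K

    def svgd_unlearn_step(particles, grad_log_post, grad_log_lik_forget, step=0.05):
        """One SVGD update toward the posterior with the forgotten likelihood removed."""
        n = particles.shape[0]
        K, grad_K = rbf_kernel(particles)
        # Score of the "unlearned" target: posterior score minus forgotten-data score.
        score = grad_log_post(particles) - grad_log_lik_forget(particles)
        # Standard SVGD direction: kernel-smoothed score plus a repulsive term.
        phi = (K @ score + grad_K.sum(axis=0)) / n
        return particles + step * phi

    # Toy check: posterior N(0, 0.5 I) with the forgotten likelihood N(1, I) removed
    # leaves a target N(-1, I), so the particles should drift toward mean -1.
    grad_log_post = lambda x: -2.0 * x
    grad_log_lik_forget = lambda x: -(x - 1.0)

    particles = np.random.randn(50, 2)
    for _ in range(500):
        particles = svgd_unlearn_step(particles, grad_log_post, grad_log_lik_forget)
    print(particles.mean(axis=0))   # roughly [-1, -1]

In the federated setting considered in the thesis, the score terms would be assembled from quantities exchanged among agents rather than evaluated in closed form as in this toy example.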
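
The bandwidth-constrained protocols mentioned in the last paragraph of the abstract compress the per-particle updates that agents exchange. The sketch below illustrates one generic combination of sparsification and quantization, keeping only the largest-magnitude entries and quantizing them uniformly; the keep fraction, bit width, and function names are assumptions made for illustration and do not reproduce the specific scheme studied in the thesis.

    import numpy as np

    def compress_update(update, keep_fraction=0.1, num_bits=4):
        """Keep the largest-magnitude entries of an update, then quantize them uniformly."""
        flat = update.ravel()
        k = max(1, int(keep_fraction * flat.size))
        idx = np.argsort(np.abs(flat))[-k:]            # indices of the k largest entries
        values = flat[idx]
        levels = 2 ** (num_bits - 1) - 1               # e.g. 7 positive levels for 4 bits
        scale = np.max(np.abs(values)) + 1e-12
        q = np.round(values / scale * levels).astype(np.int8)
        return idx, q, scale                           # what would actually be transmitted

    def decompress_update(idx, q, scale, shape, num_bits=4):
        """Rebuild a dense update from the received sparse, quantized entries."""
        levels = 2 ** (num_bits - 1) - 1
        flat = np.zeros(int(np.prod(shape)))
        flat[idx] = q.astype(np.float64) / levels * scale
        return flat.reshape(shape)

    # Example: compress and reconstruct one particle's update for a 10x10 weight matrix.
    update = np.random.randn(10, 10)
    idx, q, scale = compress_update(update)
    recovered = decompress_update(idx, q, scale, update.shape)

Only the surviving indices, the quantized values, and the scale would need to be transmitted per particle, which is what reduces the per-iteration communication overhead.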
Advisors
Kang, Joonhyuk (강준혁)
Description
Korea Advanced Institute of Science and Technology (KAIST): School of Electrical Engineering
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2023
Identifier
325007
Language
eng
Description

Doctoral thesis (Ph.D.) - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2023.2, [v, 84 p.]

Keywords

Bayesian federated learning; Machine unlearning; Exponential family; Stein variational gradient descent; Wireless communication; Sparsification

URI
http://hdl.handle.net/10203/309063
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1030546&flag=dissertation
Appears in Collection
EE-Theses_Ph.D.(박사논문)
Files in This Item
There are no files associated with this item.
