Accelerating Federated Learning with Split Learning on Locally Generated Losses. Han, Dongjun; Moon, Jaekyun; Bhatti, Hasnain Irshad; Lee, Jungmoon. ICML 2021 Workshop on Federated Learning for User Privacy and Data Confidentiality, ICML Board, 2021-07-24.
Election Coding for Distributed Learning: Protecting SignSGD against Byzantine Attacks. Sohn, Jy Yong; Moon, Jaekyun; Han, Dongjun; Choi, Beongjun. 34th Conference on Neural Information Processing Systems (NeurIPS) 2020, Neural Information Processing Systems, 2020-12-08.
Few-Round Learning for Federated Learning. Park, Younghyun; Moon, Jaekyun; Han, Dongjun; Kim, Do-Yeon; Seo, Jun. Thirty-fifth Conference on Neural Information Processing Systems (NeurIPS), Neural Information Processing Systems Foundation, 2021-12-09.
Handling Both Stragglers and Adversaries for Robust Federated Learning. Park, Jungwuk; Moon, Jaekyun; Han, Dongjun; Choi, Minseok. ICML 2021 Workshop on Federated Learning for User Privacy and Data Confidentiality, ICML Board, 2021-07-24.
Hierarchical Broadcast Coding: Expediting Distributed Learning at the Wireless Edge. Han, Dongjun; Sohn, Jy Yong; Moon, Jaekyun. IEEE Transactions on Wireless Communications, vol. 20, no. 4, pp. 2266-2281, 2021-04.
Sageflow: Robust Federated Learning against Both Stragglers and Adversaries. Park, Jungwuk; Moon, Jaekyun; Han, Dongjun; Choi, Minseok. Thirty-fifth Conference on Neural Information Processing Systems (NeurIPS), Neural Information Processing Systems Foundation, 2021-12-08.