Authors: Xiaoxin Su, Yipeng Zhou, Laizhong Cui, Song Guo
Published on: February 06, 2024
Impact Score: 8.15
arXiv ID: arXiv:2402.03815
Summary
- What is new: FediAC, an algorithm that leverages programmable switches in federated learning for more efficient in-network aggregation of model updates.
- Why this is important: The scarce memory of programmable switches constrains efficient model aggregation in federated learning.
- What the research proposes: The FediAC algorithm, which comprises a client voting phase to identify significant model updates and a model aggregation phase that consumes less switch memory and communication traffic (see the sketch after this list).
- Results: FediAC surpasses the current state-of-the-art in model accuracy while reducing communication traffic.
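The two-phase design (client voting, then aggregation restricted to the agreed-upon coordinates) can be illustrated with a minimal Python sketch. The function names, the top-k voting rule, and the vote threshold below are illustrative assumptions rather than the authors' implementation; in the paper, the aggregation step runs on the programmable switch's registers, which is simulated here with plain NumPy sums.

```python
import numpy as np

def client_vote(update, k):
    """Client voting phase (assumed rule): each client marks the top-k
    coordinates of its local update by magnitude as significant."""
    top_idx = np.argsort(np.abs(update))[-k:]
    vote = np.zeros(update.shape[0], dtype=np.int32)
    vote[top_idx] = 1
    return vote

def select_coordinates(votes, threshold):
    """Keep only coordinates voted significant by at least `threshold` clients,
    so the switch only needs memory for this small index set."""
    tally = np.sum(votes, axis=0)
    return np.flatnonzero(tally >= threshold)

def aggregate(updates, selected_idx):
    """Aggregation phase restricted to the selected coordinates
    (a stand-in for the in-switch aggregation described in the paper)."""
    return np.sum(updates[:, selected_idx], axis=0)

# Toy run: 4 clients, a 10-dimensional model, each client votes for its top 3 coordinates.
rng = np.random.default_rng(0)
updates = rng.normal(size=(4, 10)).astype(np.float32)
votes = np.stack([client_vote(u, k=3) for u in updates])
selected = select_coordinates(votes, threshold=2)
print("selected coordinates:", selected)
print("aggregated values:", aggregate(updates, selected))
```

Because only the jointly voted coordinates are aggregated, both the switch memory footprint and the per-round communication traffic shrink, which is the trade-off the summary highlights.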
Technical Details
Technological frameworks used: Federated Learning, Programmable Switch
Models used: FediAC algorithm
Data used: Public datasets
Potential Impact
Data privacy solutions, Cloud computing and Data Centers, Internet Service Providers