Authors: Yasser H. Khalil, Amir H. Estiri, Mahdi Beitollahi, Nader Asadi, Sobhan Hemati, Xu Li, Guojun Zhang, Xi Chen
Published on: February 02, 2024
Impact Score: 8.07
arXiv ID: arXiv:2402.01863
Summary
- What is new: A Decentralized Federated Mutual Learning (DFML) framework that supports heterogeneity in models and data without relying on centralized servers or public data.
- Why this is important: Centralized servers in Federated Learning create communication bottlenecks and a single point of failure, while real-world devices are heterogeneous in both model architecture and data distribution.
- What the research proposes: DFML, a serverless framework that supports nonrestrictive heterogeneous models and handles both model and data heterogeneity through mutual learning, without requiring public data (a sketch of the mutual-learning step appears after this list).
- Results: DFML delivers substantial gains in convergence speed and global accuracy, outperforming baselines by +17.20% and +19.95% in global accuracy on CIFAR-100 with 50 clients under IID and non-IID data shifts, respectively.
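The mutual-learning component can be pictured as mutual knowledge distillation among peers: each model trains on a private batch while also matching the softened predictions of the other models, which is what removes the dependence on public distillation data. Below is a minimal PyTorch sketch of one such step; the function name, the all-pairs peer loop, and the hyperparameters (temperature, alpha) are illustrative assumptions, not the paper's exact protocol.

```python
import torch
import torch.nn.functional as F

def mutual_learning_step(models, optimizers, x, y, temperature=3.0, alpha=0.5):
    """One mutual-learning step on a private batch (x, y).

    Each model minimizes cross-entropy on the true labels plus a KL term
    pulling its softened predictions toward those of its peers. Only
    output distributions are exchanged, so the peer models may have
    entirely different architectures.
    """
    logits = [m(x) for m in models]  # forward pass for every peer
    for i, opt in enumerate(optimizers):
        # Supervised loss on the local, private labels.
        ce = F.cross_entropy(logits[i], y)

        # Distillation loss: match each peer's softened distribution.
        log_p_self = F.log_softmax(logits[i] / temperature, dim=1)
        kl = logits[i].new_zeros(())
        for j in range(len(models)):
            if j == i:
                continue
            p_peer = F.softmax(logits[j].detach() / temperature, dim=1)
            kl = kl + F.kl_div(log_p_self, p_peer, reduction="batchmean")
        kl = kl / max(len(models) - 1, 1)

        # The T^2 factor keeps gradient magnitudes comparable across temperatures.
        loss = alpha * ce + (1.0 - alpha) * (temperature ** 2) * kl
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because only output distributions are matched, heterogeneous architectures can still teach one another, which is how DFML accommodates nonrestrictive model heterogeneity.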
Technical Details
Technological frameworks used: Decentralized Federated Mutual Learning (DFML); see the serverless round sketch after this list.
Models used: Not specified
Data used: CIFAR-100 dataset
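To illustrate the serverless aspect, one plausible round structure rotates the coordination role among clients instead of relying on a fixed server. The sketch below reuses mutual_learning_step from above and assumes a hypothetical client API (.model, .optimizer, .local_batches()); it illustrates the decentralized pattern, not DFML's exact aggregation schedule.

```python
import random

def serverless_round(clients, rng=random):
    """One decentralized round: a randomly chosen client coordinates,
    so no fixed central server is needed. Each element of `clients`
    exposes .model, .optimizer, and .local_batches() (hypothetical API)."""
    coordinator = rng.choice(clients)
    peers = [c for c in clients if c is not coordinator]
    models = [coordinator.model] + [c.model for c in peers]
    optimizers = [coordinator.optimizer] + [c.optimizer for c in peers]
    # Mutual learning runs on the coordinator's private data only,
    # so no public dataset ever needs to be shared.
    for x, y in coordinator.local_batches():
        mutual_learning_step(models, optimizers, x, y)
```

Rotating the coordinator removes the single point of failure that a central server introduces, at the cost of extra peer-to-peer model transfers each round.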
Potential Impact
Companies specialising in decentralized computing, IoT devices, and mobile networks could benefit. Centralized cloud services might face disruption.
Want to implement this idea in a business?
We have generated a startup concept here: SynapSync.