ProxyFL: Decentralized Federated Learning through Proxy Model Sharing
Institutions in highly regulated domains such as finance and healthcare often
have restrictive rules around data sharing. Federated learning is a distributed
learning framework that enables multi-institutional collaborations on
decentralized data with improved protection for each collaborator's data
privacy. In this paper, we propose a communication-efficient scheme for
decentralized federated learning called ProxyFL, or proxy-based federated
learning. Each participant in ProxyFL maintains two models: a private model
and a publicly shared proxy model designed to protect the participant's
privacy. Proxy models allow efficient information exchange among participants
using the PushSum method without the need for a centralized server. The proposed
method eliminates a significant limitation of canonical federated learning by
allowing model heterogeneity; each participant can have a private model with
any architecture. Furthermore, our protocol for communication by proxy leads to
stronger privacy guarantees, as shown by differential privacy analysis. Experiments
on popular image datasets and on a pan-cancer diagnostic problem using over 30,000
high-quality gigapixel histology whole slide images, show that ProxyFL can
outperform existing alternatives with much less communication overhead and
stronger privacy.
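For intuition, below is a minimal sketch, not taken from the paper, of the server-free
PushSum averaging that proxy models enable. The ring topology, flat parameter vectors,
and all variable names are assumptions made purely for illustration.

import numpy as np

# Illustrative PushSum sketch (assumed, not the paper's implementation):
# each of n participants holds a parameter vector x[i] (its proxy model)
# and a PushSum weight w[i]. Every round, each participant splits its pair
# (x[i], w[i]) evenly between itself and its out-neighbours. The de-biased
# ratio x[i] / w[i] converges to the average of the initial parameter
# vectors, with no central server involved.

n, dim, rounds = 4, 3, 60
rng = np.random.default_rng(0)

x = rng.normal(size=(n, dim))          # each row: one proxy model's parameters
w = np.ones(n)                         # PushSum weights
target = x.mean(axis=0)                # network-wide average (ground truth)

out_neighbours = [[(i + 1) % n] for i in range(n)]  # assumed ring topology

for _ in range(rounds):
    x_next = np.zeros_like(x)
    w_next = np.zeros_like(w)
    for i, nbrs in enumerate(out_neighbours):
        share = 1.0 / (len(nbrs) + 1)  # split among self plus out-neighbours
        x_next[i] += share * x[i]
        w_next[i] += share * w[i]
        for j in nbrs:
            x_next[j] += share * x[i]
            w_next[j] += share * w[i]
    x, w = x_next, w_next

estimates = x / w[:, None]             # each participant's de-biased estimate
print(np.allclose(estimates, target))  # True once the gossip has mixed

In ProxyFL itself the exchanged objects are full proxy model parameters trained with
differential privacy; this toy version only demonstrates the decentralized averaging
dynamic that replaces a central server.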
Authors
Shivam Kalra, Junfeng Wen, Jesse C. Cresswell, Maksims Volkovs, Hamid R. Tizhoosh