Fedstellar: A Platform for Training Models in a Privacy-preserving and Decentralized Fashion

Enrique Tomás Martínez Beltrán, Pedro Miguel Sánchez Sánchez, Sergio López Bernal, Gérôme Bovet, Manuel Gil Pérez, Gregorio Martínez Pérez, Alberto Huertas Celdrán

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI 2023), Demo Track, pages 7154–7157. https://doi.org/10.24963/ijcai.2023/838

This paper presents Fedstellar, a platform for training decentralized Federated Learning (FL) models over heterogeneous topologies that vary in the number of federation participants and their connections. Fedstellar allows users to build custom topologies, enabling them to control the aggregation of model parameters in a decentralized manner. The platform offers a Web application for creating, managing, and connecting nodes while preserving data privacy, and it provides tools to measure, monitor, and analyze node performance. The paper describes the functionalities of Fedstellar and its potential applications. To demonstrate the applicability of the platform, different use cases are presented in which decentralized, semi-decentralized, and centralized architectures are compared in terms of model performance, convergence time, and network overhead when collaboratively classifying handwritten digits using the MNIST dataset.
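To illustrate the kind of decentralized aggregation over a custom topology that the abstract describes, the following is a minimal conceptual sketch. It is not Fedstellar's API: the node names, the adjacency-list topology, and the local_update() placeholder are assumptions made purely for illustration, and the aggregation shown is a simple neighbor-averaging (gossip-style) scheme rather than the platform's actual implementation.

```python
# Conceptual sketch of decentralized FL aggregation over a custom topology.
# NOT Fedstellar's API: node names, the topology dict, and local_update()
# are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(0)

# Custom topology as an adjacency list (here, an undirected ring of four nodes).
topology = {
    "node0": ["node1", "node3"],
    "node1": ["node0", "node2"],
    "node2": ["node1", "node3"],
    "node3": ["node2", "node0"],
}

# Each node holds its own model parameters (here, a flat weight vector).
params = {node: rng.normal(size=10) for node in topology}


def local_update(w: np.ndarray) -> np.ndarray:
    """Placeholder for local training on a node's private data."""
    return w - 0.1 * rng.normal(size=w.shape)


def decentralized_round(params, topology):
    """One round: each node trains locally, then averages with its neighbors."""
    trained = {n: local_update(w) for n, w in params.items()}
    aggregated = {}
    for node, neighbors in topology.items():
        # Decentralized aggregation: no central server; each node mixes its
        # own parameters with those received from its direct neighbors.
        stacked = np.stack([trained[node]] + [trained[m] for m in neighbors])
        aggregated[node] = stacked.mean(axis=0)
    return aggregated


for _ in range(5):
    params = decentralized_round(params, topology)

# Report how far each node's model is from node0's after the rounds;
# averaging over a connected topology keeps the models coupled.
spread = max(np.linalg.norm(params[n] - params["node0"]) for n in params)
print(f"parameter spread after 5 rounds: {spread:.4f}")
```

Changing the topology dict (e.g., a fully connected graph versus a sparse ring) changes how quickly the nodes' models mix, which is the kind of trade-off the use cases in the paper examine when comparing decentralized, semi-decentralized, and centralized architectures.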
Keywords:
Machine Learning: ML: Federated learning
Machine Learning: ML: Applications
Machine Learning: ML: Classification
Machine Learning: ML: Evaluation