
Migrating Models: A Decentralized View on Federated Learning

Kiss, Péter; Horváth, Tomáš


Abstract

Federated learning (FL) research attempts to alleviate the increasing difficulty of training machine learning models when the training data is generated in a massively distributed way. The key idea behind these methods is to move training to the locations where the data is generated and to periodically collect and redistribute the model updates. We present our approach for transforming the general FL training algorithm into a peer-to-peer-like process. Our experiments on baseline image classification datasets show that omitting central coordination in FL is feasible.
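
The abstract describes the standard FL loop, local training at the data sources with periodic collection and redistribution of model updates, and its peer-to-peer counterpart in which peers exchange models directly rather than through a central coordinator. The sketch below is only a rough illustration of that general idea, not the algorithm presented in the paper: peers fit a simple least-squares model on their own data and periodically average parameters with ring neighbours. All names (Peer, gossip_round, the ring topology) are hypothetical.

```python
# Minimal illustrative sketch of peer-to-peer model averaging (gossip-style).
# NOT the authors' algorithm; a linear model keeps the example self-contained.
import numpy as np

rng = np.random.default_rng(0)

class Peer:
    def __init__(self, X, y, dim, lr=0.1):
        self.X, self.y = X, y      # local (private) data shard
        self.w = np.zeros(dim)     # local model parameters
        self.lr = lr

    def local_step(self):
        # One gradient step of least-squares on the local data only.
        grad = self.X.T @ (self.X @ self.w - self.y) / len(self.y)
        self.w -= self.lr * grad

def gossip_round(peers, topology):
    """Each peer averages its parameters with its neighbours' parameters:
    periodic collection/redistribution without a central server."""
    snapshot = [p.w.copy() for p in peers]
    for i, p in enumerate(peers):
        p.w = np.mean([snapshot[i]] + [snapshot[j] for j in topology[i]], axis=0)

# Synthetic setup: 4 peers, each holding a shard generated by the same model.
dim, true_w = 5, rng.normal(size=5)
peers = []
for _ in range(4):
    X = rng.normal(size=(50, dim))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    peers.append(Peer(X, y, dim))

ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}  # ring topology

for t in range(200):
    for p in peers:
        p.local_step()             # local training at the data source
    if t % 5 == 0:
        gossip_round(peers, ring)  # periodic peer-to-peer averaging

print("max parameter error:", max(np.linalg.norm(p.w - true_w) for p in peers))
```

In this toy setup the peers' parameters converge toward a common model despite each peer only ever seeing its own shard and its neighbours' parameters, which is the behaviour a decentralized FL scheme aims for.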

Citation

Kiss, P., & Horváth, T. (2021, September). Migrating Models: A Decentralized View on Federated Learning. Presented at ECML PKDD 2021, Online

Presentation Conference Type: Conference Paper (Published)
Conference Name: ECML PKDD 2021
Start Date: Sep 13, 2021
End Date: Sep 17, 2021
Online Publication Date: Feb 17, 2022
Publication Date: 2021
Deposit Date: Apr 8, 2024
Publisher: Springer
Pages: 177-191
Series Title: Communications in Computer and Information Science
Series Number: 1524
Series ISSN: 1865-0929
Book Title: Machine Learning and Principles and Practice of Knowledge Discovery in Databases: International Workshops of ECML PKDD 2021, Virtual Event, September 13-17, 2021, Proceedings, Part I
ISBN: 9783030937355
DOI: https://doi.org/10.1007/978-3-030-93736-2_15
Keywords: Federated learning, Peer-to-peer, Neural networks
Public URL: http://researchrepository.napier.ac.uk/Output/3587425
Related Public URLs: https://ecmlpkdd.org/2021/