11e Journée Statistique et Informatique pour la Science des Données à Paris-Saclay


Organizer(s) Avetik Karagulyan, Erwan Le Pennec
Date(s) 03/04/2026
linked URL https://indico.math.cnrs.fr/event/16080/

Convergence and Linear Speed-Up in Stochastic Federated Learning

By Paul Mangold

In federated learning, multiple users collaboratively train a machine learning model without sharing local data. To reduce communication, users perform multiple local stochastic gradient steps that are then aggregated by a central server. However, due to data heterogeneity, local training introduces bias. In this talk, I will present a novel interpretation of the Federated Averaging algorithm, establishing its convergence to a stationary distribution. By analyzing this distribution, we show that the bias consists of two components: one due to heterogeneity and another due to gradient stochasticity. I will then extend this analysis to the Scaffold algorithm, demonstrating that it effectively mitigates heterogeneity bias but not stochasticity bias. Finally, we show that both algorithms achieve linear speed-up in the number of agents, a key property in federated stochastic optimization.
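The local-update-then-average structure described in the abstract can be sketched as follows. This is a minimal illustrative toy, not the speaker's implementation: each client is given a simple quadratic objective whose minimizer differs across clients (modeling heterogeneity), and Gaussian noise is added to the gradients (modeling stochasticity). All names and parameter values here are assumptions for illustration.

```python
import numpy as np

# Toy setup: client i holds f_i(x) = 0.5 * ||x - c_i||^2, so heterogeneity
# is controlled by how far apart the client optima c_i are.
rng = np.random.default_rng(0)
n_clients, dim = 10, 5
optima = rng.normal(size=(n_clients, dim))  # client-specific minimizers c_i

def local_grad(x, i):
    """Stochastic gradient of f_i at x; the noise term models
    gradient stochasticity."""
    return (x - optima[i]) + 0.1 * rng.normal(size=dim)

def fedavg(rounds=200, local_steps=5, lr=0.1):
    """Federated Averaging: each round, every client starts from the
    server iterate, runs `local_steps` local SGD steps, and the server
    averages the resulting local models."""
    x = np.zeros(dim)
    for _ in range(rounds):
        updates = []
        for i in range(n_clients):
            x_i = x.copy()
            for _ in range(local_steps):
                x_i -= lr * local_grad(x_i, i)
            updates.append(x_i)
        x = np.mean(updates, axis=0)  # server aggregation step
    return x

# For identical quadratic Hessians the global optimum is the mean of the
# c_i; FedAvg should end up near it, up to the bias terms the talk analyzes.
x_final = fedavg()
print(np.linalg.norm(x_final - optima.mean(axis=0)))
```

Averaging over more clients reduces the stochasticity term in this toy, which is the intuition behind the linear speed-up in the number of agents mentioned in the abstract.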

Information about the video

  • Date of recording 03/04/2026
  • Date of publication 13/04/2026
  • Institution IHES
  • Language English
  • Audience Researchers
  • Format MP4




