In this work, we develop new methods for Bayesian computation in a federated learning (FL) context. While a variety of distributed MCMC algorithms exist, few have been designed to address the specific constraints of FL, such as data privacy, communication bottlenecks, and statistical heterogeneity. To tackle these issues, we propose SALaD, an MCMC algorithm that combines the ideas of Stochastic Gradient Langevin Dynamics (SGLD) and Federated Averaging. In each round, each client runs SGLD to update its local parameter, which is then sent to a central server; the server in turn sends the average of the local parameters back to the clients. However, this method may suffer from the high variance of the stochastic gradients used by local SGLD and from the heterogeneity of the data, both of which hinder or slow down convergence. To address these issues, we propose three alternatives based on a combination of control variates and bias-reduction techniques, for which theoretical improvements are derived. We illustrate our findings on several FL benchmarks for Bayesian inference.
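The round structure described above (local SGLD updates on each client, followed by server-side averaging in the style of Federated Averaging) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm or code: the model (Bayesian linear regression with a standard normal prior), the function names, and all step-size and round-count parameters are assumptions chosen for the example.

```python
# Hypothetical sketch of SALaD-style rounds: clients run local SGLD,
# the server averages the local parameters (a FedAvg-style aggregation).
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    """One SGLD update: gradient ascent on the log-posterior plus injected
    Gaussian noise with variance 2 * step_size."""
    noise = rng.normal(scale=np.sqrt(2.0 * step_size), size=theta.shape)
    return theta + step_size * grad_log_post(theta) + noise

def run_round(theta_global, client_data, step_size, local_steps, rng):
    """One communication round: each client starts from the server parameter,
    runs local SGLD on its own data, and the server averages the results."""
    local_params = []
    for X, y in client_data:
        theta = theta_global.copy()
        # Gradient of the local log-posterior for linear regression with a
        # standard normal prior (illustrative model, unit noise variance).
        grad = lambda t: X.T @ (y - X @ t) - t
        for _ in range(local_steps):
            theta = sgld_step(theta, grad, step_size, rng)
        local_params.append(theta)
    return np.mean(local_params, axis=0)  # server-side averaging

# Toy usage: two clients holding synthetic regression data.
rng = np.random.default_rng(1)
true_theta = np.array([1.0, -2.0])
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    y = X @ true_theta + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

theta = np.zeros(2)
for _ in range(200):
    theta = run_round(theta, clients, step_size=1e-3, local_steps=5, rng=rng)
```

After a few hundred rounds the averaged parameter settles near the posterior mode. The sketch also makes the abstract's caveat concrete: the injected noise and any mini-batch gradient noise give the local chains high variance, and if the clients' datasets were drawn from different distributions, the averaged parameter would be pulled between heterogeneous local posteriors — exactly the issues the proposed control-variate and bias-reduction variants target.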
Eric Moulines earned a degree in engineering (1984) from the Ecole Polytechnique, Paris, and a doctorate in electrical engineering (1990) from the Ecole Nationale Supérieure des Télécommunications. In 1990, he joined the Signal and Image Processing Department at Télécom ParisTech, where he was appointed full professor in 1996. In 2015, he joined the Centre for Applied Mathematics at Ecole Polytechnique, where he is currently Professor of Statistics.
His areas of expertise include computational statistics (Monte Carlo simulations, stochastic optimization), probabilistic machine learning, statistical signal processing and time series analysis (sequential Monte Carlo, non-linear filtering).
His current research topics include high-dimensional Monte Carlo sampling, stochastic optimization, and generative models (variational autoencoders, generative adversarial networks). He applies these different methods to uncertainty quantification, Bayesian inverse problems, and the control of complex systems.
He has published more than 120 articles in leading journals in signal processing, computational statistics, and applied probability, and more than 300 proceedings papers at major signal processing and machine learning conferences. In 1997 and 2006, he received the Best Paper Award from the IEEE Signal Processing Society (for publications in the IEEE Transactions on Signal Processing). He has served on the editorial boards of the IEEE Transactions on Signal Processing, Signal Processing, Stochastic Processes and their Applications, the Journal of Statistical Planning and Inference, and the Electronic Journal of Statistics. From 2013 to 2016, he was Editor-in-Chief of Bernoulli.
E. Moulines is a Fellow of EURASIP and, since 2016, of the IMS. He was awarded the Silver Medal of the Centre National de la Recherche Scientifique in 2010 and the Orange Prize of the French Academy of Sciences in 2011. In 2017, he was elected to the French Academy of Sciences, and in 2020 he received the Technical Achievement Award from EURASIP.