ActPerFL: Active personalized federated learning

Huili Chen & Jie Ding & Eric Tramel & Shuang Wu & Anit Kumar Sahu & Salman Avestimehr & Tao Zhang

Abstract

In the context of personalized federated learning (FL), the critical challenge is to balance local model improvement and global model tuning when the personal and global objectives may not be exactly aligned. Inspired by Bayesian hierarchical models, we develop ActPerFL, a self-aware personalized FL method where each client can automatically balance the training of its local personal model and the global model that implicitly contributes to other clients' training. This balance is derived from inter-client and intra-client uncertainty quantification. Consequently, ActPerFL can adapt to the underlying clients' heterogeneity with uncertainty-driven local training and model aggregation. With experimental studies on Sent140 and Amazon Alexa audio data, we show that ActPerFL achieves superior personalization performance compared with existing counterparts.
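To make the uncertainty-driven balance concrete, the following is a minimal sketch (not the paper's implementation) of one common Bayesian-hierarchical-style instantiation: clients report a local model estimate together with an intra-client variance, the server aggregates with inverse-variance weights, and each client blends the global model back into its personal model in proportion to its own uncertainty. The function names `aggregate_global` and `personalize` and the toy data are illustrative assumptions, not from the paper.

```python
import numpy as np


def aggregate_global(local_means, local_vars):
    """Inverse-variance weighted average of client models (noisier clients count less)."""
    weights = 1.0 / np.asarray(local_vars)
    weights = weights / weights.sum()
    return np.average(np.asarray(local_means), axis=0, weights=weights)


def personalize(local_mean, local_var, global_mean, global_var):
    """Blend local and global models; a more uncertain client leans on the global model."""
    alpha = global_var / (global_var + local_var)  # in [0, 1]; large local_var -> small alpha
    return alpha * local_mean + (1.0 - alpha) * global_mean


# Toy usage: three clients with scalar "models" and differing uncertainty.
local_means = [np.array([1.0]), np.array([1.2]), np.array([3.0])]
local_vars = [0.1, 0.2, 2.0]  # the third client is noisy and gets down-weighted
g = aggregate_global(local_means, local_vars)
p = [personalize(m, v, g, global_var=0.5) for m, v in zip(local_means, local_vars)]
print("global:", g, "personalized:", p)
```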
