[ICML 2025] Federated Learning for Feature Generalization with Convex Constraints - Dongwon Kim, Donghee Kim, Sung Kuk Shyn, Kwangsu Kim
- AI Convergence Research Lab
- 2025-06-09
Title:
Federated Learning for Feature Generalization with Convex Constraints
Author:
Dongwon Kim,
Donghee Kim,
Sung Kuk Shyn,
Kwangsu Kim
Abstract
Federated learning (FL) often struggles with generalization due to heterogeneous client data. Local models are prone to overfitting their local data distributions, and even transferable features can be distorted during aggregation. To address these challenges, we propose FedCONST, an approach that adaptively modulates update magnitudes based on the global model's parameter strength. This prevents over-emphasizing well-learned parameters while reinforcing underdeveloped ones. Specifically, FedCONST employs linear convex constraints to ensure training stability and preserve locally learned generalization capabilities during aggregation. A Gradient Signal-to-Noise Ratio (GSNR) analysis further validates FedCONST's effectiveness in enhancing feature transferability and robustness. As a result, FedCONST effectively aligns local and global objectives, mitigating overfitting and promoting stronger generalization across diverse FL environments, achieving state-of-the-art performance.
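
The abstract describes the core idea only at a high level, so the following is a minimal sketch of what "modulating update magnitudes based on the global model's parameter strength" could look like; the function name `scale_update`, the strength proxy, and the weighting rule are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal sketch (assumed, not the paper's implementation): scale a client's
# update per parameter so that already-strong global parameters receive
# smaller effective updates and weak ones receive larger ones.
import numpy as np

def scale_update(global_params: np.ndarray, client_update: np.ndarray,
                 eps: float = 1e-8) -> np.ndarray:
    """Damp updates to strong global parameters, preserve updates to weak ones."""
    strength = np.abs(global_params)                           # proxy for how well-learned a parameter is
    weight = 1.0 / (1.0 + strength / (strength.mean() + eps))  # in (0, 1]; smaller for stronger parameters
    return weight * client_update

# Toy usage: the strong parameter (5.0) gets a much smaller effective update
# than the weak one (0.1), so no single direction dominates aggregation.
g = np.array([5.0, 0.1])
u = np.array([1.0, 1.0])
print(scale_update(g, u))
```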



