PRISM: Privacy-Preserving Improved Stochastic Masking for Federated Generative Models

¹Ulsan National Institute of Science & Technology
²Department of Computer Science and Engineering, Yonsei University

*Indicates Co-corresponding author

Abstract

Despite recent advancements in federated learning (FL), the integration of generative models into FL has been limited due to challenges such as high communication costs and unstable training in heterogeneous data environments. To address these issues, we propose PRISM, an FL framework tailored for generative models that ensures (i) stable performance under heterogeneous data distributions and (ii) resource efficiency in terms of communication cost and final model size. The key idea of our method is to search for an optimal stochastic binary mask over a random network rather than updating the model weights, identifying a sparse subnetwork with high generative performance; i.e., a “strong lottery ticket”. By communicating binary masks in a stochastic manner, PRISM minimizes communication overhead. This approach, combined with a maximum mean discrepancy (MMD) loss and a mask-aware dynamic moving average aggregation method (MADA) on the server side, facilitates stable and strong generative capabilities by mitigating local divergence in FL scenarios. Moreover, thanks to its sparsifying characteristic, PRISM yields a lightweight model without extra pruning or quantization, making it ideal for environments such as edge devices. Experiments on MNIST, FMNIST, CelebA, and CIFAR10 demonstrate that PRISM outperforms existing methods while maintaining privacy with minimal communication costs. PRISM is the first to successfully generate images under challenging non-IID and privacy-preserving FL environments on complex datasets, where previous methods have struggled. Our code is available at the PRISM repository.
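The core mechanism can be illustrated with a small sketch (illustrative names and shapes, not the authors' implementation): each weight of a frozen random network gets a learnable score, a binary mask is Bernoulli-sampled from the sigmoid of that score, and only the 1-bit-per-parameter mask needs to leave the client.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_binary_mask(scores):
    """Sample a stochastic binary mask: each weight is kept with
    probability sigmoid(score). Only the binary mask (1 bit per
    parameter) is communicated, never the float weights."""
    probs = 1.0 / (1.0 + np.exp(-scores))      # Bernoulli keep-probabilities
    return (rng.random(scores.shape) < probs).astype(np.uint8)

# Frozen random weights (never updated) and learnable mask scores.
weights = rng.standard_normal((4, 4))
scores = np.zeros((4, 4))                      # keep-probability 0.5 everywhere at init

mask = sample_binary_mask(scores)
subnetwork = weights * mask                    # effective sparse "lottery ticket" subnetwork

# Communication saving: 1 bit per parameter vs. 32 bits for a float32 weight.
bits_mask = mask.size * 1
bits_weights = weights.size * 32
print(bits_weights // bits_mask)               # 32x fewer bits per round
```

In the full method the scores are trained (e.g., against an MMD objective) while the underlying weights stay fixed, so the final artifact is just a mask over a seed-reproducible random network.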

Experimental results

Comparison of FID, communication cost, and storage


Non-IID and DP case

Quantitative results of baselines and PRISM with privacy budget $(\epsilon, \delta)=(9.8, 10^{-5})$

Non-IID and No-DP case

Quantitative results of baselines and PRISM in the non-IID setting without differential privacy

Local divergence \( \Delta_t \) and FID values
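As a rough illustration of how a mask-aware moving average can damp the local divergence \( \Delta_t \) shown above, the sketch below blends the aggregated client masks into the global keep-probabilities with a weight that shrinks as divergence grows. The blending rule here is illustrative; PRISM's exact MADA schedule is defined in the paper.

```python
import numpy as np

def mada_aggregate(global_theta, client_masks):
    """Mask-aware dynamic moving average (rough sketch).
    global_theta: current global keep-probabilities in [0, 1].
    client_masks: binary masks received from clients this round.
    The interpolation weight shrinks as the divergence between the
    aggregated client masks and the global state grows, damping the
    update when clients disagree with the server."""
    avg_mask = np.mean(client_masks, axis=0)             # fraction of clients keeping each weight
    divergence = np.abs(avg_mask - global_theta).mean()  # Delta_t-style distance in [0, 1]
    lam = 1.0 - divergence                               # more divergence -> smaller step
    new_theta = lam * avg_mask + (1.0 - lam) * global_theta
    return new_theta, divergence

theta = np.full((4,), 0.5)                               # global keep-probabilities
masks = [np.array([1, 0, 1, 1]), np.array([1, 1, 0, 1])] # two clients' binary masks
theta_new, delta_t = mada_aggregate(theta, masks)
```

With these toy inputs the averaged mask is `[1, 0.5, 0.5, 1]`, giving \( \Delta_t = 0.25 \), so only 75% of the step toward the client average is taken.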

BibTeX


@article{seo2025prism,
  title={PRISM: Privacy-Preserving Improved Stochastic Masking for Federated Generative Models},
  author={Seo, Kyeongkook and Han, Dong-Jun and Yoo, Jaejun},
  journal={arXiv preprint arXiv:2503.08085},
  year={2025}
}