How to Aggregate and Not Influence Models - Controlling Client Influence in Dynamic Federated Learning

Abstract


Distributed systems have proven essential for machine learning, especially in scenarios with many connected devices, such as the Internet of Things (IoT) and Smart Cities. However, device availability is crucial for effective training, and it cannot be guaranteed throughout the training process due to limitations such as battery life, bandwidth, or compliance requirements. To address these challenges, we propose FedPIPC, an aggregation method that uses client participation as an influencing factor on the global model, combined with a knowledge preservation mechanism that mitigates the impact of participation fluctuations during federated training. Empirical results show that the proposed solution improves the balance between accuracy and data transmission by up to 50% and reduces transmission volume by up to 89%, demonstrating its effectiveness in dynamic scenarios.

Keywords: Big data and machine learning applications for computer networks and distributed systems, Performance, scalability, and reliability, Distributed Algorithms, Distributed and Networked Applications
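The abstract describes two ingredients: weighting each client's contribution to the global model by its participation, and retaining part of the previous global model to preserve knowledge when participation fluctuates. The paper's actual equations are not reproduced here, so the following is only a rough sketch of that general idea; every name in it (`aggregate`, `rounds_seen`, `keep`, and the blending scheme itself) is a hypothetical illustration, not FedPIPC's definitive implementation.

```python
import numpy as np

def aggregate(global_weights, updates, rounds_seen, total_rounds, keep=0.3):
    """Participation-weighted aggregation with a simple knowledge-preservation blend.

    global_weights: list of np.ndarray layers of the current global model.
    updates: list of (client_id, weights_list, n_samples) from this round's clients.
    rounds_seen: dict client_id -> number of rounds that client has participated in.
    keep: fraction of the previous global model retained (knowledge preservation).
    """
    # Each client's coefficient combines its data volume with how often
    # it has participated so far (a proxy for how much the global model
    # should trust its update).
    coeffs = []
    for cid, _, n in updates:
        participation = rounds_seen[cid] / max(total_rounds, 1)
        coeffs.append(n * participation)
    coeffs = np.asarray(coeffs, dtype=float)
    coeffs /= coeffs.sum()

    # Weighted average of the client models, layer by layer.
    new_w = [np.zeros_like(layer) for layer in global_weights]
    for c, (_, w, _) in zip(coeffs, updates):
        for i, layer in enumerate(w):
            new_w[i] += c * layer

    # Knowledge preservation: blend with the previous global model so that
    # a round dominated by a few clients cannot erase earlier knowledge.
    return [keep * g + (1 - keep) * nw for g, nw in zip(global_weights, new_w)]
```

With `keep=0.0` this reduces to a participation-weighted variant of FedAvg; larger values of `keep` dampen the influence of any single round, which is one plausible way to realize the stabilizing effect the abstract attributes to the knowledge preservation mechanism.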

Published
2025-05-19
JARCZEWSKI, Rafael O.; CERQUEIRA, Eduardo; BITTENCOURT, Luiz F.; LOUREIRO, Antonio A. F.; VILLAS, Leandro A.; DE SOUZA, Allan M. How to Aggregate and Not Influence Models - Controlling Client Influence in Dynamic Federated Learning. In: BRAZILIAN SYMPOSIUM ON COMPUTER NETWORKS AND DISTRIBUTED SYSTEMS (SBRC), 43., 2025, Natal/RN. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 420-433. ISSN 2177-9384. DOI: https://doi.org/10.5753/sbrc.2025.6189.
