Federated Learning with AutoKeras and Knowledge Distillation
Abstract
This article presents the AFP-KD-AutoML technique, whose goal is to reduce the training and execution time of models in Federated Learning. The technique uses Knowledge Distillation to transfer information between clients and the server, and the AutoKeras tool to search for artificial neural network architectures.
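The paper itself provides no code; as a rough illustration of the knowledge-distillation component mentioned in the abstract, the sketch below shows a temperature-scaled distillation loss of the kind a client could minimize against soft targets sent by the server. Function names, the temperature value, and the direction of the KL divergence are assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as is conventional in knowledge distillation
    p = softmax(teacher_logits, T)  # soft targets (e.g. from the server model)
    q = softmax(student_logits, T)  # predictions of the local client model
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

When the student reproduces the teacher's logits exactly, the loss is zero; any mismatch yields a positive penalty that the client can backpropagate alongside its ordinary supervised loss.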
Published
2024-04-24
How to Cite
MEYER, Bruno H.; POZO, Aurora; NOGUEIRA, Michele; ZOLA, Wagner M. Nunan.
Federated Learning with AutoKeras and Knowledge Distillation. In: REGIONAL SCHOOL OF HIGH PERFORMANCE COMPUTING FROM SOUTHERN BRAZIL (ERAD-RS), 24. , 2024, Florianópolis/SC.
Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2024. p. 135-136.
ISSN 2595-4164.
DOI: https://doi.org/10.5753/eradrs.2024.238583.