The Systemic Causes behind Gender Bias in AI: A Systematic Literature Mapping
Abstract
Artificial Intelligence (AI) has been transforming people's daily lives, influencing critical decisions in areas such as healthcare, security, and the labor market. However, its use raises ethical and social concerns, especially regarding the biases present in AI systems. This work addresses gender bias specifically, investigating the systemic causes behind this problem (data, algorithmic bias, and user behavior) through a Systematic Literature Mapping (SLM). The research analyzed 176 studies retrieved from the IEEE Xplore, ACM Digital Library, and Scopus databases, of which 12 were considered relevant to the objective of this work. The selection was guided by inclusion and exclusion criteria. The analysis identified that most of the selected articles address the use of biased data as a cause of gender bias, followed by algorithmic bias, found in half of the studies. User behavior remains little explored, appearing as the main focus of only one study.
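To make the categorization step of the mapping tangible, the minimal Python sketch below shows one way the selected studies could be tagged with the systemic causes they address (biased data, algorithmic bias, user behavior) and then tallied. The study identifiers and cause tags are hypothetical placeholders for illustration only; they do not reproduce the paper's actual classification of its 12 studies.

    # Minimal sketch of the tallying step in a systematic mapping:
    # each selected study is tagged with the systemic causes it addresses.
    # The records below are placeholders, NOT the paper's classification.
    from collections import Counter

    CAUSES = ("biased_data", "algorithmic_bias", "user_behavior")

    # Hypothetical tagging: one record per selected study.
    selected_studies = [
        {"id": "S01", "causes": {"biased_data"}},
        {"id": "S02", "causes": {"biased_data", "algorithmic_bias"}},
        {"id": "S03", "causes": {"algorithmic_bias"}},
        {"id": "S04", "causes": {"biased_data", "user_behavior"}},
        # ... remaining studies would follow the same structure
    ]

    def tally_causes(studies):
        """Count how many studies address each systemic cause."""
        counts = Counter()
        for study in studies:
            for cause in study["causes"]:
                if cause in CAUSES:
                    counts[cause] += 1
        return counts

    if __name__ == "__main__":
        for cause, count in tally_causes(selected_studies).most_common():
            print(f"{cause}: {count} of {len(selected_studies)} studies")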
References

Brunet, M.-E., Alkalay-Houlihan, C., Anderson, A., and Zemel, R. (2019). Understanding the origins of bias in word embeddings. In Proceedings of the 36th International Conference on Machine Learning (ICML 2019), pages 1275–1294, Long Beach, California, USA. PMLR.
Cheong, J., Kuzucu, S., Kalkan, S., and Gunes, H. (2023). Towards gender fairness for mental health prediction. In Elkind, E., editor, Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, IJCAI-23, pages 5932–5940. International Joint Conferences on Artificial Intelligence Organization. AI for Good.
Cho, W. I., Kim, J., Yang, J., and Kim, N. S. (2021). Towards cross-lingual generalization of translation gender bias. In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, FAccT ’21, page 449–457, New York, NY, USA. Association for Computing Machinery.
da Silva Souza, N. C. (2023). Uma abordagem para identificação do viés de gênero em modelos de pln. Trabalho de Conclusão de Curso (MBA) – Instituto de Ciências Matemáticas e de Computação, Universidade de São Paulo, São Carlos, 2023.
Dervişoğlu, H. and Fatih Amasyali, M. (2021). Bias detection and mitigation in sentiment analysis. In 2021 Innovations in Intelligent Systems and Applications Conference (ASYU), pages 1–6.
Ghosh, S. and Caliskan, A. (2023). Chatgpt perpetuates gender bias in machine translation and ignores non-gendered pronouns: Findings across bengali and five other low-resource languages. In Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society, AIES ’23, page 901–912, New York, NY, USA. Association for Computing Machinery.
GLOBO (2024). Na educação, na saúde e até dentro de casa: como a inteligência artificial já faz parte da nossa rotina — g1.globo.com. [link]. [Accessed on 03/02/2025].
Hall, P. and Ellis, D. (2023). A systematic review of socio-technical gender bias in ai algorithms. Online Information Review, 47(7):1264–1279.
Kopeinik, S., Mara, M., Ratz, L., Krieg, K., Schedl, M., and Rekabsaz, N. (2023). Show me a “male nurse”! how gender bias is reflected in the query formulation of search engine users. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, CHI ’23, New York, NY, USA. Association for Computing Machinery.
Lima, R. M. d., Pisker, B., and Correa, V. S. (2023). Journal of Telecommunications and the Digital Economy, 11(2):8–30.
Matthews, S., Hudzina, J., and Sepehr, D. (2022). Gender and racial stereotype detection in legal opinion word embeddings. In Proceedings of the 36th AAAI Conference on Artificial Intelligence (AAAI 2022), volume 36, pages 12026–12033, Vancouver, Canada. AAAI Press.
Nadeem, A., Abedin, B., and Marjanovic, O. (2020). Gender bias in ai: A review of contributing factors and mitigating strategies. In ACIS 2020 Proceedings - 31st Australasian Conference on Information Systems.
Nadeem, A., Marjanovic, O., and Abedin, B. (2022). Gender bias in ai-based decision-making systems: a systematic literature review. Australasian Journal of Information Systems, 26.
Njoto, S., Cheong, M., Lederman, R., McLoughney, A., Ruppanner, L., and Wirth, A. (2022). Gender bias in ai recruitment systems: A sociological- and data science-based case study. In 2022 IEEE International Symposium on Technology and Society (ISTAS), volume 1, pages 1–7.
Parreira, M. T., Gillet, S., Winkle, K., and Leite, I. (2023). How did we miss this? a case study on unintended biases in robot social behavior. In Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, HRI ’23, page 11–20, New York, NY, USA. Association for Computing Machinery.
Rizhinashvili, D., Sham, A. H., and Anbarjafari, G. (2022). Gender neutralisation for unbiased speech synthesising. Electronics (Switzerland), 11(10).
Sogancioglu, G., Kaya, H., and Salah, A. A. (2023). The effects of gender bias in word embeddings on patient phenotyping in the mental health domain. In 2023 11th International Conference on Affective Computing and Intelligent Interaction (ACII), pages 1–8.
Thelwall, M. (2018). Gender bias in machine learning for sentiment analysis. Online Information Review, 42(3):343–354.
Published
20/07/2025

How to Cite
DOLABELLA, Rafaela Toledo; SILVA, Thais Regina de Moura Braga; BRAGA E SILVA, Gláucia; BATISTA, Estela Miranda. As Causas Sistêmicas por trás do Viés de Gênero em IA: Um Mapeamento Sistemático da Literatura. In: WOMEN IN INFORMATION TECHNOLOGY (WIT), 19., 2025, Maceió/AL. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 252-263. ISSN 2763-8626. DOI: https://doi.org/10.5753/wit.2025.9141.
