Children's Rights, not Deceptive Patterns by Design: a Requirements Perspective
Abstract
In the digital age, children are frequent users of software platforms, yet they often lack awareness of cyber risks such as theft, stalking, and harassment. These and other online harms are facilitated by deceptive design patterns that IT companies embed in their solutions, such as game apps and social media. Such patterns may deceive or manipulate users into taking unintended actions, such as making purchases, signing up for services, or accepting lax privacy settings. This paper aims to understand the relationship between deceptive patterns and the measures needed for a safe online experience for children. We accomplish this goal by analysing how deceptive design patterns affect a list of eighteen legal software requirements for children's protection and well-being online, based on guidelines from UNICEF (the United Nations Children's Fund). Our contribution is twofold: (i) describing how IT companies neglect such legal requirements and (ii) discussing essential safeguards through adapted or newly incorporated features, presented here as prototypes. By investigating this relationship, we seek to provide insights into how to design software solutions that protect children and ensure their rights online.