The end of privacy by obscurity? Revisiting implicit privacy assumptions in the design of information systems

  • Andreis G. M. Purim (UNICAMP)
  • Heitor P. Nolla (UNICAMP)

Abstract


Many information systems (IS) operate under a de facto model of “privacy by obscurity,” in which users manage risk by limiting disclosure and adjusting settings. This model assumes that friction, legal authorization, and social norms constrain misuse. Contemporary data environments challenge this assumption. Large-scale aggregation, cross-context linkage, and re-identification (using machine learning models) reduce the protective value of limited visibility. This paper conceptualizes privacy by obscurity as a system design paradigm, identifies its core assumptions, examines the mechanisms through which it operates, and analyzes the conditions under which it weakens. We then propose a taxonomy of privacy paradigms that compares obscurity with privacy-by-legality, privacy-by-architecture, privacy-by-market, and privacy-by-accountability approaches. The taxonomy clarifies how responsibility and enforcement are structured in contemporary information systems.
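To make the linkage threat concrete, the sketch below is an illustrative example only (not from the paper): a Sweeney-style quasi-identifier attack in which a record released without direct identifiers is re-identified by joining it against a public dataset on ZIP code, birth date, and sex. All names and values are fabricated.

    # Minimal quasi-identifier linkage sketch (illustrative, fabricated data).
    # An "anonymized" health record is re-identified by joining it with a
    # public voter roll on the shared quasi-identifiers (zip, birth_date, sex).

    anonymized_health_records = [
        {"zip": "02138", "birth_date": "1945-07-22", "sex": "F", "diagnosis": "hypertension"},
        {"zip": "02139", "birth_date": "1980-01-03", "sex": "M", "diagnosis": "asthma"},
    ]

    public_voter_roll = [
        {"name": "J. Doe", "zip": "02138", "birth_date": "1945-07-22", "sex": "F"},
        {"name": "R. Roe", "zip": "02141", "birth_date": "1975-11-30", "sex": "M"},
    ]

    def quasi_id(record):
        """Return the tuple of quasi-identifiers used for linkage."""
        return (record["zip"], record["birth_date"], record["sex"])

    # Index the public dataset by quasi-identifier, then attempt the join.
    voters_by_qid = {}
    for voter in public_voter_roll:
        voters_by_qid.setdefault(quasi_id(voter), []).append(voter)

    for record in anonymized_health_records:
        matches = voters_by_qid.get(quasi_id(record), [])
        if len(matches) == 1:  # a unique match re-identifies the record
            print(f"{matches[0]['name']} -> {record['diagnosis']}")

The point of the sketch is that none of the released attributes is identifying on its own; it is their combination, plus an auxiliary dataset, that defeats obscurity.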

References

Acquisti, A., Brandimarte, L., and Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221):509–514.

Bamberger, K. A. and Mulligan, D. K. (2015). Privacy on the ground: driving corporate behavior in the United States and Europe. MIT Press.

Cate, F. (2006). The failure of fair information practice principles. In Consumer Protection in the Age of the Information Economy.

de Montjoye, Y.-A., Hidalgo, C. A., Verleysen, M., and Blondel, V. D. (2013). Unique in the crowd: The privacy bounds of human mobility. Scientific Reports, 3(1).

Kosinski, M., Stillwell, D., and Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences, 110(15):5802–5805.

Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M., and Narayanan, A. (2019). Dark patterns at scale: Findings from a crawl of 11k shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW):1–32.

Narayanan, A. and Shmatikov, V. (2008). Robust de-anonymization of large sparse datasets. In 2008 IEEE Symposium on Security and Privacy (SP 2008), pages 111–125. IEEE.

Ohm, P. (2009). Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review, 57:1701.

Solove, D. J. (2013). Introduction: Privacy self-management and the consent dilemma. Harvard Law Review, 126(7):1880–1903.

Sweeney, L. (2000). Simple demographics often identify people uniquely. Carnegie Mellon University, Data Privacy Working Paper 3.

Zuboff, S. (2019). The age of surveillance capitalism. PublicAffairs, New York, NY.
Published
25/05/2026
PURIM, Andreis G. M.; NOLLA, Heitor P. The end of privacy by obscurity? Revisiting implicit privacy assumptions in the design of information systems. In: TRILHA DE NOVAS IDEIAS E RESULTADOS EMERGENTES EM SI - POSICIONAMENTO DE IDEIAS - SIMPÓSIO BRASILEIRO DE SISTEMAS DE INFORMAÇÃO (SBSI), 22., 2026, Vitória/ES. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2026. p. 385-392. DOI: https://doi.org/10.5753/sbsi_estendido.2026.249126.