Conceptual Modeling of Algorithm Parameterization and Results Considering Volatile Data Requirements: A Case Study in the Context of Primary Healthcare
Abstract
Context: Conceptual modeling in software projects is often affected by volatile data requirements, which influence design, logic, and performance. Flexible data modeling is essential to adapt to these changes, especially in healthcare applications, where diverse and evolving data types are common. Ensuring adaptability in these settings supports operational efficiency and timely decision-making.

Problem: Frequent updates to the requirements of Primary Healthcare (PHC) systems increase maintenance costs and cause operational issues, especially when systems must meet new legislative or operational demands. These changes can disrupt database schemas and drive up software development costs.

Proposed Solution: This paper introduces six schema alternatives designed to model algorithm parameters and outputs flexibly, reducing the impact of changing requirements on system structure. These adaptable schemas aim to minimize the need for frequent reconfiguration.

IS Theory: The research is grounded in the Theory of Information Systems Flexibility, which focuses on creating adaptable data structures that remain efficient over the long term in dynamic settings. This theoretical foundation aligns well with PHC's evolving data needs, which require resilient and adaptable systems.

Method: A prescriptive approach was used, combining a theoretical evaluation of the schema options with a case study in a real-world PHC project. This mixed-method approach provided insights into schema flexibility and usability.

Summary of Results: We concluded that the proposed database schema is highly effective for adapting to new data requirements, allowing algorithm parameters and results to be stored as JSON attributes. This flexibility in data inputs and outputs supports evolving PHC requirements.

Contributions and Impact on IS: This study introduces a database schema for managing volatile data requirements in healthcare IS, reducing IS development costs.
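To make the reported result concrete: the paper's six schema alternatives are not reproduced in this abstract, so the sketch below is only an illustration, under assumptions of ours, of the general pattern the results describe, persisting algorithm parameters and results as JSON attributes so that new fields require no schema migration. The algorithm_run table and every column and parameter name are hypothetical and not taken from the study.

    import json
    import sqlite3

    # Hypothetical schema: parameters and results live in JSON columns,
    # so new fields need no ALTER TABLE when requirements change.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE algorithm_run (
            id         INTEGER PRIMARY KEY,
            algorithm  TEXT NOT NULL,
            parameters TEXT NOT NULL,  -- JSON document
            results    TEXT            -- JSON document
        )
    """)

    # A run with today's parameters (invented for illustration).
    params = {"age_min": 60, "chronic_conditions": ["diabetes", "hypertension"]}
    conn.execute(
        "INSERT INTO algorithm_run (algorithm, parameters, results) VALUES (?, ?, ?)",
        ("risk_stratification", json.dumps(params), json.dumps({"flagged_patients": 42})),
    )

    # A later requirement adds a new parameter; the schema is untouched.
    params["visit_window_days"] = 90
    conn.execute(
        "INSERT INTO algorithm_run (algorithm, parameters, results) VALUES (?, ?, ?)",
        ("risk_stratification", json.dumps(params), None),
    )

    for algorithm, parameters in conn.execute("SELECT algorithm, parameters FROM algorithm_run"):
        print(algorithm, json.loads(parameters))

The trade-off is typical of this pattern: JSON attributes absorb requirement changes without schema migrations, at the cost of weaker type checking and integrity constraints than fully normalized columns would provide.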