Towards a Framework Based on Open Science Practices for Promoting Reproducibility of Software Engineering Controlled Experiments

  • André F. R. Cordeiro (State University of Maringá)


Experimentation in Software Engineering has increased in recent decades as a way to provide evidence for theories and technologies. Throughout a controlled experiment's life cycle, several artifacts are used, reused, and even produced. Such artifacts, mostly in the form of data, should favor the reproducibility of these experiments. In this context, reproducibility can be defined as the ability to reproduce a study, an ability that brings benefits such as methodology and data reuse. Despite these recognized benefits, researchers still face several challenges in making their experiments reproducible. To overcome them, we argue that Open Science practices related to provenance, preservation, and curation can help improve this capability. Therefore, in this paper, we propose an Open Science-based framework for managing controlled experiment research artifacts, towards making such experiments de facto reproducible. To this end, different models associated with Open Science practices are planned to be integrated into the framework.


CORDEIRO, André F. R. Towards a Framework Based on Open Science Practices for Promoting Reproducibility of Software Engineering Controlled Experiments. In: CONGRESSO IBERO-AMERICANO EM ENGENHARIA DE SOFTWARE (CIBSE), 26., 2023, Montevideo, Uruguay. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 237-244. DOI: