Human-Oriented Software Engineering Experiments: The Large Gap in Experiment Reports

  • Larissa Falcão UFPE / UPE
  • Sergio Soares UFPE

Abstract


Context: Despite the existence of well-known guidelines for planning, conducting, and reporting experiments, missing information in experiment reports impairs external replications and reduces experiment quality and reliability, hindering knowledge dissemination and making it impossible to confirm the results. Objective: Provide an in-depth study of how information from human-oriented controlled experiments in software engineering is reported after the emergence of supporting guidelines. Method: A systematic mapping study was conducted in the main empirical software engineering and software engineering venues, covering the period following the publication of the supporting guidelines. Results: We analyzed 412 articles reporting experiments, drawn from three conferences and three journals, and found that most of them omit crucial information about the experiments, such as participant reward, target population, hypotheses, and conclusion and construct validity. Conclusion: There is a gap between the information the guidelines suggest reporting and what is actually reported. Of the 27 elements that, according to the guidelines, should appear in the reports, 65% of the analyzed articles failed to report at least 13 (almost half). This finding contradicts the natural intuition that, as guidelines appear and mature, study reports would increasingly comply with them over the years. As a consequence, a flawed report may raise doubts about the quality and validity of the study.
Published
29/09/2021
How to Cite

FALCÃO, Larissa; SOARES, Sergio. Human-Oriented Software Engineering Experiments: The Large Gap in Experiment Reports. In: SIMPÓSIO BRASILEIRO DE ENGENHARIA DE SOFTWARE (SBES), 35., 2021, Joinville. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021.