ABSTRACT
Developers invest cognitive effort to comprehend instructions in source code. Cognitive effort refers to the mental processing a human brain requires to complete a cognitive task. The cognitive effort developers invest can vary with how instructions in source code are structured: to implement a feature, developers can write all instructions in a single method (non-modular code) or decompose them into several methods (modular code). However, little is known about the effects of modularizing instructions in source code on developers' cognitive effort. Hence, adopting modularization practices remains a cognitive effort-insensitive activity. This paper therefore reports on a controlled experiment that investigates the effects of modularization on the cognitive effort of developers while they comprehend instructions in (non-)modular code. We evaluated the modularization of instructions with 35 developers who performed 10 comprehension tasks while wearing a wearable EEG device. The main results suggest that developers tend to invest less cognitive effort to understand instructions in modular code than in non-modular code. However, developers spend more temporal effort to understand instructions in modular code, and this extra time is not converted into a higher rate of correct code comprehension. Our findings shed light on improving the state of the art of modularization practices by making them sensitive to developers' cognitive effort.
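To make the two treatments concrete, the following is a minimal, hypothetical illustration (not taken from the experiment's materials) of the same feature written once as a single block of instructions and once decomposed into small helper methods. All names here are invented for illustration.

```python
# Hypothetical example of the non-modular vs. modular structures
# contrasted in the experiment; the task and identifiers are assumptions,
# not the study's actual comprehension tasks.

def average_paid_order_non_modular(orders):
    # Non-modular: filtering, summing, and averaging are inlined together.
    total = 0.0
    count = 0
    for order in orders:
        if order["status"] == "paid":
            total += order["amount"]
            count += 1
    return total / count if count else 0.0

def _paid_orders(orders):
    # Helper: keep only paid orders.
    return [o for o in orders if o["status"] == "paid"]

def _total_amount(orders):
    # Helper: sum order amounts.
    return sum(o["amount"] for o in orders)

def average_paid_order_modular(orders):
    # Modular: each step is delegated to a named helper method.
    paid = _paid_orders(orders)
    return _total_amount(paid) / len(paid) if paid else 0.0

orders = [
    {"status": "paid", "amount": 10.0},
    {"status": "open", "amount": 99.0},
    {"status": "paid", "amount": 20.0},
]
# Both structures compute the same result (15.0); only the way the
# instructions are organized, and hence what a reader must hold in
# mind at once, differs.
assert average_paid_order_non_modular(orders) == average_paid_order_modular(orders) == 15.0
```

In the modular version a reader can comprehend each helper in isolation, whereas the non-modular version requires tracking all steps within one method body, which is the structural difference whose cognitive cost the experiment measures.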
Effects of Modularization on Developers' Cognitive Effort in Code Comprehension Tasks: A Controlled Experiment