Is there agreement and reliability in assessing the creativity of tangible learning outcomes of computing education in K-12?

Abstract


As an increasingly globalized society builds knowledge-based economies, the need to foster creativity in schools intensifies. Although creativity is typically addressed in the arts, literature, or music, computing education can also be an avenue, especially by encouraging the creation of new software artifacts. There is currently a vast body of research on creativity; however, research on assessing the degree of creativity of software products created as tangible learning outcomes remains little explored. Even so, as creativity research in general indicates, a central question is: can assessments of the degree of creativity of mobile applications developed as tangible learning outcomes of computing education in K-12 be provided with agreement and reliability? This article therefore reports the results of an analysis of assessments by 24 raters of 10 apps created with App Inventor, with respect to inter-rater agreement and reliability. It also discusses whether specific types of expertise affect the perception of creativity. Results indicate a lack of agreement and reliability among raters and point to the importance of rater training in creativity and/or automated assessment support for obtaining more reliable results.
Keywords: Creativity, assessment, agreement, reliability
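
The analysis described in the abstract rests on two distinct notions: inter-rater agreement (do raters assign the same scores?) and inter-rater reliability (do raters rank the rated products consistently?). The sketch below illustrates this distinction with a hypothetical 10 apps x 24 raters score matrix, computing pairwise exact agreement and Cronbach's alpha; it is a minimal illustration under assumed, simulated data and is not the authors' actual analysis pipeline.

```python
# Minimal illustrative sketch (simulated data, not the study's dataset):
# rows = 10 App Inventor apps, columns = 24 raters, scores on a 1-5 scale.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(10, 24))  # hypothetical creativity ratings

def pairwise_percent_agreement(ratings: np.ndarray) -> float:
    """Mean proportion of apps on which each pair of raters gives the exact same score."""
    n_apps, n_raters = ratings.shape
    agreements = []
    for i in range(n_raters):
        for j in range(i + 1, n_raters):
            agreements.append(np.mean(ratings[:, i] == ratings[:, j]))
    return float(np.mean(agreements))

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha treating raters as 'items' and apps as 'cases'."""
    n_apps, n_raters = ratings.shape
    item_vars = ratings.var(axis=0, ddof=1)      # variance of each rater across apps
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of per-app summed scores
    return float(n_raters / (n_raters - 1) * (1 - item_vars.sum() / total_var))

print(f"Pairwise exact agreement: {pairwise_percent_agreement(scores):.2f}")
print(f"Cronbach's alpha (inter-rater reliability): {cronbach_alpha(scores):.2f}")
```

With uncorrelated simulated scores like these, agreement hovers near chance and alpha stays low, which is the kind of pattern the abstract summarizes as a lack of agreement and reliability among raters.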

Published
26/04/2021
DA CRUZ ALVES, Nathalia; VON WANGENHEIM, Christiane Gresse; MARTINS-PACHECO, Lúcia Helena; FERRETI BORGATTO, Adriano. Existem concordância e confiabilidade na avaliação da criatividade de resultados tangíveis da aprendizagem de computação na Educação Básica?. In: SIMPÓSIO BRASILEIRO DE EDUCAÇÃO EM COMPUTAÇÃO (EDUCOMP), 1., 2021, On-line. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021. p. 12-22. DOI: https://doi.org/10.5753/educomp.2021.14467.