Helping Teachers Visualize Students' Performance
Abstract
The growth in the number of online courses evidences a new paradigm in which education is available anywhere, anytime, and, hopefully, to anyone. In this paradigm, courses take place in online learning environments, which rely on information and communication technology to support learning and teaching. However, researchers report that, on average, 85% of students drop out of these courses, and they attribute this to a lack of teacher support. It is therefore necessary to help teachers analyze the data these environments generate, extracting relevant information to guide their decisions. Learning Analytics, Educational Data Mining, and Data Visualization can be used to deal with these data, but training teachers in these techniques would demand time and effort from them, and its effectiveness is unknown. Instead, we propose the use of Data Visualization to help teachers aggregate and "see" their students according to their performance level. We asked teachers to interact with a set of visualizations, checked whether they understood the information presented, and asked about their perceptions regarding utility, ease of use, attitude towards use, intention to use, aesthetics, the color scheme, and the vocabulary used. The results indicate that teachers understood the visualizations and had positive perceptions of them.