A Proposal for Automating Usability/UX Evaluation

  • Luciano A. Garcia, Universidade de São Paulo (USP)
  • Marcelo Morandini, Universidade de São Paulo (USP)
  • Edson Oliveira Jr, Universidade Estadual de Maringá (UEM)

Abstract

This paper presents a software environment intended to improve the automation of usability and UX evaluation. Currently, this process relies almost exclusively on the observations and impressions of human evaluators. We therefore seek a more self-sufficient evaluation process, reducing the dependence on an experienced evaluator, who is not always available. To this end, we envision an environment in which: 1) the application designer defines the tasks, metrics, and questionnaires to be applied; 2) the user performs the tasks in the target application; 3) the resulting data are stored using event logging; 4) the user answers the questionnaires; and 5) the environment analyzes the data and issues an evaluation diagnosis.
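The five-step flow above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not the authors' actual environment: the names `Task`, `EventLog`, and `evaluate`, and the specific metrics (task completion as effectiveness, time-on-task as efficiency, a Likert average as satisfaction) are assumptions chosen to mirror the steps described in the abstract.

```python
from dataclasses import dataclass, field
from time import monotonic

@dataclass
class Task:
    """Step 1: the designer defines the task, its metrics, and thresholds."""
    name: str
    expected_events: list   # events that must appear for the task to count as completed
    time_limit_s: float     # efficiency threshold set by the designer

@dataclass
class EventLog:
    """Step 3: events produced while the user works (step 2) are logged with timestamps."""
    events: list = field(default_factory=list)

    def record(self, event: str) -> None:
        self.events.append((monotonic(), event))

def evaluate(task: Task, log: EventLog, questionnaire_scores: list) -> dict:
    """Step 5: combine the event log and questionnaire answers (step 4) into a diagnosis."""
    names = [e for _, e in log.events]
    completed = all(e in names for e in task.expected_events)
    duration = log.events[-1][0] - log.events[0][0] if len(log.events) > 1 else 0.0
    satisfaction = sum(questionnaire_scores) / len(questionnaire_scores)
    return {
        "task": task.name,
        "completed": completed,                        # effectiveness
        "within_time": duration <= task.time_limit_s,  # efficiency
        "satisfaction": satisfaction,                  # e.g. 1-5 Likert average
    }
```

For example, a hypothetical "login" task with two expected events and three questionnaire answers would be evaluated as `evaluate(Task("login", ["open_form", "submit"], 60.0), log, [4, 5, 4])`, yielding a small per-task diagnosis that a real environment could aggregate across users.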

Published
06/12/2023
GARCIA, Luciano A.; MORANDINI, Marcelo; OLIVEIRA JR, Edson. Uma Proposta para Automatização de Avaliação de Usabilidade/UX. In: ESCOLA REGIONAL DE ENGENHARIA DE SOFTWARE (ERES), 7., 2023, Maringá/PR. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 308-317. DOI: https://doi.org/10.5753/eres.2023.237799.