The Emotions and Advice in Virtual Assistants: A Dual Study on Emotion Validation and Agent Suggestions in a Gaming Scenario




Virtual Assistant, Emotion Validation, Gaming Scenario, Human Trust, Agent Suggestions


In an era where virtual assistants play an increasingly prominent role in daily life, this study explores the implications of their advice. We investigate the interplay between trust and virtual agents’ emotional expressions, a critical aspect of human-technology interaction. Our research comprises two interconnected phases examining the dynamics between virtual agents and human decision-making. The first phase involves developing and validating a virtual robotic agent capable of conveying a spectrum of emotions; this phase reveals gender-based differences in the perception of emotional cues, shedding light on how men and women interpret these cues differently. The second phase employs an interactive memory game in which the virtual agent operates in varied emotional states. Participants’ trust levels and perceptions are evaluated across scenarios ranging from accurate to erroneous agent cues. Our findings show how the agent’s emotional expressions shape participants’ perceptions, and how trust is influenced by both the task at hand and the agent’s behavior. This research contributes to understanding the relationship between virtual assistants and human decision-making, emphasizing the need to design more engaging and interactive virtual agents. These insights lay the groundwork for future research on crafting more effective virtual assistants that foster greater user trust and engagement.




Bartneck, C., Kulić, D., Croft, E., and Zoghbi, S. (2009). Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. International journal of social robotics, 1(1):71–81.

Benbasat, I. and Wang, W. (2005). Trust in and adoption of online recommendation agents. Journal of the association for information systems, 6(3):4.

Bones (2019). After all, what is rigging and how does it help with character creation? [link].

Cameron, D., Millings, A., Fernando, S., Collins, E. C., Moore, R., Sharkey, A., Evers, V., and Prescott, T. (2018). The effects of robot facial emotional expressions and gender on child–robot interaction in a field study. Connection science, 30(4):343–361.

Chowanda, A., Flintham, M., Blanchfield, P., and Valstar, M. (2016). Playing with social and emotional game companions. In International Conference on Intelligent Virtual Agents, pages 85–95. Springer, Cham.

Cicirelli, G., Marani, R., Petitti, A., Milella, A., and D’Orazio, T. (2021). Ambient assisted living: a review of technologies, methodologies and future perspectives for healthy aging of population. Sensors, 21(10):3549.

Correia, F., Guerra, C., Mascarenhas, S., Melo, F. S., and Paiva, A. (2018). Exploring the impact of fault justification in human-robot trust. In Proceedings of the 17th international conference on autonomous agents and multiagent systems, pages 507–513.

Cuadra, A., Li, S., Lee, H., Cho, J., and Ju, W. (2021). My bad! repairing intelligent voice assistant errors improves interaction. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1):1–24.

Darwin, C. and Prodger, P. (1998). The expression of the emotions in man and animals. Oxford University Press, USA.

de Lima, A. T. (2020). Influência de Agente Virtual Afetivo nas Decisões do Jogador em um Jogo Casual. (Influence of Affective Virtual Agent on Player’s Decisions in a Casual Game), pages 26–35. State University of Rio Grande do Norte.

Ekman, P. (1971). Universals and cultural differences in facial expressions of emotion. In Nebraska symposium on motivation. University of Nebraska Press.

Fischer, A. H., Kret, M. E., and Broekens, J. (2018). Gender differences in emotion perception and self-reported emotional intelligence: A test of the emotion sensitivity hypothesis. PloS one, 13(1):e0190712.

Gass, R. H. and Seiter, J. S. (2018). Persuasion: Social influence and compliance gaining. Routledge.

Ghazali, A. S., Ham, J., Barakova, E. I., and Markopoulos, P. (2018). Effects of robot facial characteristics and gender in persuasive human-robot interaction. Frontiers in Robotics and AI, 5:73.

Guzman, A. L. and Lewis, S. C. (2020). Artificial intelligence and communication: A human–machine communication research agenda. New media & society, 22(1):70–86.

Hancock, P., Billings, D., and Schaefer, K. (2011). Can you trust your robot? Ergonomics in Design: The Quarterly of Human Factors Applications, 19:24–29. DOI: 10.1177/1064804611415045.

Hashemian, M., Paradeda, R., Guerra, C., and Paiva, A. (2019). Do you trust me? investigating the formation of trust in social robots. In EPIA Conference on Artificial Intelligence, pages 357–369. Springer, Cham.

Inkscape (2020). Frequently asked questions — for inkscape users. [link].

Johnston, O. and Thomas, F. (1981). The illusion of life: Disney animation. Disney Editions, New York.

Lewis, M., Sycara, K., and Walker, P. (2018). The role of trust in human-robot interaction. Foundations of trusted autonomy, pages 135–159.

Montagne, B., Kessels, R. P., Frigerio, E., de Haan, E. H., and Perrett, D. I. (2005). Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cognitive processing, 6(2):136–141.

Moyer-Gusé, E. (2008). Toward a Theory of Entertainment Persuasion: Explaining the Persuasive Effects of Entertainment-Education Messages. Communication Theory, 18(3):407–425.

Nomura, T. (2017). Robots and gender. Gender and the Genome, 1(1):18–25.

Poels, K., de Kort, Y., and Ijsselsteijn, W. (2007). D3.3: Game experience questionnaire.

Ragni, M., Rudenko, A., Kuhnert, B., and Arras, K. O. (2016). Errare humanum est: Erroneous robots in human-robot interaction. In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pages 501–506.

Rosenthal-von der Pütten, A. M., Krämer, N. C., and Herrmann, J. (2018). The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. International Journal of Social Robotics, pages 1–14.

Salem, M., Eyssel, F., Rohlfing, K. J., Kopp, S., and Joublin, F. (2013). To err is human(-like): Effects of robot gesture on perceived anthropomorphism and likability. International Journal of Social Robotics, 5:313–323.

Shapiro, S. S. and Wilk, M. B. (1965). An analysis of variance test for normality (complete samples). Biometrika, 52(3/4):591–611.

Shiban, Y., Schelhorn, I., Jobst, V., Hörnlein, A., Puppe, F., Pauli, P., and Mühlberger, A. (2015). The appearance effect: Influences of virtual agent features on performance and motivation. Computers in Human Behavior, 49:5–11.

Timotheou, S., Miliou, O., Dimitriadis, Y., Sobrino, S. V., Giannoutsou, N., Cachia, R., Mones, A. M., and Ioannou, A. (2023). Impacts of digital technologies on education and factors influencing schools’ digital capacity and transformation: A literature review. Education and information technologies, 28(6):6695–6726.

Torre, I., Carrigan, E., McDonnell, R., Domijan, K., McCabe, K., and Harte, N. (2019). The effect of multimodal emotional expression and agent appearance on trust in human-agent interaction. In Proceedings of the 12th ACM SIGGRAPH Conference on Motion, Interaction and Games, pages 1–6.

Türkgeldi, B., Özden, C. S., and Aydoğan, R. (2022). The effect of appearance of virtual agents in human-agent negotiation. AI, 3(3):683–701.

Yang, Y., Ma, X., and Fung, P. (2017). Perceived emotional intelligence in virtual agents. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pages 2255–2262.




How to Cite

PARADEDA, R.; ALVES, Álisson; TORRES, D.; LIMA, A. The Emotions and Advice in Virtual Assistants: A Dual Study on Emotion Validation and Agent Suggestions in a Gaming Scenario. Journal on Interactive Systems, Porto Alegre, RS, v. 15, n. 1, p. 118–129, 2024. DOI: 10.5753/jis.2024.3725. Accessed: 21 Feb. 2024.



Regular Paper