Towards Affective TV with Facial Expression Recognition

  • Pedro A. Valentim, Universidade Federal Fluminense
  • Fábio Barreto, Universidade Federal Fluminense
  • Débora C. Muchaluat-Saade, Universidade Federal Fluminense

Abstract


Facial recognition techniques, once the stuff of science-fiction film classics, have become reality. This technology opens up a wide range of possibilities for different kinds of systems. From the point of view of interactive applications, facial expressions as input data may reflect a user’s sentiment more immediately and more faithfully than the click of a button. For interactive television, facial expression recognition could bring broadcasters and viewers closer together by enabling TV content to be personalized according to viewer sentiment. In fact, not only facial expression recognition but any form of interaction that enables affective computing could serve this goal; in this work, we call this concept Affective TV. To support it, this work proposes facial expression recognition for digital TV applications. Our proposal is implemented and evaluated in the Ginga-NCL middleware, a digital TV standard adopted in several Latin American countries.
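To make the idea concrete, the sketch below shows one hypothetical way a receiver-side application could turn a recognized facial expression into a content-personalization decision. It assumes OpenCV for face detection; the classify_expression stub, the EXPRESSION_TO_ACTION table, and all function names are illustrative assumptions, not the paper’s actual Ginga-NCL extension.

    # Hedged sketch: mapping a recognized facial expression to a TV
    # content-personalization decision. The expression classifier is a
    # stub; this is NOT the paper's Ginga-NCL implementation.

    import cv2  # OpenCV for face detection (pip install opencv-python)

    # Hypothetical mapping from expressions to presentation decisions.
    EXPRESSION_TO_ACTION = {
        "happy": "keep current storyline",
        "sad": "switch to lighter content",
        "surprise": "offer related interactive content",
        "neutral": "no change",
    }

    def detect_face(frame):
        """Return the first detected face region (grayscale), or None."""
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) == 0:
            return None
        x, y, w, h = faces[0]
        return gray[y:y + h, x:x + w]

    def classify_expression(face_img):
        """Placeholder for a real classifier (e.g., a CNN trained on an
        expression dataset). Returns a key of EXPRESSION_TO_ACTION."""
        return "neutral"  # stub: a trained model would go here

    def personalize(frame):
        """Decide how to adapt TV content for the viewer in this frame."""
        face = detect_face(frame)
        if face is None:
            return "no viewer detected"
        expression = classify_expression(face)
        return EXPRESSION_TO_ACTION.get(expression, "no change")

In a real deployment, classify_expression would be replaced by a trained model, and the resulting decision would be delivered to the declarative TV application as an interaction event rather than computed inline.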

Keywords: Facial Recognition; Affective Computing; Digital Television; Multimedia Applications; Affective TV

Published
2021-06-21
VALENTIM, Pedro A.; BARRETO, Fábio; MUCHALUAT-SAADE, Débora C. Towards Affective TV with Facial Expression Recognition. In: LIFE IMPROVEMENT IN QUALITY BY UBIQUITOUS EXPERIENCES WORKSHOP (LIQUE), 1., 2021, New York. Proceedings [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021. DOI: https://doi.org/10.5753/lique.2021.15716.