Specification of Multimodal Interactions in NCL

  • Álan Lívio Vasconcelos Guedes, PUC-Rio
  • Roberto Gerson De Albuquerque Azevedo, PUC-Rio
  • Márcio Ferreira Moreno, PUC-Rio
  • Luiz Fernando Gomes Soares, PUC-Rio

Abstract


This paper proposes an approach to integrating multimodal events, both user-generated (e.g., audio recognizers, motion sensors) and user-consumed (e.g., speech synthesizers, haptic synthesizers), into programming languages for the declarative specification of multimedia applications. More precisely, it presents extensions to the NCL (Nested Context Language) multimedia language. NCL is the standard declarative language for the development of interactive applications for Brazilian Digital TV and an ITU-T Recommendation for IPTV services. NCL applications extended with the multimodal features are presented as results. Historically, the Human-Computer Interaction research community has focused on user-generated modalities, through studies on user interaction. The Multimedia community, on the other hand, has focused on output modalities, through studies on timing and multimedia processing. The proposals in this paper are an attempt to integrate concepts from both research communities into a single high-level programming framework, which aims to assist the authoring of multimedia/multimodal applications.
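To illustrate the kind of declarative specification the paper targets, the sketch below shows a hypothetical NCL fragment in which a speech-recognition grammar acts as an input media object and a synthesized speech document as an output media object. The onRecognize connector role and the media types used here are illustrative assumptions for this sketch, not the exact syntax proposed in the paper.

<ncl id="multimodalExample" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
  <head>
    <connectorBase>
      <!-- hypothetical connector: fires when the recognizer matches the grammar -->
      <causalConnector id="onRecognizeStart">
        <simpleCondition role="onRecognize"/>
        <simpleAction role="start"/>
      </causalConnector>
    </connectorBase>
  </head>
  <body>
    <port id="entry" component="speechCmd"/>
    <!-- user-generated modality: a speech grammar handled by a recognizer (assumed type) -->
    <media id="speechCmd" src="commands.srgs" type="application/srgs+xml"/>
    <!-- user-consumed modality: a speech-synthesis document (assumed type) -->
    <media id="welcomeTTS" src="welcome.ssml" type="application/ssml+xml"/>
    <!-- when the command is recognized, start the synthesized speech -->
    <link xconnector="onRecognizeStart">
      <bind role="onRecognize" component="speechCmd"/>
      <bind role="start" component="welcomeTTS"/>
    </link>
  </body>
</ncl>

The sketch follows NCL's usual causality model (conditions on one media object triggering actions on another); only the recognition role and the input/output media types are extensions of the kind the abstract describes.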
Published
27/10/2015
How to Cite

GUEDES, Álan Lívio Vasconcelos; AZEVEDO, Roberto Gerson De Albuquerque; MORENO, Márcio Ferreira; SOARES, Luiz Fernando Gomes. Specification of Multimodal Interactions in NCL. In: SIMPÓSIO BRASILEIRO DE SISTEMAS MULTIMÍDIA E WEB (WEBMEDIA), 21., 2015, Manaus. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2015. p. 181-187.
