VISCO (VIEW, SCAN, AND CONTROL IT): Using Computer Vision for Service Discovery in Smart Home Environments

  • Paulo Filipe Dantas UFC
  • José Gilvan Rodrigues Maia UFC
  • Windson Viana UFC


The widespread adoption of smart objects in our daily lives calls for the creation and analysis of new service discovery mechanisms and interaction techniques. In this work, we designed and evaluated a pointing-based interaction mechanism built on a Convolutional Neural Network (CNN) classification method. We called it ViSCo (View, Scan, and Control It); it extends the openHAB service discovery mechanism for smart objects. ViSCo uses the user's field of view, captured by their smartphone camera, to narrow down service discovery results. Seventeen users evaluated the final solution remotely, in an environment with virtual devices. Participants used the ViSCo approach to find and control virtual devices by pointing at real objects in their homes (e.g., their TVs). A System Usability Scale (SUS) survey showed a good level of acceptance of ViSCo, with an average score of 83.97.
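The core idea of the abstract — filtering openHAB's discovered services down to the devices visible in the camera frame — can be sketched as a label-matching step. The sketch below is illustrative only: the device structure, function names, and thresholds are assumptions, not ViSCo's actual implementation, and the CNN predictions (which ViSCo would obtain from a classifier such as MobileNet) are stubbed in as plain data.

```python
# Hypothetical sketch of ViSCo's filtering step: class labels predicted by an
# image classifier over a smartphone camera frame are matched against the
# labels of devices registered in a home automation platform such as openHAB.
# All names here are illustrative assumptions, not ViSCo's real API.

from dataclasses import dataclass


@dataclass
class SmartDevice:
    name: str   # platform item name, e.g. an openHAB item
    label: str  # human-readable object class, e.g. "television"


def filter_devices(devices, predictions, min_confidence=0.5):
    """Keep only devices whose label matches a confident CNN prediction.

    `predictions` is a list of (label, score) pairs, as a classifier
    would return for one camera frame.
    """
    confident = {label for label, score in predictions if score >= min_confidence}
    return [d for d in devices if d.label in confident]


# Example: the camera frame is classified as showing a TV, so only the
# TV service survives the discovery filter.
devices = [SmartDevice("LivingRoom_TV", "television"),
           SmartDevice("Kitchen_Lamp", "lamp")]
predictions = [("television", 0.91), ("monitor", 0.04)]
print([d.name for d in filter_devices(devices, predictions)])  # ['LivingRoom_TV']
```

In a real deployment, the filtered list would then be presented to the user for direct control, rather than the full set of discovered services.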

Keywords: IoT, Smart Home, CNN, computer vision, object classification


DANTAS, Paulo Filipe; MAIA, José Gilvan Rodrigues; VIANA, Windson. VISCO (VIEW, SCAN, AND CONTROL IT): uso de visão computacional para descoberta de serviços em ambientes residenciais inteligentes. In: CONCURSO DE TESES E DISSERTAÇÕES - SIMPÓSIO BRASILEIRO DE SISTEMAS MULTIMÍDIA E WEB (WEBMEDIA), 28., 2022, Curitiba. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2022. p. 23-26. ISSN 2596-1683.