ABSTRACT
Research in Natural Interfaces, a sub-area of Ubiquitous Computing, investigates the use of non-traditional devices to support user interaction with applications in less intrusive ways (for instance, gestures, voice, and writing based on electronic ink). With the increasing popularity of accelerometers, developers now have another tool for providing interaction between users and different applications, such as interactive TV environments. However, applications that use accelerometers are currently developed for specific situations, and both their implementations and the documents they handle depend on the domain for which they were designed. This paper proposes a model that formalizes how accelerometer data may be handled in a generic way. In addition, the model enables the description of rules that add value to these data by attaching meanings to them. This is achieved through a layered architecture that structures and shares the data in a flexible way. An example of use and the description of a simple application built on the proposed model are also presented.
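The idea of separating a device-independent representation of accelerometer data from domain-specific rules that attach meaning to it can be illustrated with a minimal sketch. All names here (`AccelSample`, `tilt_rule`, `interpret`, the 0.5 g threshold) are hypothetical illustrations, not the paper's actual model or API:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Hypothetical generic representation of one accelerometer sample,
# independent of the device or application domain (lower layer).
@dataclass
class AccelSample:
    x: float          # acceleration along x, in g
    y: float          # acceleration along y, in g
    z: float          # acceleration along z, in g
    timestamp_ms: int

# A semantic rule maps a raw sample to a higher-level meaning
# (e.g. "tilt-left"), or None if the rule does not apply (upper layer).
Rule = Callable[[AccelSample], Optional[str]]

def tilt_rule(sample: AccelSample) -> Optional[str]:
    # Assumed convention: strong lateral acceleration signals a tilt gesture.
    if sample.x < -0.5:
        return "tilt-left"
    if sample.x > 0.5:
        return "tilt-right"
    return None

def interpret(samples: List[AccelSample], rules: List[Rule]) -> List[str]:
    """Apply every rule to every sample; collect the meanings produced."""
    meanings = []
    for s in samples:
        for rule in rules:
            m = rule(s)
            if m is not None:
                meanings.append(m)
    return meanings

samples = [
    AccelSample(0.7, 0.0, 1.0, 0),
    AccelSample(0.0, 0.0, 1.0, 20),
    AccelSample(-0.8, 0.1, 0.9, 40),
]
print(interpret(samples, [tilt_rule]))  # ['tilt-right', 'tilt-left']
```

Because the sample format is shared across applications, an interactive-TV application and, say, a photo-annotation tool could reuse the same data stream and differ only in the rule sets they register.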