Robotic Control with Pattern Recognition by Dynamic Image Segmentation
This paper presents a methodology for gesture recognition that extracts features of the segmented hand from dynamic images captured by a webcam and identifies signal patterns. This mechanism makes it possible to develop tools that facilitate the manipulation of a robotic arm performing specific movements. The method combines the Continuously Adaptive Mean-Shift (CamShift) algorithm, the Canny operator, and deep learning with a convolutional neural network. It achieves an accuracy of 97.50% in recognizing the gesture patterns, as observed in the statistical data obtained.
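To illustrate the edge-extraction stage of the pipeline, the sketch below computes a Sobel gradient-magnitude edge map on a synthetic silhouette. This is only the first stage of the full Canny operator used in the paper (Canny additionally applies Gaussian smoothing, non-maximum suppression, and hysteresis thresholding); the function name, threshold value, and synthetic test image are illustrative assumptions, not part of the original method.

```python
import numpy as np

def sobel_edges(img, thresh=1.0):
    """Gradient-magnitude edge map: the gradient-estimation stage of
    the Canny operator. The full Canny pipeline also adds Gaussian
    smoothing, non-maximum suppression, and hysteresis thresholding."""
    # Sobel kernels for horizontal and vertical gradients.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Convolve the interior pixels with both kernels.
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    # Threshold the gradient magnitude to obtain a binary edge map.
    mag = np.hypot(gx, gy)
    return mag > thresh

# Synthetic "hand silhouette": a bright square on a dark background.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
edges = sobel_edges(img)
```

In the paper's pipeline, a map like `edges` would be computed over the CamShift-tracked hand region and the resulting contour features fed to the CNN classifier.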