Natural Language-Based Assisted Navigation in a Smart Walker
Abstract
Mobility is a determining factor in maintaining individuals' quality of life, especially in a world whose population is steadily aging. In this regard, smart walkers (i.e., traditional walkers augmented with sensors and intelligent systems) emerge as a promising solution for gait monitoring and locomotion assistance. Human-robot interaction is a fundamental area of smart walker development and, with the rise of Large Language Models (LLMs), there is an opportunity to enable more natural and adaptive communication channels for context-aware interaction. In this work, we propose an intelligent system for assisted navigation in smart walkers that integrates an LLM-based multi-agent system with ROS 2 to dynamically set interaction modes. The system operates in three stages: command detection, operating mode definition, and secondary mode selection. To validate the proposed solution, two datasets were created: one with textual commands and another with voice commands transcribed using Whisper. The results demonstrate promising performance, with accuracy, precision, recall, and F1-score values above 86 percent in all stages for both scenarios. These results pave the way for a practical implementation on the robot itself and for validation in realistic scenarios.
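As a rough illustration of the pipeline summarized above, the sketch below shows how a voice command could be transcribed with Whisper, mapped to an operating mode, and published over ROS 2. This is not the authors' implementation: the node and topic names, the `classify_mode` stub standing in for the LLM-based multi-agent stages, and the example keywords are all assumptions for illustration only.

```python
# Illustrative sketch (assumed names and stub logic, not the paper's code):
# transcribe a voice command with Whisper, pick an operating mode, and
# publish it on a ROS 2 topic for downstream navigation nodes.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String
import whisper  # openai-whisper


def classify_mode(command_text: str) -> str:
    """Keyword stub standing in for the LLM-based multi-agent stages
    (command detection -> operating mode -> secondary mode)."""
    text = command_text.lower()
    if "follow" in text:
        return "follow_me"
    if "guide" in text or "take me" in text:
        return "assisted_navigation"
    return "manual"


class WalkerInteractionNode(Node):
    def __init__(self):
        super().__init__("walker_interaction")
        # Assumed topic name; navigation nodes would subscribe to it.
        self.mode_pub = self.create_publisher(String, "walker/interaction_mode", 10)
        self.asr = whisper.load_model("base")  # speech-to-text model

    def handle_audio(self, wav_path: str) -> None:
        text = self.asr.transcribe(wav_path)["text"]  # voice -> text
        mode = classify_mode(text)                    # text -> operating mode
        self.mode_pub.publish(String(data=mode))      # announce selected mode
        self.get_logger().info(f"'{text.strip()}' -> mode: {mode}")


def main():
    rclpy.init()
    node = WalkerInteractionNode()
    node.handle_audio("command.wav")  # e.g., a recorded utterance
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```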
