An adaptable mobile robot platform with vision-based perception for precision agriculture
Abstract
This paper presents the development of a versatile mobile robot platform designed for precision agriculture. It describes the robot's overall architecture and manufacturing, and discusses the vision-based perception modules incorporated for road segmentation in farm environments and for maize stem detection. For road segmentation, the Segment Anything Model (SAM) was applied in a zero-shot setting. The SAM algorithm effectively extracted navigable space in challenging scenarios, demonstrating its robustness and adaptability. Computational efficiency was also considered, motivating a future implementation on low-power devices. For the maize stem detection module, a comprehensive dataset of maize stem images was collected from a local agricultural field. The images were processed with the YOLOv5 model, yielding a highly accurate and efficient maize stem detection module. The validation of both perception modules highlights the successful integration of vision-based technologies into the platform. The platform's adaptability and robustness make it a valuable tool for precision agriculture applications. By leveraging these technologies, the proposed vehicle contributes to improved crop monitoring and management, enhancing overall agricultural practices.
Keywords:
Robots and Automation in Agriculture and Forestry, Vision-Based Navigation, Deep Learning for Visual Perception, Product Design, Development and Prototyping
Published
09/10/2023
How to Cite
OLIVEIRA, Hugo; VANGASSE, Arthur; SOARES, Lívia; OLIVEIRA, Andressa; FERREIRA, Bruno; LEITE, Glauber; ARAÚJO, Ícaro; BRITO, Davi.
An adaptable mobile robot platform with vision-based perception for precision agriculture. In: SIMPÓSIO BRASILEIRO DE ROBÓTICA E SIMPÓSIO LATINO AMERICANO DE ROBÓTICA (SBR/LARS), 15., 2023, Salvador/BA. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 466-471.