SENSYNC: Instrumentation of a Vehicular Platform for Multisensor Synchronization Testing and Dataset Generation

  • Alexsandro Ferreira Coelho, UFPE
  • Abel G. Silva-Filho, UFPE
  • Alef Gabryel Lorenco da Costa, UFPE
  • Lucas Alves Barbosa, UFPE

Abstract

Multimodal perception systems used in autonomous vehicles rely on precise temporal synchronization between sensors such as cameras, radars, and LiDAR. Misalignments of only a few milliseconds can compromise perception and decision-making in dynamic scenarios. This work presents the SENSYNC architecture, an entirely software-based solution built on ROS 2 for acquiring and synchronizing sensor data in a Jeep Renegade equipped with three cameras, three FMCW radars, and a VLP-16 LiDAR. The platform is evaluated under real urban traffic conditions, quantifying the temporal offsets between sensor pairs and the robustness of multimodal recording. The results show average offsets close to 1 ms for radar/camera pairs and typically below 25 ms for radar/LiDAR pairs, values suitable for multimodal fusion in urban environments. Approximately 12 GB of multimodal data were collected at an average recording rate of 29 MB/s (1.74 GB/min). These results confirm the feasibility of fully software-based synchronization in real vehicles.
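The abstract describes soft (software-only) synchronization of camera, radar, and LiDAR streams in ROS 2, together with measurement of the residual offsets between sensor pairs. A minimal sketch of how such a scheme can be expressed with ROS 2's message_filters package is shown below; the topic names, message types, queue size, and 25 ms tolerance are illustrative assumptions, not the configuration reported in the paper.

    # Sketch of software-only multisensor synchronization in ROS 2, in the
    # spirit of SENSYNC. Topic names, message types, and the 25 ms slop are
    # illustrative assumptions, not the paper's actual configuration.
    import rclpy
    from rclpy.node import Node
    from message_filters import Subscriber, ApproximateTimeSynchronizer
    from sensor_msgs.msg import Image, PointCloud2

    class SoftSyncNode(Node):
        def __init__(self):
            super().__init__('soft_sync_node')
            # One subscriber per sensor stream (hypothetical topic names).
            cam = Subscriber(self, Image, '/camera_front/image_raw')
            radar = Subscriber(self, PointCloud2, '/radar_front/points')
            lidar = Subscriber(self, PointCloud2, '/velodyne_points')
            # Group messages whose header stamps differ by at most `slop` s.
            self.sync = ApproximateTimeSynchronizer(
                [cam, radar, lidar], queue_size=30, slop=0.025)
            self.sync.registerCallback(self.on_synced)

        def on_synced(self, cam_msg, radar_msg, lidar_msg):
            # Quantify the residual inter-sensor offsets in milliseconds,
            # analogous to the radar/camera and radar/LiDAR offsets reported.
            def stamp_ns(msg):
                return (msg.header.stamp.sec * 1_000_000_000
                        + msg.header.stamp.nanosec)
            cam_radar_ms = abs(stamp_ns(cam_msg) - stamp_ns(radar_msg)) / 1e6
            radar_lidar_ms = abs(stamp_ns(radar_msg) - stamp_ns(lidar_msg)) / 1e6
            self.get_logger().info(
                f'offsets: camera/radar {cam_radar_ms:.2f} ms, '
                f'radar/LiDAR {radar_lidar_ms:.2f} ms')

    def main():
        rclpy.init()
        rclpy.spin(SoftSyncNode())

    if __name__ == '__main__':
        main()

For recording, a multimodal bag of the same topics could be captured with the standard CLI, e.g. "ros2 bag record /camera_front/image_raw /radar_front/points /velodyne_points" (topic names again assumed).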

Keywords: Synchronization, ROS 2, Sensors, Autonomous Vehicles, Multimodal Perception

Published
24/11/2025
COELHO, Alexsandro Ferreira; SILVA-FILHO, Abel G.; COSTA, Alef Gabryel Lorenco da; BARBOSA, Lucas Alves. SENSYNC: Instrumentalização de Plataforma Veicular para Testes de Sincronização Multissensor e Geração de Datasets. In: WORKSHOP LATINOAMERICANO DE DEPENDABILIDADE E SEGURANÇA EM SISTEMAS VEICULARES (SSV), 2., 2025, Campinas/SP. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 17-20.