VoXED: an online XED file extractor

  • Amadeo Tato Cota Neto UFPE
  • Bruno Cardoso Dantas UFPE
  • Joao Marcelo Xavier Natario Teixeira UFPE
  • Veronica Teichrieb UFPE

Abstract

Kinect devices are still used today for color and depth capture in many activities, providing data crucial to a wide range of research. The Kinect V1, however, stores its recordings in an XED file, which cannot be opened directly. Existing tools for extracting data from these files include the Kinect Studio application and the XED extractor developed by Daniel Jackson. Although valuable, both demand considerable technical knowledge from the user, who must either compile source code or develop an application that consumes the stored data as if it were streamed live, with a Kinect device physically connected to the computer. In this paper, we propose a web service that retrieves color data from XED files more easily than the existing tools, returning a compressed archive containing the images stored in the file. The obtained results show that the stored images are successfully recovered, making it possible to reuse old XED files holding important information such as body posture datasets.
Keywords: Kinect Studio, XED file extraction, body pose information
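As a usage sketch of the workflow described in the abstract: the service returns a compressed archive of the color frames extracted from an XED file. The snippet below, a minimal sketch using only the Python standard library, shows how such a downloaded archive could be inspected; the frame file names and the assumption that frames are stored as PNG entries are illustrative, not part of the paper.

```python
import io
import zipfile

def list_extracted_frames(archive_bytes: bytes) -> list[str]:
    """Return the image entry names found in a downloaded archive."""
    with zipfile.ZipFile(io.BytesIO(archive_bytes)) as zf:
        return [name for name in zf.namelist() if name.endswith(".png")]

# Simulate a downloaded archive holding two color frames
# (hypothetical names; the real naming scheme may differ).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("frame_0000.png", b"\x89PNG placeholder")
    zf.writestr("frame_0001.png", b"\x89PNG placeholder")

frames = list_extracted_frames(buf.getvalue())
```

From here, each entry could be extracted with `ZipFile.extractall` and fed to an image library for further processing.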

Published
18/10/2023
How to Cite

COTA NETO, Amadeo Tato; DANTAS, Bruno Cardoso; TEIXEIRA, Joao Marcelo Xavier Natario; TEICHRIEB, Veronica. VoXED: an online XED file extractor. In: CONGRESSO LATINO-AMERICANO DE SOFTWARE LIVRE E TECNOLOGIAS ABERTAS (LATINOWARE), 20., 2023, Foz do Iguaçu/PR. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 33-39. DOI: https://doi.org/10.5753/latinoware.2023.236298.