Um Estudo Comparativo de Redes Convolucionais Profundas para Detecção de Insetos em Imagens

  • Jéssica Regina Di Domênico (IFSul)
  • Douglas Lau (Embrapa)
  • Daniel Delfini Ribeiro (IFSul)
  • Rafael Rieder (UPF)
  • Telmo De Cesaro Júnior (IFSul)

Abstract


This work presents a comparative study of two deep convolutional network models for the identification and counting of insects in digital images, considering aphids (Hemiptera: Aphididae) and parasitoids (Hymenoptera: Aphelinidae and Braconidae, Aphidiinae). In this case study, each image can contain hundreds of specimens, debris, overlaps, and other insects with similar morphology, which makes detection difficult. We compared the InsectCV system, based on Mask R-CNN, with a new model trained with the DarkNet network, in terms of training time, inference time, and precision. By using grayscale images with smaller dimensions, GPU processing, and a one-stage convolutional network, it is possible to reduce the computational cost and increase the precision of the object detection task. On the 580 images used to validate the proposed model, it was possible to obtain a mean Average Precision of 79.9%.
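As a rough illustration of the preprocessing the abstract describes (grayscale images with smaller dimensions feeding a one-stage detector), the sketch below, which is not the authors' actual pipeline, converts trap photos to single-channel grayscale and downscales them before DarkNet/YOLOv4 training. The folder names and the 608x608 target size are assumptions made for the example.

import glob
import os

import cv2  # OpenCV for image I/O and resizing

SRC_DIR = "images/raw"    # hypothetical folder with the original trap photos
DST_DIR = "images/gray"   # hypothetical output folder for preprocessed images
TARGET = (608, 608)       # assumed network input size (width, height)

os.makedirs(DST_DIR, exist_ok=True)
for path in glob.glob(os.path.join(SRC_DIR, "*.jpg")):
    img = cv2.imread(path)                        # load as BGR color image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # collapse to one channel
    small = cv2.resize(gray, TARGET, interpolation=cv2.INTER_AREA)
    cv2.imwrite(os.path.join(DST_DIR, os.path.basename(path)), small)

With images prepared this way, the matching Darknet configuration (AlexeyAB fork) would set channels=1 and width/height to the chosen input size, so the network consumes the smaller single-channel tensors directly; the exact values used in the paper are not given in the abstract.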

References

M. Savaris, S. Lampert, J. Salvadori, D. Lau, P. d. S. Pereira, and M. Smaniotto, Population growth and damage caused by Rhopalosiphum padi (L.) (Hemiptera, Aphididae) on different cultivars and phenological stages of wheat, Neotropical Entomology, vol. 42, no. 5, pp. 539-543, 2013.

G. D. Heathcote, The comparison of yellow cylindrical, flat and water traps, and of Johnson suction traps, for sampling aphids, Annals of Applied Biology, vol. 45, no. 1, pp. 133-139, 1957. [Online]. Available: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1744-7348.1957.tb00449.x

R. Morris, First experiences with water traps, Methodology, 2018.

L. C. Wright and W. W. Cone, Population Dynamics of Brachycorynella asparagi (Homoptera: Aphididae) on Undisturbed Asparagus in Washington State, Environmental Entomology, vol. 17, no. 5, pp. 878-886, Oct. 1988.

D. Lau, Plataforma integrada para monitoramento, simulação e tomada de decisão no manejo de epidemias causadas por vírus transmitidos por insetos, in X Simpósio Sobre Atualidades em Fitopatologia, 2020, pp. 83-91. [Online]. Available: [link].

E. Trigo, Rede de monitoramento de pragas em cereais de inverno, 2015. [Online]. Available: [link].

E. A. Lins, J. P. M. Rodriguez, S. I. Scoloski, J. Pivato, M. B. Lima, J. M. C. Fernandes, P. R. V. da Silva Pereira, D. Lau, and R. Rieder, A method for counting and classifying aphids using computer vision, Computers and Electronics in Agriculture, vol. 169, p. 105200, 2020. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0168169919306039

T. De Cesaro Jr., InsectCV: um sistema para detecção de insetos em imagens digitais, Master's thesis, Programa de Pós-Graduação em Computação Aplicada, Instituto de Ciências Exatas e Geociências (ICEG), 2020. [Online]. Available: http://tede.upf.br:8080/jspui/handle/tede/1956

T. De Cesaro Jr., R. Rieder, D. Lau, and J. R. Di Domênico, InsectCV, computer program, registration number BR512021000542-2, registered 19 Mar. 2021 at INPI - Instituto Nacional da Propriedade Industrial.

J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, You only look once: Unified, real-time object detection, in Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 779-788.

J. Redmon, Darknet: Open source neural networks in C, http://pjreddie.com/darknet/, 2013-2016.

A. Bochkovskiy, C.-Y. Wang, and H.-Y. M. Liao, Yolov4: Optimal speed and accuracy of object detection, 2020.

A. Kamilaris and F. X. Prenafeta-Boldú, Deep learning in agriculture: A survey, Computers and Electronics in Agriculture, vol. 147, pp. 70-90, 2018. [Online]. Available: https://doi.org/10.1016/j.compag.2018.02.016

T. De Cesaro Jr. and R. Rieder, Automatic identification of insects from digital images: A survey, Computers and Electronics in Agriculture, vol. 178, p. 105784, 2020. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0168169920311224

Y. LeCun, Y. Bengio, and G. Hinton, Deep learning, Nature, vol. 521, no. 7553, pp. 436-444, 2015. [Online]. Available: https://doi.org/10.1038/nature14539

Y. Sun, X. Liu, M. Yuan, L. Ren, J. Wang, and Z. Chen, Automatic in-trap pest detection using deep learning for pheromone-based Dendroctonus valens monitoring, Biosystems Engineering, vol. 176, pp. 140-150, Dec. 2018. [Online]. Available: https://doi.org/10.1016/j.biosystemseng.2018.10.012

S. Ren, K. He, R. Girshick, and J. Sun, Faster R-CNN: Towards real-time object detection with region proposal networks, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, no. 6, pp. 1137-1149, 2016. [Online]. Available: https://doi.org/10.1109/TPAMI.2016.2577031

J. Dai, Y. Li, K. He, and J. Sun, R-FCN: Object detection via region-based fully convolutional networks, in Proceedings of the 30th International Conference on Neural Information Processing Systems, ser. NIPS'16. Red Hook, NY, USA: Curran Associates Inc., 2016, pp. 379-387. [Online]. Available: https://dl.acm.org/doi/10.5555/3157096.3157139

K. He, G. Gkioxari, P. Dollár, and R. Girshick, Mask R-CNN, in 2017 IEEE International Conference on Computer Vision (ICCV), 2017, pp. 2980-2988. [Online]. Available: https://doi.org/10.1109/ICCV.2017.322

W. Liu, D. Anguelov, D. Erhan, C. Szegedy, S. Reed, C.-Y. Fu, and A. C. Berg, SSD: Single shot multibox detector, in Computer Vision - ECCV 2016, B. Leibe, J. Matas, N. Sebe, and M. Welling, Eds. Cham: Springer International Publishing, 2016, pp. 21-37. [Online]. Available: https://doi.org/10.1007/978-3-319-46448-0_2

T.-Y. Lin, P. Goyal, R. Girshick, K. He, and P. Dollár, Focal loss for dense object detection, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 42, no. 2, pp. 318-327, 2018. [Online]. Available: https://doi.org/10.1109/TPAMI.2018.2858826

M. Saeidi and A. Ahmadi, Deep Learning Based on Parallel CNNs for Pedestrian Detection, International Journal of Information & Communication Technology Research, vol. 10, no. 4, 2018. [Online]. Available: http://ijict.itrc.ac.ir/article-1-410-en.html

C.-Y. Wang, A. Bochkovskiy, and H. Liao, Scaled-yolov4: Scaling cross stage partial network, ArXiv, vol. abs/2011.08036, 2020.

J. Redmon and A. Farhadi, Yolo9000: Better, faster, stronger, arXiv preprint arXiv:1612.08242, 2016.

J. Redmon and A. Farhadi, Yolov3: An incremental improvement, arXiv preprint arXiv:1804.02767, 2018.

G. Jocher, yolov5: v5.0 - yolov5-p6 1280 models, aws, supervise.ly and youtube integrations, 2020. [Online]. Available: https://zenodo.org/record/4679653#.YOSVzXDPzIV

X. Long, K. Deng, G. Wang, Y. Zhang, Q. Dang, Y. Gao, H. Shen, J. Ren, S. Han, E. Ding, and S. Wen, Pp-yolo: An effective and efficient implementation of object detector, 2020.

A. Bochkovskiy, Darknet: YOLOv4 / Scaled-YOLOv4 / YOLO - neural networks for object detection (Windows and Linux version of Darknet). [Online]. Available: https://github.com/AlexeyAB/darknet

S. Charette, DarkHelp: C++ wrapper library for Darknet. [Online]. Available: https://www.ccoderun.ca/darkhelp/api/index.html
Published
2021-10-18
DI DOMÊNICO, Jéssica Regina; LAU, Douglas; RIBEIRO, Daniel Delfini; RIEDER, Rafael; CESARO JÚNIOR, Telmo De. Um Estudo Comparativo de Redes Convolucionais Profundas para Detecção de Insetos em Imagens. In: WORKSHOP OF UNDERGRADUATE WORKS - CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI), 34., 2021, Online. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021. p. 183-188. DOI: https://doi.org/10.5753/sibgrapi.est.2021.20036.
