Weed detection on embedded systems using computer vision algorithms
D. Shadrin 1,2, S. Illarionova 1, R. Kasatov 1,3, M. Akimenkova 1, G. Rudensky 1, E. Erhan 1
1 Skolkovo Institute of Science and Technology,
143026, Bolshoy Bulvar 42, bldg. 1, Skolkovo, Moscow, Russia;
2 Irkutsk National Research Technical University,
664074, Lermontov st. 83, Irkutsk, Russia;
3 ITMO University,
197101, Kronverksky Pr. 49, bldg. A, St. Petersburg, Russia
DOI: 10.18287/2412-6179-CO-1454
Pages: 103-111.
Article language: English.
Abstract:
Agriculture is a vital component of the sustainable development of many states. It supports economic growth and ensures food security. Therefore, great attention is paid to increasing production efficiency and yields. One of the problems in the agricultural sector is weed spreading, which can reduce both the quality and the quantity of yields. To achieve a better harvest, weed control measures should be conducted in time. Currently, computer vision techniques are implemented in various areas of industry, in particular in agriculture. They make it possible to automate data analysis and to make decisions faster. However, the weed detection task in agriculture requires not only high recognition accuracy, but also fast computations on portable devices with limited memory, which makes it possible to embed computer vision systems on unmanned aerial vehicles (UAVs). To address these challenges, we propose a neural-network-based approach for real-time weed recognition that combines state-of-the-art detection architectures and optimization techniques for faster inference. To conduct a comprehensive study using real field data, we collected and labelled two unique datasets in Volgograd Region. The experiments involved YOLO, SSD, and Faster R-CNN architectures with inference on an NVIDIA Jetson Nano. The highest results were achieved with the YOLOv5 architecture, with an mAP of 0.668 for the Carrot Dataset (two weed classes) and 0.882 for the Onion Dataset (one weed class), while the inference speed reached 29 FPS and 31 FPS, respectively.
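For illustration only, the snippet below is a minimal sketch of the kind of real-time detection pipeline the abstract describes: it loads a pretrained YOLOv5 model through the public Ultralytics torch.hub entry point and times repeated forward passes to estimate FPS. The weights, image path, and confidence threshold are hypothetical placeholders, not the authors' trained models, datasets, or deployment code.

```python
# Minimal sketch (not the authors' code): YOLOv5 inference with a rough FPS estimate.
# 'field_frame.jpg' is a hypothetical image path; replace 'yolov5s' with custom
# weights, e.g. torch.hub.load('ultralytics/yolov5', 'custom', path='weeds.pt').
import time
import torch

# The hub loader returns an AutoShape model and uses the GPU automatically if available.
model = torch.hub.load('ultralytics/yolov5', 'yolov5s')
model.conf = 0.25  # confidence threshold for reported detections (placeholder value)

image = 'field_frame.jpg'  # hypothetical test image

# Warm-up pass so one-time initialization does not skew the timing
_ = model(image)

# Time repeated forward passes to estimate frames per second
n_runs = 50
start = time.perf_counter()
for _ in range(n_runs):
    results = model(image)
elapsed = time.perf_counter() - start
print(f'Average FPS: {n_runs / elapsed:.1f}')

# Detections as a pandas DataFrame: xmin, ymin, xmax, ymax, confidence, class, name
print(results.pandas().xyxy[0])
```

On embedded boards such as the Jetson Nano, further speedups are commonly obtained by exporting the model to an optimized runtime such as TensorRT (cited in the reference list below, Jeong et al.), which is one typical route to the faster inference the abstract refers to.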
Keywords:
weed detection, computer vision, deep learning, precision agriculture.
Acknowledgments
The work of Svetlana Illarionova was supported by the Russian Science Foundation (Project No. 23-71-01122).
Citation:
Shadrin D, Illarionova S, Kasatov R, Akimenkova M, Rudensky G, Erhan E. Weed detection on embedded systems using computer vision algorithms. Computer Optics 2025; 49(1): 103-111. DOI: 10.18287/2412-6179-CO-1454.
References:
- Korres NE, Burgos NR, Travlos I, et al. New directions for integrated weed management: Modern technologies, tools and knowledge discovery. Adv Agron 2019; 155: 243-319. DOI: 10.1016/bs.agron.2019.01.006.
- Gage KL, Krausz RF, Walters SA. Emerging challenges for weed management in herbicide-resistant crops. Agriculture 2019; 9(8): 180. DOI: 10.3390/agriculture9080180.
- Wang A, Zhang W, Wei X. A review on weed detection using ground-based machine vision and image processing techniques. Comput Electron Agr 2019; 158: 226-240. DOI: 10.1016/j.compag.2019.02.005.
- Lavoie C, Jodoin Y, De Merlis AG. How did common ragweed (Ambrosia artemisiifolia L.) spread in Québec? A historical analysis using herbarium records. J Biogeogr 2007; 34(10): 1751-1761. DOI: 10.1111/j.1365-2699.2007.01730.x.
- Mahmud MSA, Abidin MSZ, Emmanuel AA, Hasan HS. Robotics and automation in agriculture: present and future applications. Applications of Modelling and Simulation 2020; 4: 130-140.
- Illarionova S, Shadrin D, Tregubova P, Ignatiev V, Efimov A, Oseledets I, Burnaev E. A survey of computer vision techniques for forest characterization and carbon monitoring tasks. Remote Sens 2022; 14(22): 5861. DOI: 10.3390/rs14225861.
- Menshchikov A, Shadrin D, Prutyanov V, Lopatkin D, Sosnin S, Tsykunov E, Iakovlev E, Somov A. Real-time detection of hogweed: UAV platform empowered by deep learning. IEEE Trans Comput 2021; 70(8): 1175-1188. DOI: 10.1109/TC.2021.3059819.
- Wu Z, Chen Y, Zhao B, Kang X, Ding Y. Review of weed detection methods based on computer vision. Sensors 2021; 21(11): 3647. DOI: 10.3390/s21113647.
- Hasan ASMM, Sohel F, Diepeveen D, Laga H, Jones MGK. A survey of deep learning techniques for weed detection from images. Comput Electron Agr 2021; 184: 106067. DOI: 10.1016/j.compag.2021.106067.
- Alam MS, Alam M, Tufail M, Khan MU, Güneş A, Salah B, Nasir FE, Saleem W, Khan MT. TobSet: A new tobacco crop and weeds image dataset and its utilization for vision-based spraying by agricultural robots. Appl Sci 2022; 12(3): 1308. DOI: 10.3390/app12031308.
- Le VNT, Truong G, Alameh K. Detecting weeds from crops under complex field environments based on Faster RCNN. 2020 IEEE Eighth Int Conf on Communications and Electronics (ICCE) 2021: 350-355. DOI: 10.1109/ICCE48956.2021.9352073.
- Gao J, French AP, Pound MP, He Y, Pridmore TP, Pieters JG. Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields. Plant Methods 2020; 16: 29. DOI: 10.1186/s13007-020-00570-z.
- Revanasiddappa B, Arvind C, Swamy S, et al. Real-time early detection of weed plants in pulse crop field using drone with IoT. Technology 2020; 16: 1227-1242. DOI: 10.13140/RG.2.2.27656.03845.
- Psiroukis V, Espejo-Garcia B, Chitos A, Dedousis A, Karantzalos K, Fountas S. Assessment of different object detectors for the maturity level classification of broccoli crops using UAV imagery. Remote Sens 2022; 14(3): 731. DOI: 10.3390/rs14030731.
- Deng J, Zhong Z, Huang H, Lan Y, Han Y, Zhang Y. Lightweight semantic segmentation network for real-time weed mapping using unmanned aerial vehicles. Appl Sci 2020; 10(20): 7132. DOI: 10.3390/app10207132.
- Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. NIPS'15: Proc 28th Int Conf on Neural Information Processing Systems 2015; 1: 91-99.
- Redmon J, Divvala S, Girshick R, Farhadi A. You only look once: Unified, real-time object detection. 2016 IEEE Conf on Computer Vision and Pattern Recognition (CVPR) 2016: 779-788. DOI: 10.1109/CVPR.2016.91.
- Liu W, Anguelov D, Erhan D, Szegedy C, Reed S, Fu C-Y, Berg AC. SSD: Single shot multibox detector. In Book: Leibe B, Matas J, Sebe N, Welling M, eds. Computer Vision – ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, October 11–14, 2016, Proceedings, Part I. Cham, Switzerland: Springer International Publishing AG; 2016: 21-37. DOI: 10.1007/978-3-319-46448-0_2.
- Girshick R. Fast R-CNN. 2015 IEEE Int Conf on Computer Vision (ICCV) 2015: 1440-1448. DOI: 10.1109/ICCV.2015.169.
- Wan S, Goudos S. Faster R-CNN for multi-class fruit detection using a robotic vision system. Comput Netw 2020; 168: 107036. DOI: 10.1016/j.comnet.2019.107036.
- Dolgaia L, Illarionova S, Nesteruk S, Krivolapov I, Baldycheva A, Somov A, Shadrin D. Apple tree health recognition through the application of transfer learning for UAV imagery. 2023 IEEE 28th Int Conf on Emerging Technologies and Factory Automation (ETFA) 2023: 1-8. DOI: 10.1109/ETFA54631.2023.10275369.
- Wang Y, Xing Z, Ma L, Qu A, Xue J. Object detection algorithm for lingwu long jujubes based on the improved SSD. Agriculture 2022; 12(9): 1456. DOI: 10.3390/agriculture12091456.
- Howard AG, Zhu M, Chen B, Kalenichenko D, Wang W, Weyand T, Andreetto M, Adam H. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv Preprint. 2017. Source: <https://arxiv.org/abs/1704.04861>. DOI: 10.48550/arXiv.1704.04861.
- Sandler M, Howard A, Zhu M, Zhmoginov A, Chen L-C. MobileNetV2: Inverted residuals and linear bottlenecks. Proc IEEE Conf on Computer Vision and Pattern Recognition 2018: 4510-4520. DOI: 10.1109/CVPR.2018.00474.
- Simonyan K, Zisserman A. Very deep convolutional networks for large-scale image recognition. arXiv Preprint. 2014. Source: <https://arxiv.org/abs/1409.1556>. DOI: 10.48550/arXiv.1409.1556.
- Jeong EJ, Kim J, Ha S. TensorRT-based framework and optimization methodology for deep learning inference on Jetson boards. ACM Transactions on Embedded Computing Systems (TECS) 2022; 21(5): 51. DOI: 10.1145/3508391.
- Padilla R, Passos WL, Dias TLB, Netto SL, Da Silva EAB. A comparative analysis of object detection metrics with a companion open-source toolkit. Electronics 2021; 10(3): 279. DOI: 10.3390/electronics10030279.
- Zhu H, Wei H, Li B, Yuan X, Kehtarnavaz N. A review of video object detection: Datasets, metrics and methods. Appl Sci 2020; 10(21): 7834. DOI: 10.3390/app10217834.
- Illarionova S, Shadrin D, Ignatiev V, Shayakhmetov S, Trekin A, Oseledets I. Augmentation-based methodology for enhancement of trees map detalization on a large scale. Remote Sens 2022; 14(9): 2281. DOI: 10.3390/rs14092281.
- Huang G, Laradji I, Vazquez D, Lacoste-Julien S, Rodriguez P. A survey of self-supervised and few-shot object detection. IEEE Trans Pattern Anal Mach Intell 2022; 45(4): 4071-4089. DOI: 10.1109/TPAMI.2022.3199617.
- Nesteruk S, Illarionova S, Zherebzov I, Traweek C, Mikhailova N, Somov A, Oseledets I. PseudoAugment: Enabling smart checkout adoption for new classes without human annotation. IEEE Access 2023; 11: 76869-76882. DOI: 10.1109/ACCESS.2023.3296854.
- Mwitta C, Rains GC, Prostko E. Evaluation of diode laser treatments to manage weeds in row crops. Agronomy 2022; 12(11): 2681. DOI: 10.3390/agronomy12112681.