
Quality inspection of fertilizer granules using computer vision – a review
I.K. Ndukwe 1, D.V. Yunovidov 2, M.R. Bahrami 1,4, M. Mazzara 1, T.O. Olugbade 3

1 Innopolis University,
University Street 1, Innopolis, Tatarstan, 420500, Russia;
2 Logic Yield,
Gvardeyskaya Street 14, Kazan, Tatarstan, 420073, Russia;
3 University of Dundee,
Dundee, Scotland, DD1 4HN, United Kingdom;
4 Samarkand International University of Technology,
Samarkand 140100, Uzbekistan


DOI: 10.18287/2412-6179-CO-1458

Pages: 84-94.

Article language: English.

Abstract:
This research explores the fusion of computer vision and agricultural quality control. It investigates the efficacy of computer vision algorithms, particularly image classification and object detection, for non-destructive quality assessment; compared with human inspection, these algorithms offer objective, rapid, and less error-prone analysis.
     The study provides an extensive overview of the use of computer vision to evaluate grain and fertilizer granule quality, highlighting the significance of granule size. It assesses the prevailing object detection methods, outlining their advantages and drawbacks.
     The paper identifies the prevailing trend of framing quality inspection as an image classification problem and suggests future research directions, which involve exploring object detection, image segmentation, or hybrid models to enhance fertilizer granule quality assessment.
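     The object-detection direction suggested above can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example, not taken from the paper: it assumes the Ultralytics YOLO API [82], an illustrative weights file "granules.pt" fine-tuned on granule images, a known image scale (mm_per_pixel), and an illustrative 2-5 mm target size range. It converts detected bounding boxes into an approximate granule size distribution, reflecting the emphasis on granule size as a quality criterion.

    # Minimal sketch: granule size estimation from object detection (hypothetical example).
    # Assumes the Ultralytics YOLO API [82]; "granules.pt" is an illustrative fine-tuned
    # model, and the 2-5 mm target range is an illustrative specification, not from the paper.
    from ultralytics import YOLO
    import numpy as np

    model = YOLO("granules.pt")  # hypothetical detector trained on granule images

    def granule_size_distribution(image_path: str, mm_per_pixel: float) -> np.ndarray:
        """Approximate each detected granule's diameter (mm) from its bounding box."""
        results = model.predict(image_path, conf=0.25, verbose=False)
        boxes = results[0].boxes.xywh.cpu().numpy()  # columns: x_center, y_center, width, height
        # Use the mean of box width and height as a crude equivalent diameter.
        diameters_px = boxes[:, 2:4].mean(axis=1)
        return diameters_px * mm_per_pixel

    # Example usage: fraction of granules falling inside an illustrative 2-5 mm range.
    sizes = granule_size_distribution("batch_sample.jpg", mm_per_pixel=0.05)
    in_spec = float(np.logical_and(sizes >= 2.0, sizes <= 5.0).mean()) if sizes.size else 0.0
    print(f"Granules detected: {sizes.size}, fraction in 2-5 mm: {in_spec:.2%}")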

Keywords:
Quality control, computer vision, machine vision, machine learning, grains, fertilizer granules.

Citation:
Ndukwe IK, Yunovidov D, Bahrami MR, Mazzara M, Olugbade TO. Quality inspection of fertilizer granules using computer vision – a review. Computer Optics 2025; 49(1): 84-94. DOI: 10.18287/2412-6179-CO-1458.

References:

  1. Hignett TP, ed. Fertilizer manual. Dordrecht: Springer Science & Business Media; 1985. ISBN: 978-90-481-8290-9.
  2. UNIDO and International Fertilizer Development Center, eds. Fertilizer manual. 3rd ed. Dordrecht: Kluwer Academic Publishers; 1998. ISBN: 0-7923-5032-4.
  3. Laucka A, Adaskeviciute V, Andriukaitis D. Research of the equipment self-calibration methods for different shape fertilizers particles distribution by size using image processing measurement method. Symmetry 2019; 11(7): 838. DOI: 10.3390/sym11070838.
  4. Laucka A, Andriukaitis D, Valinevicius A, Navikas D, Zilys M, Markevicius V, Klimenta D, Sotner R, Jerabek J. Method for volume of irregular shape pellets estimation using 2D imaging measurement. Appl Sci 2020; 10(8): 2650. DOI: 10.3390/app10082650.
  5. Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. Communications of the ACM 2017; 60(6): 84-90. DOI: 10.1145/3065386.
  6. Simonyan K, Zisserman A. Very deep convolutional networks for large-scale image recognition. arXiv Preprint. 2015. Source: <https://arxiv.org/abs/1409.1556>. DOI: 10.48550/arXiv.1409.1556.
  7. He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. 2016 IEEE Conf on Computer Vision and Pattern Recognition (CVPR) 2016: 770-778. DOI: 10.1109/CVPR.2016.90.
  8. Howard AG, Zhu M, Chen B, Kalenichenko D, Wang W, Weyand T, Andreetto M, Adam H. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv Preprint. 2017. Source: <https://arxiv.org/abs/1704.04861>. DOI: 10.48550/arXiv.1704.04861.
  9. Girshick R. Fast R-CNN. 2015 IEEE Int Conf on Computer Vision (ICCV) 2015: 1440-1448. DOI: 10.1109/ICCV.2015.169.
  10. Girshick R, Donahue J, Darrell T, Malik J. Region-based convolutional networks for accurate object detection and segmentation. IEEE Trans Pattern Anal Mach Intell 2016; 38(1): 142-158. DOI: 10.1109/TPAMI.2015.2437384.
  11. Redmon J, Divvala S, Girshick R, Farhadi A. You only look once: Unified, real-time object detection. 2016 IEEE Conf on Computer Vision and Pattern Recognition (CVPR) 2016: 779-788. DOI: 10.1109/CVPR.2016.91.
  12. Carion N, Massa F, Synnaeve G, Usunier N, Kirillov A, Zagoruyko S. End-to-end object detection with transformers. In Book: Vedaldi A, Bischof H, Brox T, Frahm JM, eds. Computer Vision – ECCV 2020. 16th European Conference, Glasgow, UK, August 23–28, 2020, Proceedings, Part I. Cham: Springer International Publishing; 2020: 213-229. DOI: 10.1007/978-3-030-58452-8_13.
  13. Liu W, Tao Y, Siebenmorgen TJ, Chen H. Digital image analysis method for rapid measurement of rice degree of milling. Cereal Chem 1998; 75(3): 380-385. DOI: 10.1094/CCHEM.1998.75.3.380.
  14. Wan YN, Lin CM, Chiou JF. Rice quality classification using an automatic grain quality inspection system. Trans ASAE 2002; 45(2): 379-387. DOI: 10.13031/2013.8509.
  15. Lloyd BJ, Cnossen AG, Siebenmorgen TJ. Evaluation of two methods for separating head rice from brokens for head rice yield determination. Appl Eng Agric 2001; 17(5): 643-648. DOI: 10.13031/2013.6902.
  16. Yadav BK, Jindal VK. Monitoring milling quality of rice by image analysis. Comput Electron Agr 2001; 33(1): 19-33. DOI: 10.1016/S0168-1699(01)00169-7.
  17. Lan Y, Fang Q, Kocher MF, Hanna MA. Detection of fissures in rice grains using imaging enhancement. Int J Food Prop 2002; 5(1): 205-215. DOI: 10.1081/JFP-120015602.
  18. Van Dalen G. Determination of the size distribution and percentage of broken kernels of rice using flatbed scanning and image analysis. Food Res Int 2004; 37(1): 51-58. DOI: 10.1016/j.foodres.2003.09.001.
  19. Marini F, Zupan J, Magrì AL. On the use of counterpropagation artificial neural networks to characterize Italian rice varieties. Anal Chim Acta 2004; 510(2): 231-240. DOI: 10.1016/j.aca.2004.01.009.
  20. Guzman JD, Peralta E, Nagatsuka T, Ninomiya S. Classification of Philippine rice grains using machine vision and artificial neural networks. World Conference on Agricultural Information and IT (IAALD AFITA WCCA2008) 2008: 41-48.
  21. Aggarwal AK, Mohan R. Aspect ratio analysis using image processing for rice grain quality. Int J Food Eng 2010; 6(5): 8. DOI: 10.2202/1556-3758.1788.
  22. Shantaiya S, Ansari U. Identification of food grains and its quality using pattern classification. Int J Comput Commun Technol 2012; 3(1): 15-19. DOI: 10.47893/IJCCT.2012.1107.
  23. Tated K, Morade S. Application of image processing for automatic cleaning of rice. Proc 1st Int Conf on Recent Trends in Engineering & Technology 2012: 215-217.
  24. Kaur H, Singh B. Classification and grading rice using multi-class SVM. International Journal of Scientific and Research Publications 2013; 3(4): 1-5. Source: <https://www.ijsrp.org/research-paper-0413/ijsrp-p16112.pdf>.
  25. Golpour I, Parian JA, Chayjan RA. Identification and classification of bulk paddy, brown, and white rice cultivars with colour features extraction using image analysis and neural network. Czech J Food Sci 2014; 32(3): 280-287. DOI: 10.17221/238/2013-CJFS.
  26. Azman N, Khairunniza-Bejo S, Ismail WIW, Wayayok A. Estimating maturity of paddy using RGB colour space. J Adv Agr Technol 2014; 1(2): 119-124. DOI: 10.12720/joaat.1.2.119-124.
  27. Anami BS, Naveen NM, Hanamaratti NG. Behavior of HSI color co-occurrence features in variety recognition from bulk paddy grain image samples. IJIGSP 2015; 8(4): 19-30. DOI: 10.14257/ijsip.2015.8.4.02.
  28. Singh KR, Chaudhury S. Efficient technique for rice grain classification using back-propagation neural network and wavelet decomposition. IET Comput Vis 2016; 10(8): 780-787. DOI: 10.1049/iet-cvi.2015.0486.
  29. Sun K, Wang Z, Tu K, Wang S, Pan L. Recognition of mould colony on unhulled paddy based on computer vision using conventional machine-learning and deep learning techniques. Sci Rep 2016; 6(1): 37994. DOI: 10.1038/srep37994.
  30. Ni B, Paulsen MR, Reid JF. Size grading of corn kernels with machine vision. Appl Eng Agr 1998; 14(5): 567-571. DOI: 10.13031/2013.19408.
  31. Ng HF, Wilcke WF, Morey RV, Lang JP. Machine vision evaluation of corn kernel mechanical and mold damage. Trans ASAE 1998; 41(2): 415-420. DOI: 10.13031/2013.17166.
  32. Steenhoek LW, Misra MK, Hurburgh Jr. CR, Bern CJ. Implementing a computer vision system for corn kernel damage evaluation. Appl Eng Agr 2001; 17(2): 235-240. DOI: 10.13031/2013.5448.
  33. Liu J, Paulsen MR. Corn whiteness measurement and classification using machine vision. Trans ASAE 2000; 43(3): 757-763. DOI: 10.13031/2013.2759.
  34. Xie W, Paulsen MR. Machine vision detection of tetrazolium staining in corn. Trans ASAE 2001; 44(2): 421-428. DOI: 10.13031/2013.4670.
  35. Altuntaş Y, Cömert Z, Kocamaz AF. Identification of haploid and diploid maize seeds using convolutional neural networks and a transfer learning approach. Comput Electron Agr 2019; 163: 104874. DOI: 10.1016/j.compag.2019.104874.
  36. Velesaca HO, Mira R, Suarez PL, Larrea CX, Sappa AD. Deep learning based Corn Kernel Classification. 2020 IEEE/CVF Conf on Computer Vision and Pattern Recognition Workshops (CVPRW) 2020: 294-302. DOI: 10.1109/CVPRW50498.2020.00041.
  37. Zayas IY, Martin CR, Steele JL, Katsevich A. Wheat classification using image analysis and crush-force parameters. Trans ASAE 1996; 39(6): 2199-2204. DOI: 10.13031/2013.27725.
  38. Ruan R, Ning S, Song A, Ning A, Jones R, Chen P. Estimation of Fusarium scab in wheat using machine vision and a neural network. Cereal Chem 1998; 75(4): 455-459. DOI: 10.1094/CCHEM.1998.75.4.455.
  39. Luo X, Jayas DS, Symons SJ. Identification of damaged kernels in wheat using a colour machine vision system. J Cereal Sci 1999; 30(1): 49-59. DOI: 10.1006/jcrs.1998.0240.
  40. Utku H. Application of the feature selection method to discriminate digitized wheat varieties. J Food Eng 2000; 46(3): 211-216. DOI: 10.1016/S0260-8774(00)00075-3.
  41. Ridgway C, Davies ER, Chambers J, Mason DR, Bateman M. AE-Automation and emerging technologies: Rapid machine vision method for the detection of insects and other particulate bio-contaminants of bulk grain in transit. Biosyst Eng 2002; 83(1): 21-30. DOI: 10.1006/bioe.2002.0096.
  42. Dubey BP, Bhagwat SG, Shouche SP, Sainis JK. Potential of artificial neural networks in varietal identification using morphometry of wheat grains. Biosyst Eng 2006; 95(1): 61-67. DOI: 10.1016/j.biosystemseng.2006.06.001.
  43. Serranti S, Cesare D, Bonifazi G. The development of a hyperspectral imaging method for the detection of Fusarium-damaged, yellow berry and vitreous Italian durum wheat kernels. Biosyst Eng 2013; 115(1): 20-30. DOI: 10.1016/j.biosystemseng.2013.01.011.
  44. Zapotoczny P. Discrimination of wheat grain varieties using image analysis and multidimensional analysis texture of grain mass. Int J Food Prop 2014; 17(1): 139-151. DOI: 10.1080/10942912.2011.615085.
  45. Ebrahimi E, Mollazade K, Babaei S. Toward an automatic wheat purity measuring device: A machine vision-based neural networks-assisted imperialist competitive algorithm approach. Measurement 2014; 55: 196-205. DOI: 10.1016/j.measurement.2014.05.003.
  46. Jirsa O, Polišenská I. Identification of Fusarium damaged wheat kernels using image analysis. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis 2014; 59(5): 125-130. DOI: 10.11118/actaun201159050125.
  47. Olgun M, Onarcan AO, Özkan K, Işik S, Sezer O, Özgişi K, Ayter NG, Başçiftçi ZB, Ardiç M, Koyuncu O. Wheat grain classification by using dense SIFT features with SVM classifier. Comput Electron Agr 2016; 122: 185-190. DOI: 10.1016/j.compag.2016.01.033.
  48. Sabanci K, Kayabasi A, Toktas A. Computer vision-based method for classification of wheat grains using artificial neural network. J Sci Food Agr 2017; 97(8): 2588-2593. DOI: 10.1002/jsfa.8080.
  49. Shahin MA, Symons SJ. A machine vision system for grading lentils. Can Biosyst Eng 2001; 43: 7.7-7.14. Source: <https://library.csbe-scgab.ca/docs/journal/43/c0019.pdf>.
  50. Shahin MA, Symons SJ. Lentil type identification using machine vision. Can Biosyst Eng 2003; 45: 3-5. Source: <https://library.csbe-scgab.ca/docs/journal/45/c0148.pdf>.
  51. Huang NF, Chou DL, Lee CA. Real-time classification of green coffee beans by using a convolutional neural network. 2019 3rd Int Conf on Imaging, Signal Processing and Communication (ICISPC) 2019: 107-111. DOI: 10.1109/ICISPC.2019.8935644.
  52. Paliwal J, Visen NS, Jayas DS, White NDG. Cereal grain and dockage identification using machine vision. Biosyst Eng 2003; 85(1): 51-57. DOI: 10.1016/S1537-5110(03)00034-5.
  53. Visen NS, Paliwal J, Jayas D, White NDG. Image analysis of bulk grain samples using neural networks. Can Biosyst Eng 2004; 46: 7.11-7.15. DOI: 10.13031/2013.15002.
  54. Lee CY, Yan L, Wang T, Lee SR, Park CW. Intelligent classification methods of grain kernels using computer vision analysis. Meas Sci Technol 2011; 22(6): 064006. DOI: 10.1088/0957-0233/22/6/064006.
  55. Gunasekaran S, Cooper TM, Berlage AG. Soybean seed coat and cotyledon crack detection by image processing. J Agr Eng Res 1988; 41(2): 139-148. DOI: 10.1016/0021-8634(88)90195-3.
  56. Ahmad IS, Reid JF, Paulsen MR, Sinclair JB. Color classifier for symptomatic soybean seeds using image processing. Plant Disease 1999; 83(4): 320-327. DOI: 10.1094/PDIS.1999.83.4.320.
  57. Shahin MA, Symons SJ, Poysa VW. Determining soya bean seed size uniformity with image analysis. Biosyst Eng 2006; 94(2): 191-198. DOI: 10.1016/j.biosystemseng.2006.02.011.
  58. Kezhu T, Yuhua C, Weixian S, Xiaoda C. Identification of diseases for soybean seeds by computer vision applying BP neural network. Int J Agr Biol Eng 2014; 7(3): 43-50. DOI: 10.3965/j.ijabe.20140703.006.
  59. ISO 13322-1:2014. Particle size analysis – Image analysis methods – Part 1: Static image analysis methods. Geneva, Switzerland: ISO copyright office; 2014.
  60. Yunovidov D, Menshikov K, Sidorova E. Robotic control system for particles size distribution of industrially produced mineral fertilizers. Int J Mech Eng Robot Res 2020; 9(12): 1560-1565. DOI: 10.18178/ijmerr.9.12.1560-1565.
  61. Yunovidov DV, Shabalov VA, Menshikov KA. Robotic system of optical control and data acquisition for analyzing the physical properties of industrially produced mineral fertilizer granules. 2021 XV Int Sci-Tech Conf on Actual Problems of Electronic Instrument Engineering (APEIE) 2021: 1-5. DOI: 10.1109/APEIE52976.2021.9647695.
  62. Buscombe D. SediNet: a configurable deep learning model for mixed qualitative and quantitative optical granulometry. Earth Surf Process Landf 2020; 45(3): 638-651. DOI: 10.1002/esp.4760.
  63. Ebner M, Geldmacher F, Marone F, Stampanoni M, Wood V. X-ray tomography of porous, transition metal oxide based lithium ion battery electrodes. Adv Energy Mater 2013; 3(7): 845-850. DOI: 10.1002/aenm.201200932.
  64. Koklu M, Cinar I, Taspinar YS. Classification of rice varieties with deep learning methods. Comput Electron Agr 2021; 187: 106285. DOI: 10.1016/j.compag.2021.106285.
  65. Koklu M, Ozkan IA. Multiclass classification of dry beans using computer vision and machine learning techniques. Comput Electron Agr 2020; 174: 105507. DOI: 10.1016/j.compag.2020.105507.
  66. Çinar I, Koklu M, Taşdemir S. Classification of raisin grains using machine vision and artificial intelligence methods. Gazi J Eng Sci 2020; 6(3): 200-209. DOI: 10.30855/gmbd.2020.03.03.
  67. Zou Z, Chen K, Shi Z, Guo Y, Ye J. Object detection in 20 years: A survey. Proc IEEE 2023; 111(3): 257-276. DOI: 10.1109/JPROC.2023.3238524.
  68. Viola P, Jones M. Rapid object detection using a boosted cascade of simple features. Proc 2001 IEEE Computer Society Conf on Computer Vision and Pattern Recognition (CVPR) 2001; 1: I-511-I-518. DOI: 10.1109/CVPR.2001.990517.
  69. Viola P, Jones MJ. Robust real-time face detection. Int J Comput Vis 2004; 57(2): 137-154. DOI: 10.1023/B:VISI.0000013087.49260.fb.
  70. Dalal N, Triggs B. Histograms of oriented gradients for human detection. 2005 IEEE Computer Society Conf on Computer Vision and Pattern Recognition (CVPR’05) 2005; 1: 886-893. DOI: 10.1109/CVPR.2005.177.
  71. Felzenszwalb P, McAllester D, Ramanan D. A discriminatively trained, multiscale, deformable part model. 2008 IEEE Conf on Computer Vision and Pattern Recognition 2008: 1-8. DOI: 10.1109/CVPR.2008.4587597.
  72. Ershov EI, Korchagin SA, Kokhan VV, Bezmaternykh PV. A generalization of Otsu method for linear separation of two unbalanced classes in document image binarization. Computer Optics 2021; 45(1): 66-76. DOI: 10.18287/2412-6179-CO-752.
  73. Bulatov KB, Bezmaternykh PV, Nikolaev DP, Arlazarov VV. Towards a unified framework for identity documents analysis and recognition. Computer Optics 2022; 46(3): 436-454. DOI: 10.18287/2412-6179-CO-1024.
  74. Bulatov KB, Emelianova EV, Tropin DV, Skoryukina NS, Chernyshova YS, Sheshkus AV, Usilin SA, Ming Z, Burie JC, Luqman MM, Arlazarov VV. MIDV-2020: a comprehensive benchmark dataset for identity document analysis. Computer Optics 2022; 46(2): 252-270. DOI: 10.18287/2412-6179-CO-1006.
  75. Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. NIPS'15: Proc 28th Int Conf on Neural Information Processing Systems 2015; 1: 91-99.
  76. Lin T, Dollár P, Girshick R, He K, Hariharan B, Belongie S. Feature pyramid networks for object detection. 2017 IEEE Conf on Computer Vision and Pattern Recognition (CVPR) 2017: 936-944. DOI: 10.1109/CVPR.2017.106.
  77. Liang TJ, Pan WG, Bao H, Pan F. Vehicle wheel weld detection based on improved YOLO v4 algorithm. Computer Optics 2022; 46(2): 271-279. DOI: 10.18287/2412-6179-CO-887.
  78. Chang R, Mao ZX, Hu J, Bai HC, Zhou CJ, Yang Y, Gao S. Research on foreign body detection in transmission lines based on a multi-UAV cooperative system and YOLOv7. Computer Optics 2023; 47(5): 788-794. DOI: 10.18287/2412-6179-CO-1257.
  79. Liu W, Anguelov D, Erhan D, Szegedy C, Reed S, Fu C, Berg AC. SSD: Single shot multi-box detector. In Book: Leibe B, Matas J, Sebe N, Welling M, eds. Computer Vision – ECCV 2016. 14th European Conference, Amsterdam, The Netherlands, October 11–14, 2016, Proceedings, Part I. Cham: Springer International Publishing; 2016: 21-37. DOI: 10.1007/978-3-319-46448-0_2.
  80. Lin T, Goyal P, Girshick R, He K, Dollár P. Focal loss for dense object detection. IEEE Trans Pattern Anal Mach Intell 2020; 42(2): 318-327. DOI: 10.1109/TPAMI.2018.2858826.
  81. Law H, Deng J. CornerNet: Detecting objects as paired keypoints. Int J Comput Vis 2020; 128: 642-656. DOI: 10.1007/s11263-019-01204-1.
  82. Jocher G, Chaurasia A, Qiu J. YOLO by Ultralytics. 2023. Source: <https://github.com/ultralytics/ultralytics>.
  83. Wang C, Bochkovskiy A, Liao HM. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. 2023 IEEE/CVF Conf on Computer Vision and Pattern Recognition (CVPR) 2023: 7464-7475. DOI: 10.1109/CVPR52729.2023.00721.
  84. Li C, Li L, Geng Y, Jiang H, Cheng M, Zhang B, Ke Z, Xu X, Chu X. YOLOv6 v3.0: A full-scale reloading. arXiv Preprint. 2023. Source: <https://arxiv.org/abs/2301.05586>. DOI: 10.48550/arXiv.2301.05586.
  85. Lyu C, Zhang W, Huang H, Zhou Y, Wang Y, Liu Y, Zhang S, Chen K. RTMDet: An empirical study of designing real-time object detectors. arXiv Preprint. 2022. Source: <https://arxiv.org/abs/2212.07784>. DOI: 10.48550/arXiv.2212.07784.
  86. Lv W, Xu S, Zhao Y, Wang G, Wei J, Cui C, Du Y, Dang Q, Liu Y. DETRs beat YOLOs on real-time object detection. arXiv Preprint. 2023. Source: <https://arxiv.org/abs/2304.08069>. DOI: 10.48550/arXiv.2304.08069.
  87. Wang CY, Yeh IH, Liao HY. YOLOv9: Learning what you want to learn using programmable gradient information. arXiv Preprint. 2024. Source: <https://arxiv.org/abs/2402.13616>. DOI: 10.48550/arXiv.2402.13616.
  88. He K, Gkioxari G, Dollár P, Girshick R. Mask R-CNN. 2017 IEEE Int Conf on Computer Vision (ICCV) 2017: 2961-2969. DOI: 10.1109/ICCV.2017.322.
  89. Kirillov A, Mintun E, Ravi N, Mao H, Rolland C, Gustafson L, Xiao T, Whitehead S, Berg AC, Lo W-Y, Dollár P, Girshick R. Segment anything. Proc IEEE/CVF Int Conf on Computer Vision (ICCV) 2023: 4015-4026. DOI: 10.1109/ICCV51070.2023.00371.
  90. Zhao X, Ding W, An Y, Du Y, Yu T, Li M, Tang M, Wang J. Fast segment anything. arXiv Preprint. 2023. Source: <https://arxiv.org/abs/2306.12156>. DOI: 10.48550/arXiv.2306.12156.
