
One-shot learning with triplet loss for vegetation classification tasks
A.V. Uzhinskiy 1,2, G.A. Ososkov 1, P.V. Goncharov 1, A.V. Nechaevskiy 1,2, A.A. Smetanin 3

1 Joint Institute for Nuclear Research, Joliot-Curie 6, Dubna 141980, Russia,
2 Russian State Agrarian University - Moscow Timiryazev Agricultural Academy, Timiryazevskaya st. 49, Moscow, Russia,
3 National Research University ITMO, Kronverkskiy pr. 49, Saint Petersburg 197101, Russia


DOI: 10.18287/2412-6179-CO-856

Pages: 608-614.

Article language: English

Abstract:
The triplet loss function is one of the options that can significantly improve the accuracy of one-shot learning tasks. Since 2015, many projects have used Siamese networks with this kind of loss for face recognition and object classification. In our research, we focused on two vegetation-related tasks. The first is plant disease detection across 25 classes of five crops (grape, cotton, wheat, cucumber, and corn). This task is motivated by the fact that harvest losses due to diseases are a serious problem for both large farming structures and rural families. The second task is the identification of moss species (5 classes). Mosses are natural bioaccumulators of pollutants; therefore, they are used in environmental monitoring programs. Identifying the moss species is an important step in sample preprocessing. In both tasks, we used self-collected image databases. We tried several deep learning architectures and approaches. Our Siamese network architecture with a triplet loss function and MobileNetV2 as the base network showed the most impressive results in both of the above-mentioned tasks. The average accuracy amounted to over 97.8% for plant disease detection and 97.6% for moss species classification.
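The triplet loss used in the abstract can be illustrated with a minimal sketch in the style of FaceNet [4]. The toy embedding vectors below are hypothetical placeholders; in the paper they would be produced by the MobileNetV2 base network, and the 0.2 margin is an assumed default rather than the authors' setting.

```python
def triplet_loss(anchor, positive, negative, margin=0.2):
    """FaceNet-style triplet loss: push the anchor-positive squared
    distance to be at least `margin` smaller than the anchor-negative
    one; triplets that already satisfy the margin incur zero loss."""
    d_pos = sum((a - p) ** 2 for a, p in zip(anchor, positive))  # same class
    d_neg = sum((a - n) ** 2 for a, n in zip(anchor, negative))  # other class
    return max(d_pos - d_neg + margin, 0.0)

# The anchor coincides with the positive, so the negative is already
# far enough away and the loss vanishes.
print(triplet_loss([0.0, 0.0], [0.0, 0.0], [1.0, 1.0]))  # 0.0
# A violating triplet (positive farther than the negative) is penalized.
print(triplet_loss([0.0, 0.0], [1.0, 1.0], [0.0, 0.0]))  # 2.2
```

Training a Siamese network then amounts to minimizing this quantity over sampled (anchor, positive, negative) image triplets, so that embeddings of the same class cluster together while different classes stay separated by the margin.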

Keywords:
deep neural networks; Siamese networks; triplet loss; plant disease detection; moss species classification.

Acknowledgments
A.V.U. and A.V.N. gratefully acknowledge financial support from the Ministry of Science and Higher Education of the Russian Federation in accordance with agreement No 075-15-2020-905 dated November 16, 2020, on providing a grant in the form of subsidies from the Federal budget of the Russian Federation. The grant was provided for state support of the creation and development of the World-class Scientific Center "Agrotechnologies for the Future". The database creation part of the reported study was funded by RFBR according to research project No 18-07-00829.

Citation:
Uzhinskiy AV, Ososkov GA, Goncharov PV, Nechaevskiy AV, Smetanin AA. One-shot learning with triplet loss for vegetation classification tasks. Computer Optics 2021; 45(4): 608-614. DOI: 10.18287/2412-6179-CO-856.

References:

  1. Uzhinskiy AV, Ososkov GA, Goncharov PV, Nechaevskiy AV. Multifunctional platform and mobile application for plant disease detection. CEUR Workshop Proc 2019; 2507: 110-114.
  2. Goncharov P, Ososkov G, Nechaevskiy A, Uzhinskiy A, Nestsiarenia I. Disease detection on the plant leaves by deep learning. In Book: Kryzhanovsky B, Dunin-Barkowski W, Redko V, Tiumentsev Y, eds. Advances in Neural Computation, Machine Learning, and Cognitive Research II. Cham, Switzerland: Springer Nature Switzerland AG; 2019: 151-159.
  3. Goncharov P, Uzhinskiy A, Ososkov G, Nechaevskiy A, Zudikhina J. Deep siamese networks for plant disease detection. EPJ Web of Conferences 2020; 226: 03010.
  4. Schroff F, Kalenichenko D, Philbin J. FaceNet: A unified embedding for face recognition and clustering. 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2015: 815-823. DOI: 10.1109/CVPR.2015.7298682.
  5. Uzhinskiy A, Ososkov G, Frontasieva M. Management of environmental monitoring data: UNECE ICP Vegetation case. CEUR Workshop Proc 2019; 2507: 202-207.
  6. Hughes D, Salathé M. An open access repository of images on plant health to enable the development of mobile disease diagnostics through machine learning and crowdsourcing. arXiv Preprint 2015. Source: <https://arxiv.org/abs/1511.08060>.
  7. Mohanty S, Hughes D, Salathé M. Using deep learning for image-based plant disease detection. Front Plant Sci 2016; 7: 1419.
  8. Too EC, Yujian L, Njuki S, Yingchun L. A comparative study of fine-tuning deep learning models for plant disease identification. Comput Electron Agric 2019; 161: 272-279.
  9. Ferentinos KP. Deep learning models for plant disease detection and diagnosis. Comput Electron Agric 2018; 145: 311-318.
  10. Fuentes A, Yoon S, Kim S, Park D. A robust deep-learning-based detector for real-time tomato plant diseases and pests recognition. Sensors 2017; 17: 2022.
  11. Türkoğlu M, Hanbay D. Plant disease and pest detection using deep learning-based features. Turk J Elec Eng & Comp Sci 2019; 27: 1636-1651.
  12. Selvaraj M, Vergara A, Ruiz H, Safari N, Elayabalan S, Ocimati W, Blomme G. AI-powered banana diseases and pest detection. Plant Methods 2019; 15: 92.
  13. Saleem M, Potgieter J, Arif K. Plant disease detection and classification by deep learning. Plants 2019; 8: 468.
  14. Ise T, Minagawa M, Onishi M. Classifying 3 moss species by deep learning, using the “chopped picture” method. Open J Ecol 2018; 8: 166-173.
  15. Cheng D, Gong Y, Zhou S, Wang J, Zheng N. Person re-identification by multi-channel parts-based CNN with improved triplet loss function. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2016: 1335-1344.
  16. Hermans A, Beyer L, Leibe B. In defense of the triplet loss for person re-identification. arXiv Preprint 2017. Source: <https://arxiv.org/abs/1703.07737>.
  17. Dong X, Shen J. Triplet loss in Siamese network for object tracking. Proc European Conference on Computer Vision (ECCV) 2018: 459-474.
  18. Puch S, Sánchez I, Rowe M. Few-shot learning with deep triplet networks for brain imaging modality recognition. In Book: Wang Q, Milletari F, Nguyen HV, Albarqouni S, Jorge Cardoso M, Rieke N, Xu Z, Kamnitsas K, Patel V, Roysam B, Jiang S, Zhou K, Luu K, Le N, eds. Domain adaptation and representation transfer and medical image learning with less labels and imperfect data. Springer; 2019.
  19. Anshul T, Daksh T, Padmanabhan R, Aditya N. Deep metric learning for bioacoustic classification: Overcoming training data scarcity using dynamic triplet loss. J Acoust Soc Am 2019; 146: 534-547.
  20. Zhang J, Lu C, Wang J, Yue X, Lim S, Al-Makhadmeh Z, Tolba A. Training convolutional neural networks with multi-size images and triplet loss for remote sensing scene classification. Sensors 2020; 20(4): 1188.

© 2009, IPSI RAS
151 Molodogvardeyskaya st., Samara 443001, Russia; e-mail: journal@computeroptics.ru; tel.: +7 (846) 242-41-24 (executive secretary), +7 (846) 332-56-22 (technical editor), fax: +7 (846) 332-56-20