
Neural network-aided classification of hyperspectral vegetation images with a training sample generated using an adaptive vegetation index
N. Firsov 1, V. Podlipnov 1,2, N. Ivliev 1,2, P. Nikolaev 3, S. Mashkov 4, P. Ishkin 4, R. Skidanov 1,2, A. Nikonorov 1

1 Samara National Research University, 443086, Samara, Russia, Moskovskoye Shosse 34;
2 IPSI RAS – Branch of FSRC "Crystallography and Photonics" RAS, 443001, Samara, Russia, Molodogvardeyskaya 151;
3 Institute for Information Transmission Problems, RAS, 127051, Moscow, Russia, Bolshoy Karetny per. 19, build. 1;
4 Samara State Agrarian University, 446442, Usty-Kinelyskiy, Russia, Uchebnaya 2


DOI: 10.18287/2412-6179-CO-1038

Pages: 887-896.

Full text of article: Russian language.

Abstract:
In this paper, we propose an approach to the classification of high-resolution hyperspectral images in the applied problem of identifying vegetation types. A modified spectral-spatial convolutional neural network with compensation for illumination variations is used as the classifier. To generate the training dataset, an algorithm based on an adaptive vegetation index is proposed. The effectiveness of the proposed approach is demonstrated on agricultural land survey data acquired with a compact hyperspectral camera developed in-house.
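The abstract does not detail the adaptive vegetation index itself, so the following Python sketch only illustrates the general idea of index-driven training-sample generation: a vegetation index is computed per pixel and thresholded to produce vegetation/background pseudo-labels that can seed a classifier's training set. It uses the classical NDVI as a stand-in (not the authors' adaptive index), and the function and parameter names (index_based_labels, red_nm, nir_nm, threshold) are illustrative assumptions.

# Minimal sketch of index-based pseudo-labelling for a hyperspectral cube.
# NDVI with a fixed threshold stands in for the paper's adaptive vegetation index.
import numpy as np

def index_based_labels(cube, wavelengths, red_nm=670.0, nir_nm=800.0, threshold=0.4):
    """Return a binary pseudo-label map (1 = vegetation, 0 = background).

    cube        : (H, W, B) array of reflectance values
    wavelengths : (B,) array of band-centre wavelengths in nanometres
    threshold   : index value above which a pixel is labelled as vegetation
    """
    # Pick the bands closest to the red and near-infrared reference wavelengths.
    red = cube[:, :, np.argmin(np.abs(wavelengths - red_nm))]
    nir = cube[:, :, np.argmin(np.abs(wavelengths - nir_nm))]
    # Classical NDVI; the small constant guards against division by zero.
    ndvi = (nir - red) / (nir + red + 1e-8)
    return (ndvi > threshold).astype(np.uint8)

# Toy usage on a random 64x64 cube with 100 bands spanning 400-1000 nm.
cube = np.random.rand(64, 64, 100).astype(np.float32)
wavelengths = np.linspace(400.0, 1000.0, 100)
labels = index_based_labels(cube, wavelengths)
print("vegetation pixels:", int(labels.sum()))

In practice, only pixels whose index value is confidently above or below the threshold would be kept as training samples for the spectral-spatial network, while ambiguous pixels are left unlabelled.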

Keywords:
hyperspectral images, vegetation index, convolutional neural networks.

Citation:
Firsov NA, Podlipnov VV, Ivliev NA, Nikolaev PP, Mashkov SV, Ishkin PA, Skidanov RV, Nikonorov AV. Neural network-aided classification of hyperspectral vegetation images with a training sample generated using an adaptive vegetation index. Computer Optics 2021; 45(6): 887-896. DOI: 10.18287/2412-6179-CO-1038.

Acknowledgements:
The theoretical part and neural network models were developed with the support of the Russian Science Foundation (RSF grant 20-69-47110). The experimental part was carried out with the support of the Russian Foundation for Basic Research under the government project of the IPSI RAS – a branch of the Federal Scientific-Research Center "Crystallography and Photonics" of the RAS (agreement № 007-ГЗ/Ч3363/26). The authors are grateful to E.P. Tsirulev, N.V. Borovkova and A.A. Solovyov for their help in the field work.

References:

  1. Sharma V, Diba A, Tuytelaars T, Van Gool L. Hyperspectral CNN for image classification & band selection, with application to face recognition. 2016. Source: <https://core.ac.uk/download/pdf/80805922.pdf>.
  2. Zhang J, Cheng T, Guo W, Xu X, Qiao H, Xie Y, Ma X. Leaf area index estimation model for UAV image hyperspectral data based on wavelength variable selection and machine learning methods. Plant Methods 2021; 17(1): 49-54.
  3. Siedliska A, Baranowski P, Pastuszka-Woźniak J, Zubik M, Krzyszczak J. Identification of plant leaf phosphorus content at different growth stages based on hyperspectral reflectance. BMC Plant Biol 2021; 21(1): 28-32.
  4. Sahadevan AS. Extraction of spatial-spectral homogeneous patches and fractional abundances for field-scale agriculture monitoring using airborne hyperspectral images. Comput Electron Agric 2021; 188: 106325.
  5. Zhang Y, Xia C, Zhang X, Cheng X, Feng G, Wang Y, Gao Q. Estimating the maize biomass by crop height and narrowband vegetation indices derived from UAV-based hyperspectral images. Ecol Indic 2021; 129: 107985.
  6. La Rosa LEC, Sothe C, Feitosa RQ, de Almeida CM, Schimalski MB, Oliveira DAB. Multi-task fully convolutional network for tree species mapping in dense forests using small training hyperspectral data. ISPRS J Photogramm Remote Sens 2021; 179: 35-49.
  7. Wang L, Chen S, Li D, Wang C, Jiang H, Zheng Q, Peng Z. Estimation of paddy rice nitrogen content and accumulation both at leaf and plant levels from UAV hyperspectral imagery. Remote Sens 2021; 13(15): 2956.
  8. Vangi E, D’amico G, Francini S, Giannetti F, Lasserre B, Marchetti M, Chirici G. The new hyperspectral satellite PRISMA: Imagery for forest types discrimination. Sensors 2021; 21(4): 1182.
  9. Pereira JFQ, Pimentel MF, Amigo JM, Honorato RS. Detection and identification of Cannabis sativa L. using near infrared hyperspectral imaging and machine learning methods. Spectrochim Acta A Mol Biomol Spectrosc 2020; 237: 118385.
  10. Ferreira A, Felipussi SC, Pires R, Avila S, Santos G, Lambert J, Huang J, Rocha A. Eyes in the skies: A data-driven fusion approach to identifying drug crops from remote sensing images. IEEE J Sel Top Appl Earth Obs Remote Sens 2019; 12(12): 4773-4786.
  11. Barton IF, Gabriel MJ, Lyons-Baral J, Barton MD, Duplessis L, Roberts C. Extending geometallurgy to the mine scale with hyperspectral imaging: a pilot study using drone- and ground-based scanning. Mining, Metallurgy and Exploration 2021; 38(2): 799-818.
  12. Degerickx J, Hermy M, Somers B. Mapping functional urban green types using high resolution remote sensing data. Sustainability 2020; 12(5): 2144.
  13. Huang H, Sun Z, Liu S, Di Y, Xu J, Liu C, Xu R, Song H, Zhan S, Wo J. Underwater hyperspectral imaging for in situ underwater microplastic detection. Sci Total Environ 2021; 776: 145960.
  14. Claudio HC, Cheng Y, Fuentes DA, Gamon JA, Luo H, Oechel W, Sims DA. Monitoring drought effects on vegetation water content and fluxes in chaparral with the 970 nm water band index. Remote Sens Environ 2006; 103(3): 304-311.
  15. Mahajan GR, Sahoo RN, Pandey RN, Gupta VK, Kumar D. Using hyperspectral remote sensing techniques to monitor nitrogen, phosphorus, sulphur and potassium in wheat (Triticum aestivum L.). Precis Agric 2014; 15(5): 499-522.
  16. Liu B, Yu X, Zhang P, Tan X, Yu A, Xue Z. A semi-supervised convolutional neural network for hyperspectral image classification. Remote Sens Lett 2017; 8: 839-848.
  17. Bioucas-Dias JM, Plaza A, Camps-Valls G, Scheunders P, Nasrabadi N, Chanussot J. Hyperspectral remote sensing data analysis and future challenges. IEEE Geosci Remote Sens Mag 2013; 1(2): 6-36.
  18. He M, Li B, Chen H. Multi-scale 3D deep convolutional neural network for hyperspectral image classification. IEEE Int Conf on Image Processing (ICIP) 2017: 3904-3908.
  19. Jung A, Kardevan P, Tökei L. Hyperspectral technology in vegetation analysis. Prog Agric Eng Sci 2006; 2(1): 95-117.
  20. Kwan C, Gribben D, Ayhan B, Li J, Bernabe S, Plaza A. An accurate vegetation and non-vegetation differentiation approach based on land cover classification. Remote Sens 2020; 12(23): 3880.
  21. Hu W, Huang Y, Wei L, Zhang F, Li H. Deep convolutional neural networks for hyperspectral image classification. J Sens 2015; 2015: 30-42.
  22. Nikonorov A, Bibikov S, Yakimov P, Fursov V. Spectrum shape elements model to correct color and hyperspectral images. 8th IAPR Workshop on Pattern Recognition in Remote Sensing 2014: 1-4. DOI: 10.1109/PRRS.2014.6914282.
  23. Nikonorov A, Petrov M, Bibikov S, Kutikova V, Yakimov P, Morozov A. Deep learning-based enhancement of hyperspectral images using simulated ground truth. 10th IAPR Workshop on Pattern Recognition in Remote Sensing (PRRS) 2018: 1-9. DOI: 10.1109/PRRS.2018.8486408.
  24. Nikonorov A, Bibikov S, Myasnikov V, Yuzifovich Y, Fursov V. Correcting color and hyperspectral images with identification of distortion model. Pattern Recognit Lett 2016; 83(P2): 178-187. DOI: 10.1016/j.patrec.2016.06.027.
  25. Adão T, Hruška J, Pádua L, Bessa J, Peres J, Morais R, Sousa JJ. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens 2017; 9(11): 1110.
  26. Li Y, Zhang H, Shen Q. Spectral–spatial classification of hyperspectral imagery with 3D convolutional neural network. Remote Sens 2017; 9(1): 67.
  27. Chen Y, Jiang H, Li C, Jia X, Ghamisi P. Deep feature extraction and classification of hyperspectral images based on convolutional neural networks. IEEE Trans Geosci Remote Sens 2016; 54(10): 6232-6251.
  28. Xu Q, Yuan X, Ouyang C, Zeng Y. Attention-based pyramid network for segmentation and classification of high-resolution and hyperspectral remote sensing images. Remote Sens 2020; 12(21): 3501.
  29. Dobigeon N, Altmann Y, Brun N, Moussaoui S. Linear and nonlinear unmixing in hyperspectral imaging. Data Handl Sci Technol 2016; 30: 185-224.
  30. Kale KV, Solankar MM, Nalawade DB. Hyperspectral endmember extraction techniques. In Book: Chen J, Song Y, Li H, eds. Processing and analysis of hyperspectral data. IntechOpen; 2019.
  31. Berk A, Conforti P, Kennett R, Perkins T, Hawes F, van den Bosch J. MODTRAN6: a major upgrade of the MODTRAN radiative transfer code. 6th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing 2014: 1-4.
  32. Podlipnov V, Shchedrin V, Babichev A, Vasilyev S, Blank V. Experimental determination of soil moisture on hyperspectral images. Computer Optics 2018; 42(5): 877-884. DOI: 10.18287/2412-6179-2017-42-5-877-884.
  33. Karpeev S, Khonina S, Murdagulov A, Petrov M. Alignment and study of prototypes of the Offner Hyperspectrometer. Vestnik of the Samara State Aerospace University 2016; 15(1): 197-206. DOI: 10.18287/2412-7329-2016-15-1-197-206.
  34. Manea D, Calin MA. Hyperspectral imaging in different light conditions. Imaging Sci J 2015; 63: 214-219.
  35. van de Weijer J, Gevers T. Color constancy based on the Grey-edge hypothesis. IEEE Int Conf on Image Processing 2005; II: 722-725.
  36. Cai J, Chang Q, Tang X-L, Xue C, Wei C. Facial expression recognition method based on sparse batch normalization CNN. 37th Chinese Control Conference (CCC) 2018: 9608-9613.
  37. Ioffe S, Szegedy C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. Source: <https://arxiv.org/abs/1502.03167>.
  38. Luo Y, Zou J, Yao C, Li T, Bai G. HSI-CNN: A novel convolution neural network for hyperspectral image. Int Conf on Audio, Language and Image Processing (ICALIP) 2019: 464-469.
  39. Ben Hamida A, Benoit A, Lambert P, Ben Amar C. 3-D deep learning approach for remote sensing image classification. IEEE Trans Geosci Remote Sens 2018; 56(8): 4420-4434.
