
Crop growth monitoring through Sentinel and Landsat data based NDVI time-series
M.S. Boori 1,2,4, K. Choudhary 1,3,4, A.V. Kupriyanov 1,5

1 Samara National Research University, Moskovskoye Shosse 34, 443086, Samara, Russia;
2 American Sentinel University, Colorado, USA;
3 The Hong Kong Polytechnic University, Kowloon, Hong Kong;
4 University of Rennes 2, Rennes, France;
5 IPSI RAS – Branch of the FSRC "Crystallography and Photonics" RAS, Molodogvardeyskaya 151, 443001, Samara, Russia


DOI: 10.18287/2412-6179-CO-635

Pages: 409-419.

Full text of the article: English.

Abstract:
Crop growth monitoring underpins agriculture classification, yield estimation, field management, productivity improvement, irrigation and fertilizer management, sustainable agricultural development, and food security; it also helps reveal how environment and climate change affect crops, which is especially important in Russia with its large and diverse agricultural production. In this study, we tracked monthly crop phenology from January to December 2018 using NDVI time series derived from moderate- to high-spatio-temporal-resolution Sentinel and Landsat data over a cropland field in the Samara airport area, Russia. The results support the potential of Sentinel- and Landsat-derived NDVI time series for accurate monitoring of all crop growth stages (active tillering, jointing, maturity, and harvesting) in agreement with the crop calendar and with reasonable thematic accuracy. This satellite-based NDVI workflow can provide valuable support for assessing crop growth status and for the objectives listed above within sustainable agricultural development.
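The phenological tracking described in the abstract rests on the standard NDVI formula, NDVI = (NIR − Red) / (NIR + Red), computed per pixel for each monthly composite (for Sentinel-2, bands B8 and B4; for Landsat 8, bands 5 and 4). A minimal sketch of the idea, with invented reflectance values for a single cropland pixel (the real study uses full Sentinel/Landsat scenes):

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """NDVI = (NIR - Red) / (NIR + Red); eps guards against division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Illustrative monthly mean reflectances for one pixel, Jan-Dec (values are
# hypothetical, chosen only to mimic a typical crop green-up/senescence curve)
red_series = np.array([0.10, 0.10, 0.09, 0.08, 0.06, 0.04,
                       0.03, 0.04, 0.07, 0.09, 0.10, 0.10])
nir_series = np.array([0.18, 0.18, 0.20, 0.25, 0.35, 0.45,
                       0.50, 0.45, 0.30, 0.22, 0.18, 0.18])

series = ndvi(nir_series, red_series)
peak_month = int(np.argmax(series)) + 1  # 1-based month of maximum greenness
```

In a time-series workflow like the one in the paper, the rising limb of this curve corresponds to active tillering and jointing, the peak to maturity, and the decline to harvesting, which is how the monthly NDVI profile is matched against the crop calendar.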

Keywords:
crop phenology, NDVI time-series, Sentinel-2 & Landsat, remote sensing.

Citation:
Boori MS, Choudhary K, Kupriyanov AV. Crop growth monitoring through Sentinel and Landsat data based NDVI time-series. Computer Optics 2020; 44(3): 409-419. DOI: 10.18287/2412-6179-CO-635.

Acknowledgments:
This work was partially supported by the Ministry of Education and Science of the Russian Federation within the Program for increasing the competitiveness of Samara University among the world's leading scientific and educational centers for 2013-2020; by Russian Foundation for Basic Research grants # 15-29-03823, # 16-41-630761, # 17-01-00972 and # 18-37-00418; and within the framework of state task # 0026-2018-0102 "Optoinformation technologies for obtaining and processing hyperspectral data".


© 2009, IPSI RAS
151, Molodogvardeiskaya str., Samara, 443001, Russia; E-mail: ko@smr.ru ; Tel: +7 (846) 242-41-24 (Executive secretary), +7 (846) 332-56-22 (Issuing editor), Fax: +7 (846) 332-56-20