
Performance evaluation of underwater vision systems
V.Y. Kolyuchkin 1, N.M. Kostylev 1, Y.S. Gulina 1,2

1 Bauman Moscow State Technical University, 105005, Moscow, Russia, 2-Ya Baumanskaya Ulitsa 5;
2 P.N. Lebedev Physical Institute of the Russian Academy of Sciences, 119991, Moscow, Russia, Leninskiy Prospekt 53


DOI: 10.18287/2412-6179-CO-1262

Pages: 761-769.

Full text of article: Russian language.

Abstract:
The article describes a methodology for evaluating the performance of vision systems for remotely operated underwater vehicles. The methodology is based on a systems approach and uses mathematical models of the aqueous medium in which the optical signal propagates, of the underwater object image registration system, and of the human visual system. The probabilities of detecting and recognizing an underwater object image at a given registration range serve as the performance indicators of the underwater vision system. The mathematical model of the aqueous medium developed by the authors allows a quantitative evaluation of how backscattering interference, which arises when objects are illuminated, affects underwater vision system performance. The results of the numerical experiments presented in the paper illustrate that the proposed technique can be used to optimize the parameters of the underwater object image registration system so as to achieve the required detection or recognition probabilities at the given ranges.
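As a rough illustration of the kind of calculation such a methodology implies (not the authors' actual model), the sketch below estimates the apparent image contrast for a co-located illumination source and camera using a simplified single-scattering image-formation model in the spirit of McGlamery [9] and Jaffe [10], and converts it to a detection probability with a cumulative-normal psychometric function. All coefficient values, function names, and the threshold contrast are illustrative assumptions, not values taken from the paper.

    import numpy as np
    from math import erf

    def apparent_contrast(c, b_back, z, c0=1.0, e0=1.0):
        """Apparent target contrast at the camera after two-way propagation.

        c      -- beam attenuation coefficient of the water, 1/m
        b_back -- effective backscattering coefficient, 1/m
        z      -- camera-to-object range, m
        c0     -- inherent contrast of the object against its background
        e0     -- source irradiance at the object plane (relative units)
        """
        # Direct (image-forming) component: attenuated on the way to the
        # object and back to the camera.
        direct = e0 * np.exp(-2.0 * c * z)
        # Veiling component from light backscattered by the water column
        # between the camera and the object (single-scattering approximation).
        backscatter = e0 * (b_back / (2.0 * c)) * (1.0 - np.exp(-2.0 * c * z))
        return c0 * direct / (direct + backscatter)

    def detection_probability(contrast, threshold=0.02, spread=0.5):
        """Observer detection probability, modelled as a cumulative-normal
        psychometric function of log-contrast (illustrative placeholder)."""
        x = (np.log(contrast) - np.log(threshold)) / spread
        return 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

    # Example: illustrative coastal-water coefficients (c ~ 0.4 1/m, b_back ~ 0.02 1/m)
    for z in (2.0, 5.0, 10.0):
        c_app = apparent_contrast(c=0.4, b_back=0.02, z=z)
        print(f"range {z:4.1f} m  contrast {c_app:.3f}  P_det {detection_probability(c_app):.2f}")

In a real design study the measured inherent optical properties of the water, the transfer function of the registration system, and the observer model of refs. [19-22] would replace these placeholders, and the range at which the detection or recognition probability falls below the required value would serve as the figure of merit.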

Keywords:
ROUV, underwater vision system, backscattering, optical imaging, water inherent optical properties, human visual system, detection and recognition probabilities.

Citation:
Kolyuchkin VY, Kostylev NM, Gulina YS. Performance evaluation of underwater vision systems. Computer Optics 2023; 47(5): 761-769. DOI: 10.18287/2412-6179-CO-1262.

References:

  1. Petillot YR, Antonelli G, Casalino G, Ferreira F. Underwater robots: From remotely operated vehicles to intervention-autonomous underwater vehicles. IEEE Robot Autom Mag 2019; 26(2): 94-101. DOI: 10.1109/MRA.2019.2908063.
  2. Ortiz A, Simó M, Oliver G. A vision system for an underwater cable tracker. Mach Vis Appl 2002; 13: 129-140. DOI: 10.1007/s001380100065.
  3. Zhang H, Zhang S, Wang Ya, Liu Yu, Yang Ya, Zhou T, Bian H. Subsea pipeline leak inspection by autonomous underwater vehicle. Appl Ocean Res 2021; 107: 102321. DOI: 10.1016/j.apor.2020.102321.
  4. Dumke I, Nornes SM, Purser A, Marcon Y, Ludvigsen M, Ellefmo SL, Johnsen G, Søreide F. First hyperspectral imaging survey of the deep seafloor: High-resolution mapping of manganese nodules. Remote Sens Environ 2018; 209: 19-30. DOI: 10.1016/j.rse.2018.02.024.
  5. Wu TC, Chi YC, Wang HY, et al. Blue laser diode enables underwater communication at 12.4 Gbps. Sci Rep 2017; 7: 40480. DOI: 10.1038/srep40480.
  6. Reynolds RA, Stramski D, Neukermans G. Optical backscattering by particles in Arctic seawater and relationships to particle mass concentration, size distribution, and bulk composition. Limnol Oceanogr 2016; 61: 1869-1890. DOI: 10.1002/lno.10341.
  7. Loisel H, Stramski D, Dessailly D, Jamet C, Li L, Reynolds RA. An inverse model for estimating the optical absorption and backscattering coefficients of seawater from remote-sensing reflectance over a broad range of oceanic and coastal marine environments. J Geophys Res Oceans 2018; 123: 2141-2171. DOI: 10.1002/2017JC013632.
  8. Mosyagin GM, Koluchkin VY. The theory of optical-electronic systems [In Russian]. Moscow: Publishing House of the Bauman Moscow State Technical University; 2020.
  9. McGlamery BL. A computer model for underwater camera systems. Proc SPIE 1980; 208: 221-231. DOI: 10.1117/12.958279.
  10. Jaffe JS. Computer modeling and the design of optimal underwater imaging systems. IEEE J Ocean Eng 1990; 15(2): 101-111.
  11. Shifrin KS. Introduction to ocean optics [In Russian]. Leningrad: “Gidrometeoizdat” Publisher; 1983.
  12. Levin IM. Promising lines of studying the ocean by optical remote sensing methods [In Russian]. Fundamental and Applied Hydrophysics 2008; 1: 14-47.
  13. Kozintcev VI, Orlov VM, Belov ML. Optical electronic systems for ecological monitoring of the nature environment [In Russian]. Moscow: Publishing House of the Bauman Moscow State Technical University; 2002.
  14. Karasik VE, Orlov VM. Location laser vision systems [In Russian]. Moscow: Publishing House of the Bauman Moscow State Technical University; 2013.
  15. Kostylev NM, Kolyuchkin VYa, Stepanov RO. A mathematical model of laser radiation propagation in seawater. Optics Spectrosc 2019; 127: 612-617. DOI: 10.1134/S0030400X1910014X.
  16. Lisenko AA, Shamanaev VS. Statistical estimates of the effect of the sea water scattering phase function on the characteristics of airborne hydrooptical lidar signals. Russian Physics Journal 2021; 64(2): 1373-1380. DOI: 10.1007/s11182-021-02463-7.
  17. Goodman JW. Introduction to Fourier optics. McGraw-Hill; 1996.
  18. Rizzini DL, Kallasi F, Aleotti J, Oleari J, Caselli S. Integration of a stereo vision system into an autonomous underwater vehicle for pipe manipulation tasks. Comput Electr Eng 2017; 58: 560-571. DOI: 10.1016/j.compeleceng.2016.08.023.
  19. Gulina YS, Koliuchkin VYa, Trofimov NE. Mathematical model of human visual system. Optical Memory & Neural Networks (Information Optics) 2018; 27(4): 219-234. DOI: 10.3103/S1060992X1804001X.
  20. Gulina YS, Kolyuchkin VYa. Experimental investigations of a … Optics Spectrosc 2019; 127: 675-683. DOI: 10.1134/S0030400X19100114.
  21. Gulina YS, Kolyuchkin VYa. Method for calculating detection probability of objects images by a human. Optical Memory and Neural Networks 2020; 29(3): 209-219. DOI: 10.3103/S1060992X2003011X.
  22. Gulina YS, Kolyuchkin VYa. Method for calculating recognition probability of objects images by a human. Optical Memory and Neural Networks 2021; 30(2): 172-179. DOI: 10.3103/S1060992X21020090.
