
Modification of blurred image matching method
R.A. Paringer 1,2, Y. Donon 1,2, A.V. Kupriyanov 1,2

1 Samara National Research University, Moskovskoye Shosse 34, Samara, 443086, Russia;
2 IPSI RAS – Branch of the FSRC "Crystallography and Photonics" RAS, Molodogvardeyskaya 151, Samara, 443001, Russia


DOI: 10.18287/2412-6179-CO-712

Pages: 441-445.

Full text of the article: in Russian.

Abstract:
The article proposes a modification of the Blurred Image Matching (BIM) method, a technique for selecting key points in images, aimed at solving the problem of their accurate comparison. A new approach to blob selection and comparison is introduced. These modifications increase the proportion of correctly matched image pairs by 30.2% compared to the basic method when working with noisy data.
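The full text is in Russian and this page gives only the high-level idea, so the following is a minimal illustrative sketch, not the authors' BIM implementation: it blurs two images, selects blobs as key points with OpenCV's SimpleBlobDetector, describes each blob by the raw pixel patch around its centre, and matches the descriptors with a brute-force ratio test. The function match_blurred, the patch radius r, the kernel size ksize, and the 0.75 ratio threshold are assumptions introduced here for illustration.

# Illustrative sketch only: blob-based matching of two blurred images.
# This is NOT the authors' BIM implementation; it demonstrates the general
# pipeline the abstract describes (blur -> blob/key-point selection ->
# pairwise comparison) with standard OpenCV primitives.
import cv2
import numpy as np

def match_blurred(img_a, img_b, ksize=9, r=8, ratio=0.75):
    """Blur both 8-bit grayscale images, detect blobs as key points,
    and match them by comparing the pixel patches around each blob."""
    # Gaussian blur suppresses noise before blob selection.
    a = cv2.GaussianBlur(img_a, (ksize, ksize), 0)
    b = cv2.GaussianBlur(img_b, (ksize, ksize), 0)

    detector = cv2.SimpleBlobDetector_create()

    def patches(img, kps):
        # Describe each blob by the raw patch around its centre,
        # skipping blobs too close to the image border.
        h, w = img.shape[:2]
        descs, kept = [], []
        for kp in kps:
            x, y = int(round(kp.pt[0])), int(round(kp.pt[1]))
            if r <= y < h - r and r <= x < w - r:
                descs.append(img[y - r:y + r, x - r:x + r]
                             .astype(np.float32).ravel())
                kept.append(kp)
        return np.asarray(descs), kept

    desc_a, kps_a = patches(a, detector.detect(a))
    desc_b, kps_b = patches(b, detector.detect(b))
    if len(desc_a) == 0 or len(desc_b) < 2:
        return []

    # Brute-force L2 matching with Lowe's ratio test to reject
    # ambiguous correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(desc_a, desc_b, k=2)
            if m.distance < ratio * n.distance]
    return [(kps_a[m.queryIdx].pt, kps_b[m.trainIdx].pt) for m in good]

For example, match_blurred(cv2.imread("a.png", cv2.IMREAD_GRAYSCALE), cv2.imread("b.png", cv2.IMREAD_GRAYSCALE)) returns a list of matched coordinate pairs, which could then be passed to a robust estimator such as RANSAC to recover the transform between the two frames.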

Keywords:
image matching, key points, feature extraction, algorithms, linear embedding.

Citation:
Paringer RA, Donon Y, Kupriyanov AV. Modification of blurred image matching method. Computer Optics 2020; 44(3): 441-445. DOI: 10.18287/2412-6179-CO-712.

Acknowledgements:
The research was supported by the Ministry of Science and Higher Education of the Russian Federation (grant no. 0777-2020-0017) and partially funded by RFBR under project no. 19-29-01135.

