Theoretical foundations of hypertrace-transform: scanning techniques, mathematical apparatus and experimental verification
Fedotov N.G., Syemov A.A., Moiseev A.V.

Penza State University, Penza, Russia,

LLC «KomHelf», Penza, Russia

Abstract:
We give a systematic account of the theoretical basis of a new geometric method for the analysis and recognition of three-dimensional (3D) images. A scanning technique for forming the hypertrace transform is described together with its mathematical model. Unlike existing methods, this approach analyzes 3D images directly from their 3D shape, without first simplifying them or constructing planar projections. We substantiate the choice of a particular scanning tool and the need to construct a reference spherical grid to address the problem of rotational invariance in 3D image recognition. A mathematical apparatus for the stochastic realization of the scanning technique is developed on the basis of stochastic geometry and functional analysis. We introduce a new mathematical tool for 3D image analysis, the hypertrace matrix, which allows spatial objects of complex shape and structure to be recognized by constructing a single mathematical model of the 3D image. We also describe a new type of 3D image features, hypertriplet features, whose analytical structure makes possible the automatic generation of a large number of features with predetermined properties. Results of the experimental verification are presented, demonstrating the accurate calculation of features for 3D image recognition and confirming the adequacy of the developed mathematical apparatus.
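The compositional structure sketched in the abstract can be illustrated with a minimal stochastic-scanning toy. This is not the authors' actual hypertrace transform: the trace functional T (voxel count per parallel slab), the functional P over the grid of planes (maximum), and the functional Θ over random scanning directions (mean) are stand-in choices, and the voxel point-set representation is an assumption made purely for illustration.

```python
import math
import random

def random_unit_vector(rng):
    """Uniform direction on the sphere via normalized Gaussian components."""
    while True:
        v = (rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1))
        n = math.sqrt(sum(c * c for c in v))
        if n > 1e-12:
            return tuple(c / n for c in v)

def section_trace(points, direction, n_planes=16):
    """Trace functional T: voxel count in each of n_planes parallel slabs
    orthogonal to `direction` (a stand-in for a cross-section feature)."""
    d = [sum(p[i] * direction[i] for i in range(3)) for p in points]
    lo, hi = min(d), max(d)
    width = (hi - lo) / n_planes or 1.0  # avoid zero width for flat sets
    counts = [0] * n_planes
    for t in d:
        k = min(int((t - lo) / width), n_planes - 1)
        counts[k] += 1
    return counts

def hypertriplet_feature(points, n_dirs=200, seed=0):
    """Triplet-style composition Θ∘P∘T: mean over random directions
    of the maximum slab count along each direction."""
    rng = random.Random(seed)
    vals = []
    for _ in range(n_dirs):
        u = random_unit_vector(rng)
        counts = section_trace(points, u)   # T over the grid of planes
        vals.append(max(counts))            # P over one scanning direction
    return sum(vals) / len(vals)            # Θ over all directions
```

Because the scalar feature is an average over many random scanning directions, swapping T, P, and Θ for other functionals generates new features automatically, which is the point of the analytic (compositional) feature structure described above.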

Keywords:
recognition of 3D images, geometric hypertrace transform, grid of parallel planes, stochastic scanning, analytical structure of the feature, hypertrace matrix, invariant recognition.

Citation:
Fedotov NG, Syemov AA, Moiseev AV. Theoretical foundations of hypertrace-transform: scanning techniques, mathematical apparatus and experimental verification. Computer Optics 2018; 42(2): 273-282. DOI: 10.18287/2412-6179-2018-42-2-273-282.

References:

  1. Kiy KI. Segmentation and detection of contrast objects and their application in robot navigation. Pattern Recognition and Image Analysis 2015; 25(2): 338-346. DOI: 10.1134/S1054661815020145.
  2. Wang C, Huang K-Q. VFM: visual feedback model for robust object recognition. Journal of Computer Science and Technology 2015; 30(2): 325-339. DOI: 10.1007/s11390-015-1526-1.
  3. Gaidel AV, Pervushkin SS. Research of the textural features for the bony tissue diseases diagnostics using the roentgenograms [In Russian]. Computer Optics 2013; 37(1): 113-119.
  4. Gaidel AV, Zelter PM, Kapishnikov AV, Khramov AG. Computed tomography texture analysis capabilities in diagnosing a chronic obstructive pulmonary disease [In Russian]. Computer Optics 2014; 38(4): 843-850.
  5. Fedotov NG, Semov AA, Moiseev AV. 3D-trace-conversion: scanning modes, stochastic implementation features, methods of computational speedup [In Russian]. University proceedings. Volga region. Technical sciences 2014; 3: 41-53.
  6. Rakhmanov EA, Saff EB, Zhou YM. Minimal discrete energy on the sphere. Math Res Lett 1994; 1(6): 647-662. DOI: 10.4310/MRL.1994.v1.n6.a3.
  7. Lovisolo L, da Silva LEAB. Uniform distribution of points on a hyper-sphere with applications to vector bit-plane encoding. IEE Proc – Vis Image Signal Process 2001; 148(3): 187-193. DOI: 10.1049/ip-vis:20010361.
  8. Fedotov NG. The theory of pattern recognition features based on stochastic geometry and functional analysis [In Russian]. Moscow: “Fizmatlit” Publisher; 2010. ISBN: 978-5-9221-0996-3.
  9. Fedotov NG, Shul'ga LA, Moiseev AV. Random scanning for speedier systems of pattern recognition based on stochastic geometry methods. Pattern Recognition and Image Analysis 2005; 15(2): 387-388.
  10. Princeton Shape Benchmark. Source: <http://shape.cs.princeton.edu/benchmark/>.
