  
Transverse-layer partitioning of artificial neural networks for image classification
 N.A. Vershkov 1, M.G. Babenko 1, N.N. Kuchukova 1, V.A. Kuchukov 1, N.N. Kucherov 1
 1 North-Caucasus Center for Mathematical Research, North Caucasus Federal University,
  1 Pushkin St., Stavropol, 355017, Russia
 PDF, 4260 kB
DOI: 10.18287/2412-6179-CO-1278
Pages: 312-320.
Full text of article: Russian language.
 
Abstract:
We discuss issues of modular learning in artificial neural networks and explore possibilities of the partial use of modules when computational resources are limited. The proposed method is based on the ability of a wavelet transform to separate information into high- and low-frequency parts. Drawing on their experience in developing convolutional wavelet neural networks, the authors perform a transverse-layer partitioning of the network into modules for subsequent partial use on devices with low computational capability. The theoretical justification of this approach is supported experimentally: a network trained on the MNIST database is partitioned into 2 and 4 modules, which are then used sequentially while the respective accuracy and performance are measured. When individual modules are used, a two-fold (or higher) performance gain is achieved. The theoretical statements are verified using an AlexNet-like network on the GTSRB dataset, with a performance gain of 33% per module with no loss of accuracy.
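The frequency separation the abstract relies on can be illustrated with a one-level Haar transform, the simplest orthogonal wavelet (cited in the references below). A minimal sketch, not taken from the paper: each pair of neighbouring samples is mapped to a low-frequency (approximation) coefficient and a high-frequency (detail) coefficient, so a slowly varying signal concentrates almost entirely in the approximation half.

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform.

    Splits a signal of even length into a low-frequency (approximation)
    half and a high-frequency (detail) half, using the orthonormal
    Haar filter coefficients 1/sqrt(2).
    """
    assert len(signal) % 2 == 0, "signal length must be even"
    pairs = list(zip(signal[::2], signal[1::2]))
    approx = [(a + b) / math.sqrt(2) for a, b in pairs]
    detail = [(a - b) / math.sqrt(2) for a, b in pairs]
    return approx, detail

# A constant signal lands entirely in the approximation half:
approx, detail = haar_step([1.0, 1.0, 1.0, 1.0])
# approx ≈ [1.414, 1.414], detail == [0.0, 0.0]
```

In a modular network of the kind described here, the approximation branch alone already carries most of the information needed for a coarse classification, which is why individual modules remain usable on their own.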
Keywords:
wavelet transform, artificial neural networks, convolutional layer, orthogonal transforms, modular learning, neural network optimization.
Citation:
  Vershkov NA, Babenko MG, Kuchukova NN, Kuchukov VA, Kucherov NN. Transverse-layer partitioning of artificial neural networks for image classification. Computer Optics 2024; 48(2): 312-320. DOI: 10.18287/2412-6179-CO-1278.
Acknowledgements:
  The research was financially supported by the Russian Science Foundation under grant No. 22-71-10046, https://rscf.ru/en/project/22-71-10046/.
References:
  - Kussul ME. A modular representation of neural networks [In Russian]. Mathematical Machines and Systems 2006; 4: 51-62.
  - Rykov VP. The modular principle of artificial neural network training using known neural network topologies as an example [In Russian]. Bulletin of Tambov State University 2014; 19(2): 583-586.
  - Andreas J, Rohrbach M, Darrell T, Klein D. Neural module networks. Proc IEEE Conf on Computer Vision and Pattern Recognition (CVPR) 2016: 39-48.
  - Auda G, Kamel M, Raafat H. Modular neural network architectures for classification. Proc Int Conf on Neural Networks (ICNN'96) 1996; 2: 1279-1284. DOI: 10.1109/ICNN.1996.549082.
  - Auda G, Kamel M, Raafat H. Voting schemes for cooperative neural network classifiers. IEEE Conf on Neural Networks (ICNN'95) 1995; 3: 1240-1243.
  - Lu Z, Pu H, Wang F, Hu Z, Wang L. The expressive power of neural networks: A view from the width. 31st Conf on Neural Information Processing Systems (NIPS 2017) 2017: 6232-6240.
  - Kidger P, Lyons T. Universal approximation with deep narrow networks. 33rd Annual Conf on Learning Theory 2020: 1-22.
  - Kim JS, Cho Y, Lim TH. Prediction of locations in medical images using orthogonal neural networks. Eur J Radiol Open 2021; 8: 100388.
  - Jamal A, Ashour M, Helmi R, Fong S. A wavelet-neural networks model for time series. 11th IEEE Symposium on Computer Applications and Industrial Electronics (ISCAIE) 2021: 325-330. DOI: 10.1109/ISCAIE51753.2021.9431777.
  - D'Amario V, Sasaki T, Boix X. How modular should neural module networks be for systematic generalization? arXiv Preprint. 2022. Source: <arXiv:2106.08170v2>.
  - Smolencev NK. Basics of wavelet theory. Wavelets in MATLAB [In Russian]. Moscow: "DMK Press" Publisher; 2019. ISBN: 5-94074-415-X.
  - Ahmed N, Rao KR. Orthogonal transforms for digital signal processing. Springer-Verlag; 1975.
  - McCulloch WS, Pitts W. A logical calculus of the ideas immanent in nervous activity. Bull Math Biophys 1943; 5(4): 115-133.
  - Vershkov NA, Kuchukov VA, Kuchukova NN, Babenko M. The wave model of artificial neural network. Proc 2020 IEEE Conf of Russian Young Researchers in Electrical and Electronic Engineering (EIConRus) 2020: 542-547.
  - Vershkov N, Babenko M, Tchernykh A, et al. Optimization of artificial neural networks using wavelet transforms. Program Comput Soft 2022; 48: 376-384. DOI: 10.1134/S036176882206007X.
  - Haar A. Zur Theorie der orthogonalen Funktionensysteme. Göttingen: Georg-August-Universität; 1909.
  - PyTorch. Source: <https://pytorch.org/>.
  - PyWavelets. Source: <https://pypi.org/project/PyWavelets/>.
  - Qiao Y. The MNIST database of handwritten digits. 2007. Source: <http://www.gavo.t.u-tokyo.ac.jp/~qiao/database.html>.
  - GTSRB - German traffic sign recognition benchmark. 2023. Source: <https://www.kaggle.com/datasets/meowmeowmeowmeowmeow/gtsrb-german-traffic-sign>.
  - Ushakov YA, Polezhaev PN, Shukhman AE, Ushakova MV. Distribution of the neural network between mobile device and cloud infrastructure services [In Russian]. Modern Information Technology and IT-education 2018; 14(4): 903-910. DOI: 10.25559/SITITO.14.201804.903-910.
  - Rytov SM, Kravtsov YuA, Tatarsky VI. Introduction to statistical radiophysics. Part 2. Random fields [In Russian]. Moscow: "Nauka" Publisher; 1978.
  - Minkin AS, Nikolaeva OV, Russkov AA. Hyperspectral data compression based upon the principal component analysis. Computer Optics 2021; 45(2): 235-244. DOI: 10.18287/2412-6179-CO-806.
  - Zenkov IV, Lapko AV, Lapko VA, Kiryushina EV, Vokin VN, Bakhtina AV. A method of sequentially generating a set of components of a multidimensional random variable using a nonparametric pattern recognition algorithm. Computer Optics 2021; 45(6): 926-933. DOI: 10.18287/2412-6179-CO-902.
  
  © 2009, IPSI RAS
  151, Molodogvardeiskaya str., Samara, 443001, Russia; E-mail: journal@computeroptics.ru; Tel: +7 (846) 242-41-24 (Executive secretary), +7 (846) 332-56-22 (Issuing editor), Fax: +7 (846) 332-56-20