
Accelerating the neural network training with the selection of samples

V.A. Shustov 1,2
1 Image Processing Systems Institute of RAS;
2 Samara State Aerospace University


Pages: 160-163.

Abstract:
The article investigates the possibility of increasing the efficiency of training a neural network that recognizes digit images. A permissible deviation of the last-layer neurons from their desired values is allowed, and training is performed by backpropagation on incorrectly classified samples only. The author substantiates the possibility of efficient parallelization on cluster computing systems.
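
A minimal sketch of the sample-selection idea described in the abstract is given below: at each epoch, only samples whose output-layer activations deviate from the target by more than a permissible tolerance are passed to backpropagation. The network size, tolerance value, learning rate, and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TwoLayerNet:
    """Toy fully connected network with one hidden layer (assumed sizes)."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.w1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.w2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.w1)
        self.y = sigmoid(self.h @ self.w2)
        return self.y

    def backprop(self, x, t):
        # Standard backpropagation step for a single sample.
        y = self.forward(x)
        delta_out = (y - t) * y * (1 - y)
        delta_hid = (delta_out @ self.w2.T) * self.h * (1 - self.h)
        self.w2 -= self.lr * np.outer(self.h, delta_out)
        self.w1 -= self.lr * np.outer(x, delta_hid)

def train_with_selection(net, samples, targets, tol=0.2, epochs=100):
    """Train only on samples whose output deviates from the target by more
    than `tol` in some output neuron (the permissible deviation)."""
    for _ in range(epochs):
        hard = [i for i in range(len(samples))
                if np.max(np.abs(net.forward(samples[i]) - targets[i])) > tol]
        if not hard:          # every sample already within tolerance
            break
        for i in hard:        # backpropagate on misclassified samples only
            net.backprop(samples[i], targets[i])
    return net
```

The early exit reflects the expected speed-up: as more samples fall within the tolerance, fewer backpropagation passes are needed per epoch, and the per-epoch selection step is straightforward to distribute across cluster nodes.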

Keywords:
neural network, digit image, method of backpropagation, parallelization, cluster computing system.


Citation:
Shustov VA. Accelerating the neural network training with the selection of samples. Computer Optics 2002; 24: 160-163.

References:

  1. Gorban AN, Rossiev DA. Neural networks on a personal computer [In Russian]. Novosibirsk: "Nauka" Publisher; 1996.
  2. Wasserman PD. Neural computing: Theory and practice. New York: Van Nostrand Reinhold Co; 1989.
  3. Gorban AN, Dunin-Barkovsky VL, Kirdin AN, Mirkes EM, Novohod'ko AYu, Rossiev DA, Terekhov SA, Senashova MYu, Tsaregorodtsev VG. Neuroinformatics [In Russian]. Novosibirsk: "Nauka" Publisher; 1998.
  4. Voronovsky GK, Makhotilo KV, Petrashev SN, Sergeev SA. Genetic algorithms, artificial neural networks and virtual reality problems [In Russian]. Kharkiv: "Osnova" Publisher; 1997.
  5. Kruglov VV, Borisov VV. Artificial neural networks: theory and practice [In Russian]. Moscow: "Goryachaya Liniya Telekom" Publisher; 2001.
  6. Golovashkin DL. Parallel computing methods (Part I): Handbook [In Russian]. Samara: SSAU Publisher; 2002.
