
Vestnik S.-Petersburg Univ. Ser. 10. Prikl. Mat. Inform. Prots. Upr., 2016 Issue 4, Pages 66–74 (Mi vspui311)

This article is cited in 2 papers

Computer science

On an algorithm for consistent weight initialization of deep neural networks and for training neural network ensembles

I. S. Drokin

St. Petersburg State University, 7–9, Universitetskaya nab., St. Petersburg, 199034, Russian Federation

Abstract: The use of pretraining mechanisms for multilayer perceptrons has greatly improved the quality and speed of training deep networks. In this paper we propose an alternative approach to weight initialization that draws on the principles of supervised learning, self-taught learning, and transfer learning. Specifically, we propose an iterative weight-initialization algorithm based on successively refining the weights of the hidden layers of a neural network by solving the original classification or regression problem, together with a method for constructing a neural network ensemble that follows naturally from the proposed training algorithm. Experiments demonstrating the performance of the approach are reported, and further steps and directions for developing the method are outlined. Refs 14. Figs 5. Tables 2.
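
The following is a minimal sketch of the kind of iterative, supervised layer-wise initialization described in the abstract: hidden layers are initialized one at a time by solving the original classification task with a progressively deeper network, and the trained intermediate networks are kept as ensemble members. The layer sizes, optimizer, and training schedule are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch of iterative supervised weight initialization with an ensemble
# as a by-product (assumptions: PyTorch, plain MLPs, Adam, cross-entropy).
import torch
import torch.nn as nn


def build_net(hidden_sizes, n_in, n_out):
    """Plain MLP: n_in -> hidden_sizes -> n_out."""
    layers, prev = [], n_in
    for h in hidden_sizes:
        layers += [nn.Linear(prev, h), nn.ReLU()]
        prev = h
    layers.append(nn.Linear(prev, n_out))
    return nn.Sequential(*layers)


def train(net, X, y, epochs=50, lr=1e-2):
    """Solve the original classification problem for the current network."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(net(X), y)
        loss.backward()
        opt.step()
    return net


def iterative_init(hidden_sizes, X, y, n_out):
    """Grow the network layer by layer; each stage is trained on the
    original task and its hidden-layer weights seed the next, deeper
    stage. The trained stages are collected as ensemble members."""
    n_in = X.shape[1]
    ensemble, prev_net = [], None
    for depth in range(1, len(hidden_sizes) + 1):
        net = build_net(hidden_sizes[:depth], n_in, n_out)
        if prev_net is not None:
            # copy the already-initialized hidden layers from the previous stage
            prev_hidden = [m for m in prev_net if isinstance(m, nn.Linear)][:-1]
            new_linears = [m for m in net if isinstance(m, nn.Linear)]
            for src, dst in zip(prev_hidden, new_linears):
                dst.load_state_dict(src.state_dict())
        train(net, X, y)
        ensemble.append(net)
        prev_net = net
    return ensemble[-1], ensemble


# Toy usage on random data (3-class problem, 20 features).
X = torch.randn(256, 20)
y = torch.randint(0, 3, (256,))
final_net, ensemble = iterative_init([64, 32, 16], X, y, n_out=3)
```

Ensemble predictions can then be obtained, for example, by averaging the softmax outputs of all stages; whether to average, vote, or weight the members is a design choice not specified in the abstract.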

Keywords: deep learning, neural networks weights initialization, ensemble of neural networks.

UDC: 519.688

Received: May 19, 2016
Accepted: September 29, 2016

DOI: 10.21638/11701/spbu10.2016.406


