
Vestn. Udmurtsk. Univ. Mat. Mekh. Komp. Nauki, 2018 Volume 28, Issue 2, Pages 260–274 (Mi vuu637)

COMPUTER SCIENCE

Neural networks with dynamical coefficients and adjustable connections on the basis of integrated backpropagation

M. N. Nazarov

National Research University of Electronic Technology, pl. Shokina, 1, Zelenograd, Moscow, 124498, Russia

Abstract: We consider artificial neurons that update their weight coefficients via an internal rule based on backpropagation, rather than using backpropagation as an external training procedure. To achieve this, we include the backpropagation error estimate as a separate entity in all the neuron models and exchange it along the synaptic connections. In addition, we introduce a special type of neuron with reference inputs, which serves as the base source of error estimates for the whole network. Finally, we introduce a training control signal for all the neurons, which enables the correction of weights and the exchange of error estimates. For recurrent neural networks, we also demonstrate how to integrate backpropagation through time into this formalism with the help of stack memory for the reference inputs and external data inputs of neurons. For widely used architectures, such as long short-term memory networks, radial basis function networks, multilayer perceptrons, and convolutional neural networks, we demonstrate an alternative description within the framework of our new formalism. As a useful consequence, our approach makes it possible to introduce neural networks with adjustable synaptic connections tied to the integrated backpropagation.
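The mechanism the abstract describes — each neuron carrying its own error estimate, a reference-input neuron emitting the base error, and a control signal gating weight updates — can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's formalism: the class and method names (`Neuron`, `ReferenceNeuron`, `receive_error`, the `train` flag) and the sigmoid activation are all assumptions made for the example.

```python
import math
import random

class Neuron:
    """A neuron that stores its own backpropagation error estimate
    and corrects its weights only when the training control signal
    (the `train` flag) is enabled."""

    def __init__(self, n_inputs, lr=0.5, seed=0):
        rng = random.Random(seed)
        self.w = [rng.uniform(-0.5, 0.5) for _ in range(n_inputs)]
        self.b = 0.0
        self.lr = lr
        self.x = [0.0] * n_inputs  # last inputs, kept for the update rule
        self.y = 0.0               # last output
        self.delta = 0.0           # internal error estimate

    def forward(self, x):
        self.x = list(x)
        s = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        self.y = 1.0 / (1.0 + math.exp(-s))  # sigmoid activation
        return self.y

    def receive_error(self, err, train=True):
        """Accept an error estimate from downstream, update weights
        internally if training is enabled, and return the error
        estimates to pass upstream along each synaptic connection."""
        self.delta = err * self.y * (1.0 - self.y)   # sigmoid derivative
        upstream = [wi * self.delta for wi in self.w]
        if train:  # training control signal gates the weight correction
            for i, xi in enumerate(self.x):
                self.w[i] -= self.lr * self.delta * xi
            self.b -= self.lr * self.delta
        return upstream

class ReferenceNeuron:
    """A neuron with a reference input: it compares an output to the
    reference (target) value and emits the base error estimate."""

    def error(self, y, target):
        return y - target  # derivative of 0.5 * (y - target)^2

# Tiny two-neuron chain trained to map input [1, 1] to output 1.0.
hidden = Neuron(2, lr=0.5, seed=1)
out = Neuron(1, lr=0.5, seed=2)
ref = ReferenceNeuron()
for _ in range(2000):
    h = hidden.forward([1.0, 1.0])
    y = out.forward([h])
    e = ref.error(y, 1.0)                 # base error from reference input
    back = out.receive_error(e, train=True)
    hidden.receive_error(back[0], train=True)
```

Note that, in this sketch, training the network is just repeated forward passes plus error exchange with the control signal on; running the same loop with `train=False` would propagate error estimates without touching any weights, which mirrors the abstract's separation of error exchange from weight correction.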

Keywords: artificial neurons, backpropagation, adaptive connection adjustment, recurrent neural networks.

UDC: 519.68, 007.5

MSC: 68T05, 62M86

Received: 22.05.2018

Language: English

DOI: 10.20537/vm180212


