
Intelligent systems. Theory and applications, 2022 Volume 26, Issue 4, Pages 173–196 (Mi ista495)


Part 3. Mathematical models

Recovery of convex CPL-functions by neural networks over ReLU bases

V. G. Shishlyakov

Lomonosov Moscow State University, Faculty of Mechanics and Mathematics

Abstract: The present paper considers the problem of describing the functional classes obtained by neural networks over bases with max non-linearities. Firstly, some properties of CPL-functions and of the equivalence classes generating them are investigated. Proceeding from these properties, a theorem is proved stating that neural networks built from linear functions and the max non-linearity can exactly recover any convex CPL-function. Secondly, the ReLU basis, a special case of bases with max non-linearities, is investigated, and a theorem analogous to the previous one is proved. The question of estimating the number of neurons and layers in the resulting architectures is also discussed. All the theorems are proved constructively, i.e. neural network architectures with the stated properties are built explicitly.
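As a minimal sketch of the kind of construction described in the abstract (not the paper's own architecture), the snippet below represents a convex CPL-function as the pointwise maximum of finitely many affine pieces and then evaluates the same function with a layered network that uses only affine maps and ReLU, via the standard identity max(u, v) = u + ReLU(v − u). The coefficients A and b are illustrative placeholders.

```python
import numpy as np

# Illustrative affine pieces of a convex CPL-function f(x) = max_i (a_i . x + b_i).
A = np.array([[1.0, -2.0],
              [0.5,  3.0],
              [-1.0, 0.0]])          # slopes a_i
b = np.array([0.0, -1.0, 2.0])       # intercepts b_i

def f_cpl(x):
    """Convex CPL-function as a max of affine pieces."""
    return np.max(A @ x + b)

def relu(z):
    return np.maximum(z, 0.0)

def max2_via_relu(u, v):
    # Binary max expressed through ReLU: max(u, v) = u + ReLU(v - u).
    return u + relu(v - u)

def f_relu_network(x):
    """Same function computed with only affine maps and ReLU:
    pairwise maxima are folded in one at a time, so the depth grows
    with the number of affine pieces."""
    pieces = A @ x + b               # first affine layer
    out = pieces[0]
    for p in pieces[1:]:
        out = max2_via_relu(out, p)  # one ReLU "max gate" per extra piece
    return out

x = np.array([0.7, -1.3])
assert np.isclose(f_cpl(x), f_relu_network(x))
```

The sequential folding above is only one possible layout; a balanced pairwise reduction would give logarithmic depth in the number of pieces, which is the kind of neuron/layer count trade-off the abstract alludes to.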

Keywords: Neural networks, architecture, function recovery, function expressibility, convex functions, piecewise-linear functions, ReLU function, max function.
