Abstract:
This paper studies the classes of functions realized by neural networks built on max non-linearity bases. First, some properties of CPL (continuous piecewise linear) functions and of the equivalence classes generating them are investigated. Building on these properties, a theorem is proved stating that neural networks composed of linear functions and max non-linearities can exactly recover any convex CPL function.
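As a minimal sketch of the claim, a convex CPL function is a pointwise maximum of finitely many affine pieces, so a single max layer over affine units recovers it exactly. The concrete pieces below, for f(x) = |x| = max(x, -x), are an illustrative assumption and not taken from the paper:

```python
import numpy as np

# Two affine units whose pointwise maximum equals |x| = max(x, -x).
# These specific weights are an illustrative choice, not the paper's.
W = np.array([[1.0], [-1.0]])  # weights of the affine units
b = np.array([0.0, 0.0])       # biases

def max_network(x):
    """One affine layer followed by a max non-linearity."""
    return np.max(W @ np.atleast_1d(x) + b)

# The network matches |x| exactly at every point, not just approximately.
for x in [-2.0, -0.5, 0.0, 1.5]:
    assert max_network(x) == abs(x)
```

The same construction generalizes: any convex CPL function in any dimension is the maximum of its affine pieces, which is exactly a one-max-layer network over linear units.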
Second, the ReLU basis, a special case of max non-linearity bases, is investigated, and a theorem analogous to the one above is proved. The question of estimating the number of neurons and layers in the resulting architectures is also discussed.
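One way to see why the ReLU basis is a special case is the standard identity max(a, b) = b + relu(a - b), which lets a ReLU network emulate a max non-linearity; this sketch of the identity is illustrative and assumes nothing beyond the abstract:

```python
import numpy as np

def relu(z):
    """Rectified linear unit."""
    return np.maximum(z, 0.0)

# The identity max(a, b) = b + relu(a - b): a binary max is computable
# with a single ReLU, so max-based constructions transfer to ReLU networks.
def max_via_relu(a, b):
    return b + relu(a - b)

assert max_via_relu(3.0, 1.0) == 3.0
assert max_via_relu(-2.0, 5.0) == 5.0
```

An n-ary max can then be built from a tree of such binary maxes, which is one source of the depth and neuron-count estimates the abstract mentions.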
All the theorems mentioned have constructive proofs, i.e. neural network architectures with the stated properties are built explicitly.