Informatsionnye Tekhnologii i Vychislitel'nye Sistemy, 2023, Issue 3, Pages 46–54 (Mi itvs820)

INTELLIGENT SYSTEMS AND TECHNOLOGIES

Layer-wise knowledge distillation for simplified bipolar morphological neural networks

M. V. Zingerenko (a, b), E. E. Limonova (b, c)

(a) Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, Russia
(b) Smart Engines Service LLC, Moscow, Russia
(c) Federal Research Center "Computer Science and Control" of Russian Academy of Sciences, Moscow, Russia

Abstract: Various neuron approximations can be used to reduce the computational complexity of neural networks. One such approximation, based on summation and maximum operations, is the bipolar morphological neuron. This paper presents an improved structure of the bipolar morphological neuron that enhances its computational efficiency, as well as a new training approach based on continuous approximations of the maximum and on knowledge distillation. Experiments were conducted on the MNIST dataset with a LeNet-like neural network architecture and on the CIFAR10 dataset with a ResNet-22 architecture. The proposed training method achieves 99.45% classification accuracy with the LeNet-like model, matching the accuracy of the classical network, and 86.69% accuracy with the ResNet-22 model, compared to 86.43% for the classical model. The results show that the proposed method, combining a log-sum-exp (LSE) approximation of the maximum with layer-wise knowledge distillation, yields a simplified bipolar morphological network that is not inferior to classical networks.
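
To illustrate the ingredients named in the abstract, the sketch below shows a bipolar morphological layer whose hard maximum is replaced during training by an LSE smooth maximum, plus a layer-wise distillation loss that matches the BM layer's activations to those of a classical teacher layer. This is a minimal sketch, not the authors' code: the four-branch BM formulation follows earlier BM publications, and all names (BMNeuronLayer, smooth_max, EPS), the initialization, and the MSE-based layer matching are illustrative assumptions; the paper's improved neuron structure and exact training procedure may differ.

```python
# Hedged sketch: BM neuron with an LSE smooth maximum and a layer-wise
# distillation loss. Assumed formulation, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

EPS = 1e-6  # avoids log(0) on the zeroed-out part of the input


def smooth_max(z, t, dim=-1):
    """LSE approximation of max: (1/t) * log(sum(exp(t * z))).

    Approaches the hard maximum as the temperature t grows, so t can be
    annealed during training and the hard max used at inference.
    """
    return torch.logsumexp(t * z, dim=dim) / t


class BMNeuronLayer(nn.Module):
    """Fully connected bipolar morphological layer (assumed four-branch form).

    Multiplications become additions in log-space, and the summation over
    inputs is replaced by a (smoothed) maximum.
    """

    def __init__(self, in_features, out_features):
        super().__init__()
        # v_plus / v_minus act as log-magnitudes of the positive and negative
        # weight parts (hypothetical initialization).
        self.v_plus = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        self.v_minus = nn.Parameter(torch.randn(out_features, in_features) * 0.1)

    def forward(self, x, t=10.0):
        # Split the input into positive and negative parts, go to log-space.
        log_xp = torch.log(F.relu(x) + EPS)   # ln(x^+)
        log_xn = torch.log(F.relu(-x) + EPS)  # ln(x^-)

        def branch(log_x, v):
            # Broadcast to (batch, out_features, in_features), then take the
            # smoothed max over inputs and return to linear scale.
            z = log_x.unsqueeze(1) + v.unsqueeze(0)
            return torch.exp(smooth_max(z, t, dim=-1))

        # Four max-plus branches combined with bipolar signs.
        return (branch(log_xp, self.v_plus) - branch(log_xn, self.v_plus)
                - branch(log_xp, self.v_minus) + branch(log_xn, self.v_minus))


def layerwise_distillation_loss(student_act, teacher_act):
    """Layer-wise distillation: match the BM layer to the classical layer."""
    return F.mse_loss(student_act, teacher_act)


# Usage example with hypothetical sizes: distill one classical Linear layer
# into a BM layer by matching activations on a batch of inputs.
teacher = nn.Linear(784, 128)
student = BMNeuronLayer(784, 128)
x = torch.randn(32, 784)
loss = layerwise_distillation_loss(student(x, t=10.0), teacher(x))
loss.backward()
```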

Keywords: bipolar morphological networks, approximations, artificial neural networks, computational efficiency.

DOI: 10.14357/20718632230305


