Abstract:
Bipolar morphological (BM) neural networks are aimed at efficient hardware implementation without multiplications inside the convolutional layers. However, they use resource-demanding activation functions based on the binary logarithm and the exponent. In this paper, computationally efficient approximations of the activation functions of BM neural networks are considered. Mitchell's approximation is used for the binary logarithm and demonstrates a 12-fold decrease in the estimated logic gate count and latency. Schraudolph's approximation of the exponent yields 3 times lower logic gate complexity and latency. The approximate activation functions provide a 12–40% latency decrease for BM convolutional layers with a small number of input channels and 3 $\times$ 3 filters compared to the standard ones. Experiments show that these approximations can be used in a BM ResNet trained for a classification task, with a modest recognition accuracy decrease from 99.08% to 98.90%.
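To make the two approximations named above concrete, the following is a minimal software sketch (not the paper's hardware implementation). Mitchell's approximation replaces $\log_2(1+f)$ by $f$ for the mantissa fraction $f \in [0, 1)$, and Schraudolph's approximation builds $e^y$ by writing a scaled-and-biased integer into the exponent bits of an IEEE 754 double; the constant 60801 is Schraudolph's published error-balancing shift. Function names are illustrative.

```python
import math
import struct

def mitchell_log2(x: float) -> float:
    """Mitchell's approximation of log2(x) for x > 0.

    Decompose x = 2**(e-1) * (1 + f) with f in [0, 1),
    then approximate log2(1 + f) ~ f (max error ~0.086).
    """
    m, e = math.frexp(x)          # x = m * 2**e, with 0.5 <= m < 1
    return (e - 1) + (2.0 * m - 1.0)

def schraudolph_exp(y: float) -> float:
    """Schraudolph's fast approximation of exp(y).

    Writes a linear function of y into the upper 32 bits of an
    IEEE 754 double, so the exponent field does the heavy lifting.
    Relative error is within a few percent over a moderate range.
    """
    a = 2**20 / math.log(2)       # scale: maps y to the exponent field
    b = 1023 * 2**20 - 60801      # bias minus Schraudolph's shift constant
    hi = int(a * y + b)           # upper 32 bits of the double
    return struct.unpack('<d', struct.pack('<q', hi << 32))[0]
```

In hardware, both tricks avoid general-purpose multipliers: Mitchell's method needs only the exponent and mantissa fields of the input, and Schraudolph's method reduces the exponent to one multiply-add and a bit-field write.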