Evaluation of Sigmoid and ReLU Activation Functions Using Asymptotic Method

Abstract

In algorithm analysis, an asymptotic technique compares the performance of two algorithms by focusing on how their running time grows as the number of inputs grows. The sigmoid and ReLU activation functions are widely employed in ANNs (Yingying, 2020), and each has advantages and disadvantages that should be considered when designing an ANN solution for a given problem. This study compared the performance of the sigmoid and ReLU activation functions during training using an asymptotic approach, taking training time complexity as the basis of comparison. The results show that the sigmoid activation function takes more computation time than ReLU in the forward pass, loss computation, and backward propagation. This computational cost becomes significant in deep neural networks with hundreds to thousands of neurons. Overall, the training time of a ReLU-based neural network is shorter than that of a sigmoid-based one. Sigmoid has a higher computational cost than ReLU, but both functions exhibit a linear growth rate.
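
The following minimal sketch (not taken from the paper) illustrates the claim: both activations scale linearly with the number of inputs, but sigmoid's per-element work (an exponential and a division in the forward pass, a product in the backward pass) carries a larger constant factor than ReLU's single comparison. The array size and repetition count are arbitrary choices for illustration.

    import numpy as np
    import timeit

    def sigmoid(x):
        # 1 / (1 + e^-x): one exp, one add, one divide per element
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_grad(y):
        # Derivative expressed via the forward output: y * (1 - y)
        return y * (1.0 - y)

    def relu(x):
        # max(0, x): a single comparison per element
        return np.maximum(0.0, x)

    def relu_grad(x):
        # Derivative is 1 where x > 0, else 0
        return (x > 0).astype(x.dtype)

    x = np.random.randn(1_000_000)

    # Both timings grow linearly with the input size; sigmoid's constant is larger.
    t_sig = timeit.timeit(lambda: sigmoid_grad(sigmoid(x)), number=20)
    t_relu = timeit.timeit(lambda: relu_grad(relu(x)), number=20)
    print(f"sigmoid fwd+grad: {t_sig:.3f}s, relu fwd+grad: {t_relu:.3f}s")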

Keywords: Backpropagation, Loss computation, Sigmoid activation, ReLU activation, ANNs
