Activation functions’ impact on regularization of deep neural networks application in atomic simulations
Date Issued
2022
Author(s)
Sandjakoska, Ljubinka
Madevska Bogdanova, Ana
Abstract
When it comes to atomic simulations, regularization of deep neural networks is key to their successful application. The generalization capability of a deep network depends on several factors. This paper aims to show that the activation function is one of the most important factors influencing the decrease of the generalization error. For that purpose, several experiments were performed. Moreover, a new approach for choosing the activation function is proposed. The purpose of the activation mechanism is not to find a universal new activation function, although this is not excluded, but the one most appropriate for the given task and data set. The obtained results show that, using the proposed activation approach, a decrease in the mean absolute error compared to the benchmark set is achieved.
File(s)
Name
CIIT_2022_19.pdf
Size
329.85 KB
Format
Adobe PDF
Checksum
(MD5):d2d0e8d36e54e858819872899e86a744
