Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.12188/25693
DC Field | Value | Language
dc.contributor.author | Sandjakoska, Ljubinka | en_US
dc.contributor.author | Madevska Bogdanova, Ana | en_US
dc.date.accessioned | 2023-02-13T11:10:42Z | -
dc.date.available | 2023-02-13T11:10:42Z | -
dc.date.issued | 2022 | -
dc.identifier.uri | http://hdl.handle.net/20.500.12188/25693 | -
dc.description.abstract | In atomic simulations, regularization of deep neural networks is key to their successful application. The generalization capability of a deep network depends on several factors. This paper aims to show that the activation function is one of the most important factors influencing the reduction of generalization error. For that purpose, several experiments were performed. Moreover, a new approach for choosing the activation function is proposed. The purpose of the activation mechanism is not to find a universal new activation function, although this is not excluded, but the one most appropriate for the given task and the given data set. The obtained results show that the proposed activation approach achieves a decrease in mean absolute error compared to the benchmark set. | en_US
dc.subject | activation, regularization, deep neural networks, atomic simulations | en_US
dc.title | Activation functions’ impact on regularization of deep neural networks application in atomic simulations | en_US
dc.type | Proceedings | en_US
dc.relation.conference | The 19th International Conference on Informatics and Information Technologies – CIIT 2022 | en_US
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
Appears in Collections:Faculty of Computer Science and Engineering: Conference papers
Files in This Item:
File | Size | Format
CIIT_2022_19.pdf | 329.85 kB | Adobe PDF
Page view(s): 88 (checked on 4.5.2025)
Download(s): 23 (checked on 4.5.2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.