Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.12188/30395
DC Field: Value (Language)
dc.contributor.author: Vodilovska, Viktorija (en_US)
dc.contributor.author: Gievska, Sonja (en_US)
dc.contributor.author: Ivanoska, Ilinka (en_US)
dc.date.accessioned: 2024-06-05T08:48:14Z
dc.date.available: 2024-06-05T08:48:14Z
dc.date.issued: 2023-05-22
dc.identifier.uri: http://hdl.handle.net/20.500.12188/30395
dc.description.abstract: Graph Neural Networks (GNNs) have emerged as increasingly attractive deep learning models for complex data, making them extremely useful in the biochemical and pharmaceutical domains. However, building a well-performing GNN involves many parameter choices, and hyperparameter optimization (HPO) can aid in exploring the space of possible solutions. This study presents a comparative analysis of several strategies for hyperparameter optimization of GNNs. The explored techniques include bio-inspired algorithms such as the Genetic Algorithm, Particle Swarm Optimization, and Artificial Bee Colony, as well as Hill Climbing and Simulated Annealing, and the commonly used Random Search and Bayesian Search. The optimization algorithms have been evaluated on improving the performance of the GNN architectures developed for predicting mRNA degradation, with the Stanford OpenVaccine mRNA degradation dataset used for training and testing the predictive models. Finding mRNA molecules with low degradation rates is important for the development of mRNA vaccines for diseases such as COVID-19, and we hope to benefit machine learning research in this domain. According to the findings of the analysis, the Simulated Annealing algorithm outperforms the other algorithms on both architectures. Furthermore, population-based algorithms such as Particle Swarm Optimization show promising results, with certain limitations related to their complexity, which encourages further exploration of the subject. (en_US)
dc.publisher: IEEE (en_US)
dc.subject: Hyperparameter Optimization, Random Search, Bayesian Search, Hill Climbing, Simulated Annealing, Genetic Algorithm, Artificial Bee Colony, Particle Swarm Optimization, GCN, GAT, mRNA degradation, mRNA vaccines (en_US)
dc.title: Hyperparameter Optimization of Graph Neural Networks for mRNA Degradation Prediction (en_US)
dc.type: Proceeding article (en_US)
dc.relation.conference: 2023 46th MIPRO ICT and Electronics Convention (MIPRO) (en_US)
item.grantfulltext: none
item.fulltext: No Fulltext
crisitem.author.dept: Faculty of Computer Science and Engineering
crisitem.author.dept: Faculty of Computer Science and Engineering
Appears in Collections: Faculty of Computer Science and Engineering: Conference papers
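The abstract above only names the optimization strategies, so the following is a minimal, hypothetical sketch of how a simulated-annealing hyperparameter search over a small GNN configuration space might look. The search space, the train_and_evaluate stub, and the cooling schedule are illustrative assumptions and do not come from the paper; in the actual study this objective would train a GCN or GAT on the OpenVaccine data and return its validation error.

```python
import math
import random

# Hypothetical search space for a GCN/GAT-style model; the hyperparameter
# ranges used in the paper are not given in this record.
SEARCH_SPACE = {
    "hidden_dim": [32, 64, 128, 256],
    "num_layers": [2, 3, 4],
    "dropout": [0.0, 0.1, 0.2, 0.3, 0.5],
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
}


def random_config():
    """Sample a random configuration from the search space."""
    return {name: random.choice(values) for name, values in SEARCH_SPACE.items()}


def neighbor(config):
    """Re-sample one hyperparameter to produce a neighboring configuration."""
    new_config = dict(config)
    name = random.choice(list(SEARCH_SPACE))
    new_config[name] = random.choice(SEARCH_SPACE[name])
    return new_config


def train_and_evaluate(config):
    """Stand-in objective. In the study this step would train a GCN or GAT on
    the OpenVaccine data and return a validation error; a synthetic score is
    returned here only to keep the sketch self-contained and runnable."""
    return random.random()


def simulated_annealing(n_iters=50, t_start=1.0, t_end=0.01):
    """Minimize the validation error with a geometric cooling schedule."""
    current = random_config()
    current_loss = train_and_evaluate(current)
    best, best_loss = current, current_loss
    for i in range(n_iters):
        # Temperature decays geometrically from t_start to t_end.
        temperature = t_start * (t_end / t_start) ** (i / max(n_iters - 1, 1))
        candidate = neighbor(current)
        candidate_loss = train_and_evaluate(candidate)
        # Always accept improvements; accept worse configurations with a
        # probability that shrinks as the temperature drops.
        accept = candidate_loss < current_loss or random.random() < math.exp(
            (current_loss - candidate_loss) / temperature
        )
        if accept:
            current, current_loss = candidate, candidate_loss
        if current_loss < best_loss:
            best, best_loss = current, current_loss
    return best, best_loss


if __name__ == "__main__":
    config, loss = simulated_annealing()
    print("best configuration:", config, "validation error:", loss)
```

The same loop structure carries over to the other single-solution methods named in the abstract: Hill Climbing corresponds to accepting only improving neighbors (temperature fixed at zero), while Random Search simply draws independent configurations without using neighbors at all.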
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.