Collection
Faculty of Natural Sciences and Mathematics, Institute of Mathematics: Journal Articles
Details

SCALE INVARIANT STOCHASTIC GRADIENT METHOD WITH MOMENTUM

Journal
Математички билтен/BULLETIN MATHÉMATIQUE DE LA SOCIÉTÉ DES MATHÉMATICIENS DE LA RÉPUBLIQUE MACÉDOINE
Date Issued
2023
Author(s)
Nikolovski, Filip
DOI
10.37560/matbil23472147n
Abstract
Optimization in noisy environments arises frequently in applications, so solving such problems quickly, efficiently, and accurately is of great importance. The stochastic gradient descent (SGD) method has proven to be a fundamental and effective tool, flexible enough to allow modifications that improve its convergence properties. In this paper we propose a new algorithm for solving unconstrained optimization problems in noisy environments, which combines SGD with a modified momentum term using a two-point step-size estimate in the Barzilai-Borwein (BB) framework. We perform a high-probability analysis of the proposed algorithm and establish its convergence under standard assumptions. Numerical experiments demonstrate promising behavior of the proposed method compared to "vanilla" SGD with momentum in noise-free and noisy environments when the objective function is scaled.
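The abstract describes combining SGD with momentum and a Barzilai-Borwein two-point step-size estimate. The paper's exact update rule is not reproduced on this record page, so the following is only a minimal sketch of that general idea: a heavy-ball momentum update whose step size is refreshed each iteration by the classical BB formula, with an assumed safeguard cap (`alpha_max`) since raw BB steps can be unstable under noise. All parameter names and defaults here are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def sgd_momentum_bb(grad, x0, beta=0.9, alpha0=0.1, n_iter=300, alpha_max=1.0):
    """Sketch: SGD with momentum, step size chosen by the BB1 two-point
    estimate alpha = <s, s> / <s, y>, where s = x_k - x_{k-1} and
    y = g_k - g_{k-1}.  Illustrative only; not the paper's algorithm."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)              # momentum buffer
    alpha = alpha0                    # initial step before BB is available
    x_prev, g_prev = None, None
    for _ in range(n_iter):
        g = grad(x)                   # (possibly noisy) gradient sample
        if g_prev is not None:
            s = x - x_prev            # iterate difference
            y = g - g_prev            # gradient difference
            denom = s @ y
            if denom > 1e-12:         # safeguard: keep alpha positive, bounded
                alpha = min((s @ s) / denom, alpha_max)
        x_prev, g_prev = x.copy(), g
        v = beta * v - alpha * g      # momentum (heavy-ball) update
        x = x + v
    return x
```

For a simple quadratic objective f(x) = ||x||^2 / 2 (gradient x), the BB estimate recovers the exact step 1/L and the iterates contract toward the minimizer at the origin.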
Subjects

numerical optimizatio...

File(s)
Name

mat-bilten-nikolovski-stojkovska-2023-47-no2 (2).pdf

Description
Journal Article
Size

224.38 KB

Format

Adobe PDF

Checksum

(MD5):8a59f6b6548b1a4b713f6dbf3a07b045

