SCALE INVARIANT STOCHASTIC GRADIENT METHOD WITH MOMENTUM
Journal
Математички билтен/BULLETIN MATHÉMATIQUE DE LA SOCIÉTÉ DES MATHÉMATICIENS DE LA RÉPUBLIQUE MACÉDOINE
Date Issued
2023
Author(s)
Nikolovski, Filip
DOI
10.37560/matbil23472147n
Abstract
Optimization in noisy environments arises frequently in applications, so solving such problems quickly, efficiently, and accurately is of great importance. The stochastic gradient descent (SGD) method has proven to be a fundamental
and effective tool, flexible enough to allow modifications that improve its convergence properties. In this paper we propose a new algorithm for solving unconstrained optimization problems in noisy environments, which combines SGD with a modified momentum term that uses a two-point step-size estimate in the Barzilai-Borwein (BB) framework. We perform a high-probability analysis of the proposed algorithm and establish its convergence under standard assumptions. Numerical experiments demonstrate promising behavior of the proposed method compared to "vanilla" SGD with momentum in noise-free and noisy environments when the objective function is scaled.
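The abstract does not specify the exact form of the modified momentum term or the high-probability safeguards, so the following Python sketch only illustrates the general idea: heavy-ball SGD whose step size is re-estimated each iteration with the BB two-point formula from successive iterate and stochastic-gradient differences. The function name, the clipping bounds, and the classical heavy-ball form of the momentum are all assumptions for illustration, not the paper's algorithm.

    import numpy as np

    def sgd_momentum_bb(grad, x0, n_iters=100, beta=0.9, alpha0=0.01,
                        alpha_min=1e-8, alpha_max=1.0):
        """Sketch: SGD with heavy-ball momentum where the step size is
        re-estimated each iteration via the Barzilai-Borwein (BB)
        two-point formula alpha = s^T s / s^T y, with s and y the
        differences of successive iterates and stochastic gradients."""
        x = x0.astype(float).copy()
        v = np.zeros_like(x)
        x_prev, g_prev = None, None
        alpha = alpha0
        for _ in range(n_iters):
            g = grad(x)                      # stochastic gradient oracle
            if g_prev is not None:
                s = x - x_prev               # iterate difference
                y = g - g_prev               # stochastic-gradient difference
                denom = s @ y
                if denom > 0:                # BB1 step, kept in a safe range
                    alpha = np.clip(s @ s / denom, alpha_min, alpha_max)
            x_prev, g_prev = x.copy(), g
            v = beta * v - alpha * g         # heavy-ball momentum update
            x = x + v
        return x

    # Usage on a badly scaled noisy quadratic (hypothetical test problem):
    rng = np.random.default_rng(0)
    A = np.diag([1.0, 100.0])
    noisy_grad = lambda x: A @ x + 0.01 * rng.standard_normal(2)
    x_min = sgd_momentum_bb(noisy_grad, np.array([1.0, 1.0]))

Because the BB step adapts to the local curvature of the objective, such a scheme is insensitive to a uniform rescaling of the objective function, which is the scale-invariance property the title refers to.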
