Title: Descent Direction Stochastic Approximation Algorithm with Adaptive Step Sizes
Authors: Lužanin, Zorana
Stojkovska, Irena 
Kresoja, Milena
Keywords: Unconstrained optimization
Stochastic optimization
Stochastic approximation
Noisy function
Adaptive step size
Descent direction
Linear regression model
Issue Date: 1-Jun-2019
Publisher: Global Science Press
Project: Ministry of Education, Science and Technological Development of the Republic of Serbia, grant No. 174030, and Ss. Cyril and Methodius University of Skopje, Macedonia, scientific research projects for the 2014/2015 academic year
Journal: Journal of Computational Mathematics
Abstract: A stochastic approximation (SA) algorithm with new adaptive step sizes for solving unconstrained minimization problems in a noisy environment is proposed. The new adaptive step size scheme uses order statistics of a fixed number of previous noisy function values as a criterion for accepting good and rejecting bad steps. The scheme allows the algorithm to take larger steps and avoid steps proportional to 1/k when larger steps are expected to improve performance. An algorithm with the new adaptive scheme is defined for a general descent direction, and its almost sure convergence is established. The performance of the new algorithm is tested on a set of standard test problems and compared with relevant algorithms. Numerical results support the theoretical expectations and verify the efficiency of the algorithm regardless of the chosen search direction and noise level. Numerical results on problems arising in machine learning are also presented: a linear regression problem is considered using a real data set. The results suggest that the proposed algorithm shows promise.
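The abstract's adaptive idea can be illustrated with a minimal sketch. This is not the authors' exact scheme; it is an assumption-laden toy in which the newest noisy function value is compared against an order statistic (here, the median) of a short window of recent values, and a larger constant step is kept only when the step looks "good", with the classical 1/k step used otherwise. All names and parameter choices (`window`, `gamma_big`, the test objective) are illustrative, not from the paper.

```python
import random

def noisy_quadratic(x, sigma=0.1):
    # Noisy evaluation of f(x) = x^2 with additive Gaussian noise.
    return x * x + random.gauss(0.0, sigma)

def adaptive_sa(x0, iters=200, window=5, gamma_big=0.1, seed=0):
    """Illustrative SA iteration with an adaptive step size (not the
    paper's exact rule): keep the last `window` noisy function values;
    if the newest value is at or below the window's median (a 'good'
    step), keep the larger constant step, else fall back to 1/k."""
    random.seed(seed)
    x = x0
    history = []
    for k in range(1, iters + 1):
        g = 2 * x + random.gauss(0.0, 0.1)   # noisy gradient of x^2
        f = noisy_quadratic(x)
        history = (history + [f])[-window:]
        # Order-statistic test: is the newest value in the better half?
        good = f <= sorted(history)[len(history) // 2]
        step = gamma_big if good else 1.0 / k
        x = x - step * g
    return x

x_star = adaptive_sa(5.0)   # should end up near the minimizer x = 0
```

The point of the order-statistic test is that it uses only already-computed noisy function values, so rejecting the decaying 1/k schedule on good steps costs no extra function evaluations.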
DOI: 10.4208/jcm.1710-m2017-0021
Appears in Collections:Faculty of Natural Sciences and Mathematics: Journal Articles

