Faculty of Mechanical Engineering
Permanent URI for this community: https://repository.ukim.mk/handle/20.500.12188/13
8 results
Search Results
- Item type: Publication. ON THE CONVERGENCE OF THE PROXIMAL GRADIENT METHOD WITH VARIABLE STEP SIZES (Union of Mathematicians of Macedonia, 2025). Nikolovski, Filip.
  Composite optimization problems arise frequently in modeling, since the objective function may contain components that lack "nice" properties such as differentiability; l1 (LASSO) regularization is one such example. Proximal gradient methods are designed to handle this kind of optimization problem and can solve it efficiently when the proximal mapping has a closed-form solution. Theoretical analyses of the proximal gradient method with constant step size have shown sublinear convergence for convex and linear convergence for strongly convex objective functions. In this paper we show that, under standard assumptions, the same convergence results can be established for the proximal gradient method with variable step sizes, in the general setting of bounded step sizes. Further, a recently proposed step-size selection for the proximal gradient method with variable step sizes is considered, and the above convergence analysis is carried out for that method.
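The setting in this abstract can be made concrete with a minimal sketch of the proximal gradient iteration on an l1-regularized least-squares (LASSO) problem, where the proximal mapping is the closed-form soft-thresholding operator. The function names, the step-size callback, and the test problem below are illustrative assumptions, not code from the paper; a constant step 1/L is shown, but any bounded variable step size fits the same loop.

```python
import numpy as np

def soft_threshold(v, tau):
    """Closed-form proximal mapping of tau*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, step_size, x0, n_iter=500):
    """min_x 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient steps.

    `step_size(k)` may return a constant or a variable (bounded) step t_k;
    the update is x_{k+1} = prox_{t_k*lam*||.||_1}(x_k - t_k*grad f(x_k)).
    """
    x = x0.copy()
    for k in range(n_iter):
        t = step_size(k)
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        x = soft_threshold(x - t * grad, t * lam)
    return x

# Illustrative use with constant step 1/L, L = ||A||_2^2 (global Lipschitz constant):
rng = np.random.default_rng(0)
A, b = rng.standard_normal((40, 100)), rng.standard_normal(40)
L = np.linalg.norm(A, 2) ** 2
x = proximal_gradient(A, b, lam=0.1, step_size=lambda k: 1.0 / L, x0=np.zeros(100))
```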
- Item type: Publication. Gradient Descent Methods for Regularized Optimization (Macedonian Academy of Sciences and Arts, 2024). Nikolovski, Filip, et al.
  Regularization is a widely recognized technique in mathematical optimization. It can be used to smooth out objective functions, refine the feasible solution set, or prevent overfitting in machine learning models. Due to its simplicity and robustness, the gradient descent (GD) method is one of the primary methods used for numerical optimization of differentiable objective functions. However, GD is not well-suited for solving l1-regularized optimization problems, since these problems are non-differentiable at zero, causing iteration updates to oscillate or fail to converge. Instead, a more effective variant of GD, called proximal gradient descent, employs a technique known as soft-thresholding to shrink the iterates toward zero, thus enabling sparsity in the solution. Motivated by the widespread applications of proximal GD in sparse and low-rank recovery across various engineering disciplines, we provide an overview of the GD and proximal GD methods for solving regularized optimization problems. Furthermore, this paper proposes a novel proximal GD algorithm that incorporates a variable step size. Unlike conventional proximal GD, which uses a fixed step size based on the global Lipschitz constant, our method estimates the Lipschitz constant locally at each iteration and uses its reciprocal as the step size. This eliminates the need for a global Lipschitz constant, which can be impractical to compute. Numerical experiments on synthetic and real data sets show a notable performance improvement of the proposed method over conventional proximal GD with a constant step size, both in the number of iterations and in time requirements.
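As one way to picture the variable-step idea described above, the sketch below obtains a per-iteration step by backtracking: the step t is shrunk until a standard quadratic upper-bound condition on the smooth part holds, so 1/t acts as a local Lipschitz estimate. This is an assumed, generic realization (standard proximal-gradient backtracking), not the authors' exact step-size rule; all names are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad_local(A, b, lam, x0, t0=1.0, beta=0.5, n_iter=300):
    """Proximal GD whose step is chosen per iteration by backtracking.

    The step t is halved until the quadratic upper-bound condition
    f(y) <= f(x) + grad^T (y - x) + ||y - x||^2 / (2t) holds for the
    smooth part f(x) = 0.5*||Ax - b||^2, so 1/t serves as a local
    Lipschitz estimate (an illustrative stand-in for the paper's rule).
    """
    f = lambda z: 0.5 * np.sum((A @ z - b) ** 2)
    x = x0.copy()
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)
        t = t0                            # restart the local estimate
        while True:
            x_new = soft_threshold(x - t * g, t * lam)
            d = x_new - x
            if f(x_new) <= f(x) + g @ d + (d @ d) / (2 * t):
                break
            t *= beta                     # shrink until the bound holds
        x = x_new
    return x
```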
- Item type: Publication. On the difference of the moduli of the two initial logarithmic coefficients (2025-03). Obradovic, Milutin.
- Item type: Publication. Complex-step derivative approximation in noisy environment (Elsevier BV, 2018-01). Nikolovski, Filip.
  The complex-step derivative approximation is a powerful method for approximating derivatives that has been successfully implemented in deterministic numerical algorithms. We explore and analyze its implementation in a noisy environment through examples, error analysis, and application to optimization methods. Numerical results show promising performance of the complex-step gradient approximation in a noisy environment.
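For reference, the complex-step approximation evaluates f'(x) ≈ Im f(x + ih)/h: since no subtraction of nearly equal quantities occurs, h can be taken extremely small without cancellation error, which is what makes it attractive when function values are noisy. A minimal sketch (the test function and step size are illustrative, not from the paper):

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-20):
    """Complex-step approximation f'(x) ~ Im(f(x + i*h)) / h.

    Requires f to be analytic and implemented with complex-capable
    operations; no subtractive cancellation occurs, so h may be tiny.
    """
    return np.imag(f(x + 1j * h)) / h

# Classic smooth test function; the result matches f'(1.5) to machine precision.
f = lambda x: np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)
print(complex_step_derivative(f, 1.5))
```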
- Item type: Publication. A nonmonotone line search method for noisy minimization (Springer Science and Business Media LLC, 2015-01-24). Krejić, Nataša; Lužanin, Zorana; Nikolovski, Filip.
  A nonmonotone line search method for optimization in a noisy environment is proposed. The method is defined for arbitrary search directions and uses only the noisy function values. Convergence of the proposed method is established under a set of standard assumptions. Computational issues are considered, and the presented numerical results affirm that nonmonotone strategies are worth considering. Four different line search rules with three different directions are compared numerically, and the influence of nonmonotonicity is discussed.
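A common max-type nonmonotone rule (in the spirit of Grippo, Lampariello and Lucidi) accepts a trial step when the function value drops below the maximum of the last M recorded values rather than the current one, which tolerates the spurious increases that noisy evaluations produce. The sketch below pairs this rule with steepest-descent directions as a generic illustration; it is an assumption, not the paper's specific line search rules or directions.

```python
import numpy as np
from collections import deque

def nonmonotone_descent(f, grad, x0, n_iter=100, M=10, sigma=1e-4, beta=0.5):
    """Gradient descent with a max-type nonmonotone Armijo line search.

    A trial step t is accepted once f(x + t*d) <= max(last M values)
    + sigma * t * grad(x)^T d; comparing against a window of past values
    (rather than f(x) alone) tolerates occasional noisy increases.
    """
    x = np.asarray(x0, dtype=float).copy()
    history = deque([f(x)], maxlen=M)     # last M (noisy) function values
    for _ in range(n_iter):
        g = grad(x)
        d = -g                            # steepest-descent direction
        ref = max(history)                # nonmonotone reference value
        t = 1.0
        while f(x + t * d) > ref + sigma * t * (g @ d) and t > 1e-12:
            t *= beta                     # backtrack on failure
        x = x + t * d
        history.append(f(x))
    return x

# Illustrative use on a quadratic with small additive noise:
rng = np.random.default_rng(0)
f = lambda z: z @ z + 1e-6 * rng.standard_normal()
grad = lambda z: 2 * z
print(nonmonotone_descent(f, grad, x0=np.ones(5)))
```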
