
Computer Research and Modeling, 2023 Volume 15, Issue 2, Pages 393–412 (Mi crm1067)

This article is cited in 1 paper

MATHEMATICAL MODELING AND NUMERICAL SIMULATION

Subgradient methods for weakly convex and relatively weakly convex problems with a sharp minimum

F. S. Stonyakin (a,b), S. S. Ablaev (a,b), I. V. Baran (b), M. S. Alkousa (a,c)

a Moscow Institute of Physics and Technology, 9 Institutskiy per., Dolgoprudny, Moscow region, 141701, Russia
b V. I. Vernadsky Crimean Federal University, 4 Academician Vernadsky Avenue, Simferopol, Republic of Crimea, 295007, Russia
c HSE University, 20 Myasnitskaya st., Moscow, 101000, Russia

Abstract: This work studies subgradient methods with several variants of the Polyak stepsize for minimizing functions from the classes of weakly convex and relatively weakly convex functions that possess a corresponding analogue of a sharp minimum. It turns out that, under certain assumptions on the starting point, such an approach makes it possible to justify convergence of the subgradient method at the rate of a geometric progression. For the subgradient method with the Polyak stepsize, a refined convergence rate estimate is proved for minimization problems with weakly convex functions having a sharp minimum. The distinctive feature of this estimate is that it additionally takes into account the decrease of the distance from the current iterate to the set of solutions as the number of iterations grows. Results of numerical experiments for the phase retrieval problem (which is weakly convex and has a sharp minimum) are presented, demonstrating the effectiveness of the proposed approach to estimating the convergence rate in comparison with the known one. Next, we propose a variant of the subgradient method with switching between productive and non-productive steps for weakly convex problems with inequality constraints, and we obtain the corresponding analogue of the result on convergence at the rate of a geometric progression. For the subgradient method with the corresponding variant of the Polyak stepsize on the class of relatively Lipschitz and relatively weakly convex functions with a relative analogue of a sharp minimum, conditions are obtained that guarantee convergence of such a subgradient method at the rate of a geometric progression. Finally, a theoretical result is obtained that describes the influence of the error in the information about the (sub)gradient and the objective function available to the subgradient method on the quality estimate of the resulting approximate solution. It is proved that for a sufficiently small error $\delta>0$, one can guarantee that the accuracy of the solution is comparable to $\delta$.
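The basic scheme discussed in the abstract, the subgradient method with the Polyak stepsize $x_{k+1} = x_k - \frac{f(x_k) - f^*}{\|g_k\|^2}\, g_k$ (which assumes the optimal value $f^*$ is known), can be sketched as follows. This is a minimal NumPy illustration, not the authors' experimental code; the $\ell_1$-norm toy objective below is an assumed stand-in for a nonsmooth function with a sharp minimum, not the phase retrieval objective from the paper.

```python
import numpy as np

def polyak_subgradient(f, subgrad, f_star, x0, n_iters=500):
    """Subgradient method with the Polyak stepsize:
    x_{k+1} = x_k - ((f(x_k) - f*) / ||g_k||^2) * g_k,
    assuming the optimal value f* is known."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = subgrad(x)
        gap = f(x) - f_star
        norm2 = float(np.dot(g, g))
        if gap <= 0.0 or norm2 == 0.0:
            break  # reached the solution set (or a zero subgradient)
        x = x - (gap / norm2) * g
    return x

# Toy example with a sharp minimum: f(x) = ||x||_1, f* = 0 at x* = 0.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)  # a subgradient of the l1 norm
x = polyak_subgradient(f, subgrad, 0.0, np.array([3.0, -2.0]))
```

On this toy problem the iterates reach the minimizer after a few steps, consistent with the geometric-progression convergence that the paper establishes for sharp minima under assumptions on the starting point.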

Keywords: subgradient method, sharp minimum, Lipschitz-continuous function, relatively Lipschitz-continuous function, relatively sharp minimum, phase retrieval problem.

UDC: 519.85

Received: 19.02.2023
Accepted: 23.02.2023

DOI: 10.20537/2076-7633-2023-15-2-393-412



© Steklov Math. Inst. of RAS, 2024