
Computer Research and Modeling, 2023, Volume 15, Issue 2, Pages 413–432 (Mi crm1068)

MATHEMATICAL MODELING AND NUMERICAL SIMULATION

Analogues of the relative strong convexity condition for relatively smooth problems and adaptive gradient-type methods

F. S. Stonyakin^{a,b}, O. S. Savchuk^{a,b}, I. V. Baran^{b}, M. S. Alkousa^{a,c}, A. A. Titov^{a}

a Moscow Institute of Physics and Technology, 9 Institutskiy per., Dolgoprudny, Moscow region, 141701, Russia
b V. I. Vernadsky Crimean Federal University, 4 Academician Vernadsky Avenue, Simferopol, Republic of Crimea, 295007, Russia
c HSE University, 20 Myasnitskaya st., Moscow, 101000, Russia

Abstract: This paper is devoted to several ways of improving the convergence rate guarantees of gradient-type algorithms for relatively smooth and relatively Lipschitz-continuous problems in the case of additional information about analogues of the strong convexity of the objective function. We consider two classes of problems: convex problems with a relative functional growth condition, and (generally non-convex) problems with an analogue of the Polyak – Łojasiewicz gradient dominance condition with respect to the Bregman divergence. For the first class of problems, we propose two restart schemes for gradient-type methods and justify theoretical convergence estimates for two algorithms with adaptively chosen parameters corresponding to the relative smoothness or relative Lipschitz continuity of the objective function. The first of these algorithms has a simpler stopping criterion at each iteration, but its near-optimal computational guarantees are justified only on the class of relatively Lipschitz-continuous problems. The restart procedure of the other algorithm, in turn, allowed us to obtain more universal theoretical results: we proved a near-optimal complexity estimate on the class of convex relatively Lipschitz-continuous problems with a functional growth condition, and we also obtained linear convergence rate guarantees on the class of relatively smooth problems with a functional growth condition. For the class of problems with an analogue of the gradient dominance condition with respect to the Bregman divergence, estimates of the quality of the output solution were obtained using adaptively selected parameters. At the conclusion of the paper, we also present the results of computational experiments illustrating the performance of the methods for the second approach. As examples, we consider a linear inverse Poisson problem (minimizing the Kullback – Leibler divergence), its regularized version, which guarantees relative strong convexity of the objective function, as well as an example of a relatively smooth and relatively strongly convex problem. In particular, the calculations show that a relatively strongly convex function may not satisfy the relative variant of the gradient dominance condition.
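For reference, the conditions named in the abstract admit the following standard formulations from the relative-smoothness literature; this is only a sketch of the usual definitions, and the exact statements and constants used in the paper may differ. With a convex differentiable prox function $h$, the Bregman divergence is
$$V(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle,$$
relative smoothness of $f$ with constant $L > 0$ means
$$f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + L\, V(y, x) \quad \text{for all } x, y,$$
relative strong convexity with constant $\mu > 0$ means
$$f(y) \ge f(x) + \langle \nabla f(x),\, y - x \rangle + \mu\, V(y, x) \quad \text{for all } x, y,$$
and one common form of the relative functional growth condition is $f(x) - f^{*} \ge \mu\, V(x_{*}, x)$, where $x_{*}$ denotes an exact solution.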

Keywords: relative strong convexity, relative smoothness, relative functional growth, adaptive method, restarts.

UDC: 519.85

Received: 19.02.2023
Accepted: 23.02.2023

DOI: 10.20537/2076-7633-2023-15-2-413-432


