
Diskretn. Anal. Issled. Oper., 2022, Volume 29, Issue 3, Pages 24–44 (Mi da1301)

Optimization of subgradient method parameters based on rank-two correction of metric matrices

V. N. Krutikov^a, P. S. Stanimirović^b, O. N. Indenko^a, E. M. Tovbis^c, L. A. Kazakovtsev^c

^a Kemerovo State University, 6 Krasnaya Street, 650043 Kemerovo, Russia
^b Faculty of Sciences and Mathematics, University of Niš, 33 Višegradska Street, 18000 Niš, Serbia
^c Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskiy Rabochiy Avenue, 660031 Krasnoyarsk, Russia

Abstract: We propose a relaxation subgradient method (RSM) with parameter optimization based on rank-two corrections of the metric matrices, whose structure is analogous to that of quasi-Newton (QN) methods. The metric matrix transformation suppresses the components orthogonal to the minimal-length subgradient vector and amplifies the components collinear with it. The problem of constructing a metric matrix is formulated as the problem of solving a system of inequalities, and solving such a system rests on a new learning algorithm, for which an estimate of the convergence rate is obtained in terms of the parameters of the subgradient set. On this basis, a new RSM is developed and studied. Computational experiments on complex large-scale functions confirm the effectiveness of the proposed algorithm. Tab. 4, bibliogr. 32.
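To make the described transformation concrete, below is a minimal numerical sketch, not the authors' actual update formula: it assumes a symmetric sandwich update H ← T H T with illustrative gains alpha > 1 (amplification of the component collinear with the subgradient g) and beta < 1 (suppression of orthogonal components). Under this assumption the correction T H T − beta·H has rank at most two (it is spanned by g and H g), which is consistent with the rank-two structure named in the abstract; the function name and parameter values are hypothetical.

```python
import numpy as np

def rank_two_metric_update(H, g, alpha=4.0, beta=0.5):
    """Illustrative rank-two correction of a metric matrix H (an assumed
    form, not the paper's exact formula): amplify the quadratic form along
    the minimal-length subgradient g by alpha, suppress it by beta on the
    orthogonal complement."""
    g = g / np.linalg.norm(g)            # unit vector along the subgradient
    P = np.outer(g, g)                   # orthogonal projector onto span{g}
    # T g = sqrt(alpha) g, and T x = sqrt(beta) x for any x orthogonal to g
    T = np.sqrt(beta) * (np.eye(len(g)) - P) + np.sqrt(alpha) * P
    # The sandwich form keeps H symmetric positive definite, and
    # T H T - beta * H has rank <= 2 (spanned by g and H g).
    return T @ H @ T

# Quick check of the two defining properties on a random SPD metric.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
H = A @ A.T + 5 * np.eye(5)              # symmetric positive definite
g = rng.standard_normal(5)
H_new = rank_two_metric_update(H, g)
print(g @ H_new @ g / (g @ H @ g))       # ~4.0: collinear form amplified
x = rng.standard_normal(5)
x -= (x @ g) / (g @ g) * g               # make x orthogonal to g
print(x @ H_new @ x / (x @ H @ x))       # ~0.5: orthogonal form suppressed
print(np.linalg.matrix_rank(H_new - 0.5 * H))  # 2: rank-two correction
```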

Keywords: convex optimization, nonsmooth optimization, relaxation subgradient method.

UDC: 519.8

Received: 10.05.2022
Revised: 10.05.2022
Accepted: 12.05.2022

DOI: 10.33048/daio.2022.29.739
