
Vestn. Tomsk. Gos. Univ. Mat. Mekh., 2014 Number 3(29), Pages 5–19 (Mi vtgu388)


MATHEMATICS

The subgradient multistep minimization method for nonsmooth high-dimensional problems

V. N. Krutikov, Ya. N. Vershinin

Kemerovo State University, Kemerovo, Russian Federation

Abstract: In this paper, a new multistep relaxation subgradient minimization method is proposed. It is based on the organizational principles of “conjugate subgradient” methods. The presented method belongs to the class of relaxation methods of the $\varepsilon$-subgradient type (RSM) and is intended for solving nonsmooth high-dimensional problems.
The space-dilation RSMs currently available are comparable in convergence rate with quasi-Newton methods on smooth functions and are efficient for solving nonsmooth ravine-type problems. On high-dimensional problems, however, their effectiveness decreases because the metric matrix must be stored and transformed. In the smooth case, the conjugate gradient method replaces quasi-Newton methods on high-dimensional problems. Existing multistep RSMs are significantly inferior to subgradient space-dilation methods in convergence rate and tend to loop on ravine-type nonsmooth optimization problems; for this reason, in practice they are not applied even to problems of small dimension. These circumstances make the development of effective multistep RSMs important. In the proposed relaxation subgradient method, additional learning relations are used at each iteration to improve the efficiency of the learning algorithm of a known method based on extending the Kaczmarz algorithm to systems of inequalities. This innovation expands the range of solvable nonsmooth optimization problems and increases the convergence rate on both smooth and nonsmooth minimization problems.
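The extension of the Kaczmarz algorithm to inequality systems referred to above can be illustrated by a minimal sketch of the classical relaxation scheme of Agmon, Motzkin, and Schoenberg: the iterate is cyclically projected onto each violated half-space. The function name and test data are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def kaczmarz_inequalities(A, b, x0=None, n_sweeps=100, tol=1e-10):
    """Kaczmarz-style relaxation for the system A x <= b.

    Cyclically projects the current iterate onto the hyperplane of each
    violated inequality (generic sketch, not the paper's learning algorithm).
    """
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(n_sweeps):
        changed = False
        for i in range(m):
            r = A[i] @ x - b[i]  # positive residual => constraint violated
            if r > tol:
                # orthogonal projection onto {y : A[i] @ y = b[i]}
                x -= (r / (A[i] @ A[i])) * A[i]
                changed = True
        if not changed:  # all inequalities satisfied: a feasible point
            break
    return x

# hypothetical feasible system: x <= 1, y <= 1, x + y >= -0.5
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
b = np.array([1.0, 1.0, 0.5])
x = kaczmarz_inequalities(A, b, x0=np.array([3.0, 3.0]))
assert np.all(A @ x <= b + 1e-8)
```

For a consistent system the scheme converges to a feasible point; the paper's method augments this type of iteration with additional learning relations.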
Numerical results indicate that the minimization method becomes more efficient when the learning vectors in the inequality-solving algorithm are orthogonalized; this is particularly evident on high-dimensional nonsmooth problems with strongly elongated level surfaces. In terms of convergence on high-dimensional quadratic functions with a large spread of eigenvalues, the developed algorithm is superior to existing multistep relaxation subgradient methods and is comparable in effectiveness to the conjugate gradient method.
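The orthogonalization of learning vectors mentioned above can be sketched, for illustration only, with a generic modified Gram–Schmidt procedure; this is an assumption about the flavor of orthogonalization involved, not the paper's exact scheme:

```python
import numpy as np

def orthogonalize(vectors, tol=1e-12):
    """Modified Gram–Schmidt: orthonormal basis for span(vectors).

    Illustrative stand-in for orthogonalizing learning vectors
    (hypothetical helper, not the algorithm from the paper).
    """
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()
        for q in basis:
            w -= (q @ w) * q          # remove the component along q
        norm = np.linalg.norm(w)
        if norm > tol:                 # drop (near-)linearly-dependent vectors
            basis.append(w / norm)
    return np.array(basis)

g = [np.array([2.0, 0.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),
     np.array([3.0, 1.0, 0.0])]      # third vector depends on the first two
Q = orthogonalize(g)
assert Q.shape == (2, 3)              # dependent vector was discarded
assert np.allclose(Q @ Q.T, np.eye(2))
```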

Keywords: Kaczmarz algorithm, multistep algorithm, minimization method, convergence rate.

UDC: 519.6

Received: 13.04.2013
