
All-Russian Seminar on Optimization named after B. T. Polyak
December 1, 2021, 17:30, Moscow, online


Generalized Newton Methods via Variational Analysis

B. Sh. Mordukhovich


https://youtu.be/bBtpzf-mN78

Abstract: In this talk we present two novel globally convergent Newton-type methods for unconstrained and constrained problems of nonsmooth optimization, developed by using tools of variational analysis and generalized differentiation. Both methods are coderivative-based and employ generalized Hessians (coderivatives of subgradient mappings) associated with problems of convex composite optimization, where one of the terms may be extended-real-valued. The proposed globally convergent algorithms are of two types. The first extends the damped Newton method and requires positive-definiteness of the generalized Hessians for its well-posedness and efficient performance, while the second is of Levenberg-Marquardt type and remains well-defined when the generalized Hessians are merely positive-semidefinite. The obtained convergence rates for both methods are at least linear and become superlinear under the so-called semismooth* property of subgradient mappings. Problems of convex composite optimization are investigated with and without the strong convexity assumption on the smooth parts of the objective functions by implementing the machinery of forward-backward envelopes. Numerical experiments are conducted for a basic class of Lasso problems, with performance comparisons of the new algorithms against some other first-order and second-order methods that are highly recognized in nonsmooth optimization.
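The forward-backward envelope referred to in the abstract is presumably the standard construction of Patrinos and Themelis: for a composite objective $\varphi = f + g$ with $f$ smooth and $g$ proper, lower semicontinuous, and convex, and a parameter $\gamma > 0$, it is the value function

$$
\varphi_\gamma(x) \;=\; \min_{y}\Big\{ f(x) + \langle \nabla f(x),\, y - x\rangle + g(y) + \tfrac{1}{2\gamma}\|y - x\|^{2} \Big\}
\;=\; f(x) - \tfrac{\gamma}{2}\|\nabla f(x)\|^{2} + e_{\gamma}g\big(x - \gamma\nabla f(x)\big),
$$

where $e_{\gamma}g$ denotes the Moreau envelope of $g$ (the second expression follows by completing the square in $y$). For suitably small $\gamma$ the envelope $\varphi_\gamma$ is real-valued and shares its minimizers with $\varphi$, which is what makes it a convenient surrogate on which Newton-type schemes can operate.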

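As a concrete illustration of the Levenberg-Marquardt idea on the Lasso class mentioned in the abstract, the sketch below applies a Levenberg-Marquardt-regularized semismooth Newton iteration to the proximal-gradient fixed-point residual of min 0.5*||Ax - b||^2 + lam*||x||_1. This is a generic scheme of the same flavor, not the coderivative-based algorithms of the talk; the function names, the parameters mu and gamma, and the residual-based safeguard are illustrative choices.

```python
import numpy as np

def soft_threshold(u, t):
    # Proximal operator of t*||.||_1 (soft-thresholding).
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def lm_newton_lasso(A, b, lam, mu=1e-4, tol=1e-9, max_iter=100):
    # Levenberg-Marquardt-regularized semismooth Newton method on the residual
    # R(x) = x - prox_{gamma*lam*||.||_1}(x - gamma*grad f(x)) of the Lasso
    # problem; illustrative sketch, not the algorithms presented in the talk.
    m, n = A.shape
    AtA, Atb = A.T @ A, A.T @ b
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = ||A||_2^2
    x = np.zeros(n)
    for _ in range(max_iter):
        u = x - gamma * (AtA @ x - Atb)        # forward (gradient) step
        px = soft_threshold(u, gamma * lam)    # backward (proximal) step
        r = x - px                             # fixed-point residual R(x)
        if np.linalg.norm(r) <= tol:
            break
        # One element of the Clarke generalized Jacobian of R at x:
        #   J = I - P (I - gamma*AtA),  P = diag(1{|u_i| > gamma*lam}).
        p = (np.abs(u) > gamma * lam).astype(float)
        J = np.eye(n) - p[:, None] * (np.eye(n) - gamma * AtA)
        # Levenberg-Marquardt system: J^T J + mu*I is positive-definite even
        # when J^T J is merely positive-semidefinite, so the step always exists.
        d = np.linalg.solve(J.T @ J + mu * np.eye(n), -J.T @ r)
        # Safeguard: accept the Newton step only if it reduces the residual,
        # otherwise fall back to the plain proximal-gradient iterate.
        x_trial = x + d
        u_t = x_trial - gamma * (AtA @ x_trial - Atb)
        r_t = x_trial - soft_threshold(u_t, gamma * lam)
        x = x_trial if np.linalg.norm(r_t) < np.linalg.norm(r) else px
    return x

# Usage on synthetic data: recover a sparse vector from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = lm_newton_lasso(A, b, lam=0.1)
print("nonzeros in solution:", int(np.sum(np.abs(x_hat) > 1e-8)))
```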
