
PreMoLab Seminar
February 5, 2014 17:00, Moscow, A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences (Bol'shoi Karetnyi per., 19), room 615


Convergent subgradient methods for nonsmooth convex minimization

Yu. E. Nesterov

Université Catholique de Louvain


http://www.youtube.com/watch?v=YucrsxdwgTY

Abstract: In this talk, we present new subgradient methods for solving nonsmooth convex optimization problems. These methods are the first for which the whole sequence of test points is endowed with worst-case performance guarantees. The methods are derived from a relaxed estimating-sequences condition and ensure reconstruction of an approximate primal-dual optimal solution. They are applicable as efficient real-time stabilization tools for potential systems with infinite horizon. As an example, we consider a model of privacy-respecting taxation, in which the center has no information on the utility functions of the agents. Nevertheless, by a proper taxation policy, the agents can be forced to apply, on average, the socially optimal strategies. Preliminary numerical experiments confirm the high efficiency of the new methods.
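
For readers unfamiliar with the general technique, the sketch below illustrates a classical projected subgradient method with iterate averaging, not the specific methods of the talk (which rely on a relaxed estimating-sequences condition). The problem instance (an l1 residual minimized over a Euclidean ball), the step-size rule, and the names project_ball and subgradient_method are illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius):
    """Euclidean projection onto the ball ||x|| <= radius."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else (radius / nrm) * x

def subgradient_method(A, b, radius, n_iters=500):
    """Minimize f(x) = ||A x - b||_1 over the ball ||x|| <= radius
    with a classical projected subgradient scheme (illustration only)."""
    m, n = A.shape
    x = np.zeros(n)
    x_avg = np.zeros(n)
    L = np.linalg.norm(A, ord=2) * np.sqrt(m)  # crude bound on subgradient norms
    for k in range(n_iters):
        g = A.T @ np.sign(A @ x - b)           # a subgradient of f at x
        h = radius / (L * np.sqrt(k + 1))      # diminishing step size
        x = project_ball(x - h * g, radius)
        x_avg += (x - x_avg) / (k + 1)         # running average of the iterates
    return x_avg                               # averaged point, a simple illustrative output

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    x_hat = subgradient_method(A, b, radius=10.0)
    print("f(x_hat) =", np.abs(A @ x_hat - b).sum())
```

In classical analyses, worst-case bounds are proved for averaged or best-so-far iterates rather than for the last point; the talk's contribution is to equip the whole sequence of test points with such guarantees.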

