Abstract:
In this talk we consider a primal-dual method for solving nonsmooth optimization problems with functional constraints. The method alternates updates of the primal and dual variables, so that one of these updates can be viewed as a coordinate descent step. Nevertheless, the method attains the best possible performance guarantees. We show that it can be applied to sparse problems of very large size, ensuring logarithmic dependence of the iteration complexity on the problem's dimension.
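To illustrate the general idea of alternating primal and dual updates (not the speaker's specific method, whose dual update is coordinate-wise and whose guarantees are stronger), here is a minimal sketch of a generic primal-dual subgradient scheme for min_x f(x) subject to g(x) <= 0; all names and step-size choices below (primal_dual_subgradient, f_sub, g_sub, step) are illustrative assumptions.

```python
import numpy as np

def primal_dual_subgradient(f_sub, g, g_sub, x0, n_iters=1000, step=1e-2):
    """Generic alternating primal-dual subgradient sketch for
    min_x f(x) s.t. g(x) <= 0, using the Lagrangian L(x, lam) = f(x) + lam * g(x)."""
    x = x0.copy()
    lam = 0.0                      # dual multiplier for the constraint g(x) <= 0
    x_avg = np.zeros_like(x0)      # averaged primal iterate, a typical output in such schemes
    for _ in range(n_iters):
        # Primal update: subgradient step on the Lagrangian in x
        x = x - step * (f_sub(x) + lam * g_sub(x))
        # Dual update: projected (onto lam >= 0) ascent step on the Lagrangian in lam
        lam = max(0.0, lam + step * g(x))
        x_avg += x
    return x_avg / n_iters, lam
```

In the method discussed in the talk, the dual update is performed over a single (block of) coordinate(s) at a time, which is what makes the scheme resemble coordinate descent while retaining optimal complexity bounds.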