Abstract:
This paper is devoted to a particular application of universal accelerated proximal methods: the construction of computationally efficient accelerated versions of methods for solving optimization problems in various specific settings. A proximally accelerated componentwise gradient method with efficient algorithmic complexity per iteration is proposed, which effectively exploits the sparsity of the problem. As an example, the proposed approach is applied to the optimization of a function of SoftMax form. For this problem, the method weakens the dependence of the computational complexity of the solution on the problem size $n$ by a factor of $\mathcal{O}(\sqrt{n})$ and, in practice, demonstrates faster convergence than conventional methods.
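For reference, the SoftMax-type objective mentioned above is commonly taken to be the smoothed maximum; the following is a standard form from the literature (the matrix $A$ and smoothing parameter $\gamma$ are assumptions here, since the abstract does not specify them):

```latex
f(x) \;=\; \gamma \ln\left( \sum_{k=1}^{m} \exp\!\left( \frac{[Ax]_k}{\gamma} \right) \right),
\qquad A \in \mathbb{R}^{m \times n},\ \gamma > 0.
```

This function uniformly approximates $\max_{1 \le k \le m} [Ax]_k$ to within $\gamma \ln m$, and its sparsity (inherited from $A$) is the structure that a componentwise gradient method can exploit.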