Abstract:
In this paper, we propose a fast iterative stochastic gradient method for convex optimization problems, referred to as RFISTA-SVRG-TR. We incorporate a restarting fast-iteration mechanism into the inner loop, which accelerates the convergence of the algorithm. Furthermore, to shorten the oscillation period and enhance the stability of the algorithm, each new iterate is generated by solving a trust-region subproblem. Under the strong convexity assumption, we prove linear convergence and establish the overall complexity of the algorithm. Numerical experiments demonstrate the superiority of the algorithm with suitably chosen parameters.
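To illustrate how the three ingredients named above can fit together, the following is a minimal sketch, not the authors' exact RFISTA-SVRG-TR: it combines the SVRG variance-reduced gradient estimator, a FISTA-style momentum sequence with a function-value restart in the inner loop, and a step obtained from a simple trust-region subproblem with the quadratic model m(s) = gᵀs + ‖s‖²/(2η), whose solution is the gradient step clipped to the trust-region radius. The test problem (average least squares), the step size `eta`, the radius `Delta`, and the loop lengths are all illustrative assumptions.

```python
import numpy as np

# Illustrative strongly convex problem: average least-squares loss
# f(x) = (1/n) * sum_i (1/2) (a_i^T x - b_i)^2 with an exact solution x_true.
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def f(x):
    r = A @ x - b
    return 0.5 * np.mean(r ** 2)

def grad_i(x, i):
    # Gradient of the i-th component function.
    return A[i] * (A[i] @ x - b[i])

def tr_step(g, eta, Delta):
    # Solve the trust-region subproblem min_s g^T s + ||s||^2 / (2*eta)
    # subject to ||s|| <= Delta: the unconstrained minimizer -eta*g,
    # projected onto the ball of radius Delta when it falls outside.
    s = -eta * g
    ns = np.linalg.norm(s)
    return s if ns <= Delta else s * (Delta / ns)

def rfista_svrg_tr(eta=0.05, Delta=1.0, outer=30, inner=2 * n):
    x_snap = np.zeros(d)
    for _ in range(outer):
        mu = A.T @ (A @ x_snap - b) / n          # full gradient at the snapshot
        x = x_snap.copy()
        y = x_snap.copy()
        t = 1.0
        for _ in range(inner):
            i = rng.integers(n)
            # SVRG variance-reduced gradient estimator at the momentum point y.
            g = grad_i(y, i) - grad_i(x_snap, i) + mu
            x_new = y + tr_step(g, eta, Delta)
            # FISTA momentum update.
            t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
            y = x_new + ((t - 1) / t_new) * (x_new - x)
            # Function-value restart: drop momentum when the objective rises.
            if f(x_new) > f(x):
                y, t_new = x_new.copy(), 1.0
            x, t = x_new, t_new
        x_snap = x
    return x_snap

x_hat = rfista_svrg_tr()
```

A call such as `rfista_svrg_tr()` on the synthetic problem above drives the objective close to its minimum of zero; the restart test and the trust-region radius are the two knobs the abstract attributes the reduced oscillation and improved stability to.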