
Keldysh Institute preprints, 2018, 164, 26 pp. (Mi ipmp2523)

How to optimize preconditioners for the conjugate gradient method: a stochastic approach

I. V. Oseledets, M. A. Botchev, A. M. Katrutsa, G. V. Ovchinnikov


Abstract: The conjugate gradient (CG) method is usually used with a preconditioner, which improves the efficiency and robustness of the method. Many preconditioners depend on parameters, and a proper choice of a preconditioner and its parameter values is often a nontrivial task. Although many convergence estimates exist that can be used to optimize preconditioners, they typically hold for all initial guess vectors and therefore reflect the worst-case convergence rate. To account for the mean convergence rate instead, in this paper we follow a simple stochastic approach. It is based on trial runs with random initial guess vectors and leads to a functional that can be used to monitor convergence and to optimize preconditioner parameters in CG. The numerical experiments presented show that optimizing this new functional usually yields a better parameter value than optimizing the functional based on the spectral condition number.
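The following Python sketch illustrates the stochastic idea described in the abstract, under stated assumptions: it uses an SSOR(omega) preconditioner as a simple parametrized stand-in for the relaxed incomplete Cholesky preconditioner studied in the paper, and the mean CG iteration count over random initial guesses as a simplified proxy for the paper's functional. All names here (laplacian_2d, ssor_preconditioner, mean_cg_iterations) are illustrative, not taken from the paper.

```python
# Sketch: pick a preconditioner parameter by averaging CG cost over
# random initial guesses (a proxy for the paper's stochastic functional).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def laplacian_2d(n):
    """Standard 5-point Laplacian on an n-by-n grid (SPD test matrix)."""
    T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
    I = sp.identity(n)
    return (sp.kron(I, T) + sp.kron(T, I)).tocsr()

def ssor_preconditioner(A, omega):
    """LinearOperator applying M^{-1} for the SSOR(omega) preconditioner
    M = (1/(omega*(2-omega))) (D/omega + L) (D/omega)^{-1} (D/omega + L)^T."""
    D = sp.diags(A.diagonal())
    L = sp.tril(A, k=-1)
    lower = (D / omega + L).tocsr()   # D/omega + L
    upper = lower.T.tocsr()           # (D/omega + L)^T
    d = A.diagonal() / omega
    scale = omega * (2.0 - omega)
    def apply(r):
        y = spla.spsolve_triangular(lower, r, lower=True)
        y = scale * d * y
        return spla.spsolve_triangular(upper, y, lower=False)
    n = A.shape[0]
    return spla.LinearOperator((n, n), matvec=apply, dtype=np.float64)

def mean_cg_iterations(A, M, rng, n_trials=10, rtol=1e-8):
    """Trial runs with random initial guesses; return the mean iteration count."""
    b = np.ones(A.shape[0])
    counts = []
    for _ in range(n_trials):
        x0 = rng.standard_normal(A.shape[0])
        it = 0
        def cb(xk):
            nonlocal it
            it += 1
        # NOTE: older SciPy versions spell the tolerance argument `tol`.
        spla.cg(A, b, x0=x0, rtol=rtol, maxiter=5000, M=M, callback=cb)
        counts.append(it)
    return float(np.mean(counts))

if __name__ == "__main__":
    A = laplacian_2d(40)
    rng = np.random.default_rng(0)
    # Scan the relaxation parameter and keep the value with the lowest mean cost.
    for omega in (0.8, 1.0, 1.2, 1.4, 1.6, 1.8):
        M = ssor_preconditioner(A, omega)
        print(f"omega = {omega:.1f}: mean CG iterations = "
              f"{mean_cg_iterations(A, M, rng):.1f}")
```

The grid search over omega could be replaced by any derivative-free optimizer; the essential point, as in the paper, is that the objective is an average over random trial runs rather than a worst-case bound such as the spectral condition number.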

Keywords: conjugate gradient method, preconditioners, condition number, eigenvalue clustering, relaxed incomplete Cholesky preconditioner.

DOI: 10.20948/prepr-2018-164


