Abstract:
Various iterative algorithms for solving the linear equation $ax=b$ on a quantum computer operating on the principle of quantum annealing are studied. Assuming that the result produced by the computer is described by the Boltzmann distribution, conditions under which these algorithms converge are obtained, and an estimate of their convergence rate is provided. The application of this approach to algorithms that use an infinite number of qubits and to those that use a small number of qubits is considered.
Key words: adiabatic quantum computations, quantum annealing, linear equation, Boltzmann distribution, truncated normal distribution.
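The setting summarized above can be illustrated with a minimal classical sketch: an iterative scheme for $ax=b$ in which each step's result is modeled as a sample from a Boltzmann distribution over a discretized search window, which is then recentered and shrunk. All function names, the discretization, and the numerical parameters below are illustrative assumptions, not the paper's actual algorithms.

```python
import math
import random

def boltzmann_sample(xs, energies, temperature, rng):
    """Draw a candidate x with probability proportional to exp(-E/T),
    mimicking the assumption that the annealer's output follows a
    Boltzmann distribution over the energy landscape."""
    e_min = min(energies)  # shift energies to avoid overflow in exp
    weights = [math.exp(-(e - e_min) / temperature) for e in energies]
    return rng.choices(xs, weights=weights, k=1)[0]

def solve_ax_eq_b(a, b, center=0.0, width=4.0, n_points=33,
                  temperature=0.01, iterations=20, seed=0):
    """Iterative sketch: sample from a Boltzmann distribution over the
    residual energy E(x) = (a*x - b)^2 on a finite grid, then recenter
    and halve the search window around the sampled point."""
    rng = random.Random(seed)
    for _ in range(iterations):
        xs = [center - width / 2 + width * i / (n_points - 1)
              for i in range(n_points)]
        energies = [(a * x - b) ** 2 for x in xs]
        center = boltzmann_sample(xs, energies, temperature, rng)
        width /= 2  # finer discretization around the current estimate
    return center

print(solve_ax_eq_b(2.0, 3.0))  # expected to approach b/a = 1.5
```

The halving of the window plays the role of the geometric convergence rate discussed in the abstract: at low temperature the sample concentrates near the energy minimum, so the estimate contracts toward $b/a$ at each step.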