Abstract:
Mysovskikh's theorem on Newton's method for solving a nonlinear equation in a Banach space, which uses an estimate of the initial approximation error, imposes a stronger restriction on a certain characteristic parameter than Mysovskikh's theorem on the simplified Newton method. Since the latter method uses less information at each step than the basic one, namely the value of the derivative of the considered function at the initial approximation instead of its value at each current approximation, the two theorems form a paradox. It was not clear whether this was in the nature of things or whether the first theorem was simply not strong enough. It turns out that in the scalar case the restriction on the characteristic parameter sufficient for convergence can be weakened so that the paradox disappears. It is also shown that the new restriction cannot be weakened further. The results hold both for the original assumption of the theorem and for its refined version, in which the maximum value of the second derivative of the considered function is replaced by the Lipschitz constant of the first derivative. Bibliogr. 3.
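The distinction between the two iterations discussed above can be illustrated in the scalar case by the following sketch (not from the paper; the test function and starting point are illustrative choices): the basic method re-evaluates the derivative at every iterate, while the simplified method freezes it at the initial approximation.

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Basic Newton iteration: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)   # derivative at the CURRENT iterate
        x -= step
        if abs(step) < tol:
            break
    return x

def simplified_newton(f, df, x0, tol=1e-12, max_iter=100):
    """Simplified Newton iteration: x_{k+1} = x_k - f(x_k)/f'(x_0)."""
    d0 = df(x0)               # derivative frozen at the INITIAL point
    x = x0
    for _ in range(max_iter):
        step = f(x) / d0
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative example: solve x^2 - 2 = 0 starting from x0 = 1.5.
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
print(newton(f, df, 1.5))             # converges quadratically
print(simplified_newton(f, df, 1.5))  # converges linearly here
```

Both variants reach the root of this particular equation; the simplified method pays for using less information per step with a slower (linear) rate, which is why a convergence theorem granting it a weaker restriction than the basic method constitutes the paradox.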