Abstract: We consider the problem of minimizing a strongly convex sum of a finite number of convex functions. Standard algorithms for this problem in the class of incremental/stochastic methods achieve at most a linear rate of convergence. We propose a new incremental method, the Newton-type incremental method (NIM), whose rate of convergence is superlinear. The idea of the method is to introduce an overall quadratic model of the objective with the same sum-of-functions structure and to update a single component of this model per iteration. We prove that NIM has a superlinear local rate of convergence and a linear global rate of convergence. Experiments show that the method is very effective for problems with a large number of component functions and a small number of variables.
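The model update described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: each component f_i keeps a quadratic model built at a stored point z_i, the next iterate minimizes the aggregate model, and one component model is refreshed per iteration. The cyclic component order, the regularized logistic-loss example, and the absence of line search or damping are assumptions made for the sketch.

```python
import numpy as np

def nim(grad_i, hess_i, x0, n, iters=200):
    """Sketch of a Newton-type incremental method (NIM).

    Minimizes sum_i f_i(x) using per-component quadratic models
    built at stored points z_i; one model is refreshed per iteration.
    grad_i(i, x) and hess_i(i, x) return the gradient/Hessian of f_i at x.
    """
    d = len(x0)
    z = np.tile(x0, (n, 1))                              # stored points z_i
    g = np.array([grad_i(i, x0) for i in range(n)])      # stored gradients
    H = np.array([hess_i(i, x0) for i in range(n)])      # stored Hessians
    # Aggregate model: sum_i [g_i^T (x - z_i) + 0.5 (x - z_i)^T H_i (x - z_i)],
    # minimized by solving (sum_i H_i) x = sum_i (H_i z_i - g_i).
    H_sum = H.sum(axis=0)
    u = (H @ z[..., None]).squeeze(-1).sum(axis=0) - g.sum(axis=0)
    x = x0.copy()
    for t in range(iters):
        x = np.linalg.solve(H_sum, u)   # minimizer of the aggregate model
        i = t % n                       # cyclic component choice (an assumption)
        # Remove component i's old contribution, refresh its model at x.
        H_sum -= H[i]
        u -= H[i] @ z[i] - g[i]
        z[i], g[i], H[i] = x, grad_i(i, x), hess_i(i, x)
        H_sum += H[i]
        u += H[i] @ z[i] - g[i]
    return x
```

As a usage example, one can run the sketch on a small l2-regularized logistic regression, where each f_i is one sample's loss plus its share of the regularizer; the per-component Hessians make the sum strongly convex, matching the setting of the abstract.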