
Bulletin of Irkutsk State University. Series Mathematics, 2023 Volume 43, Pages 110–121 (Mi iigum519)

Algebraic and logical methods in computer science and artificial intelligence

On the properties of bias-variance decomposition for kNN regression

Victor M. Nedel'ko

Sobolev Institute of Mathematics SB RAS, Novosibirsk, Russian Federation

Abstract: When choosing the optimal complexity of a method for constructing decision functions, the decomposition of the quality criterion into bias and variance components is an important tool.
It is generally assumed (and in practice this is most often true) that as the complexity of the method increases, the bias component monotonically decreases while the variance component increases. The research presented here shows that in some cases this behavior is violated.
In this paper, we obtain an expression for the variance component of the kNN method in the linear regression problem, in the setting where the "explanatory" features are random variables. In contrast to the well-known result obtained for non-random "explanatory" variables, in the considered case the variance may increase as $k$ grows.
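The decomposition discussed in the abstract can also be estimated empirically. Below is a minimal Monte-Carlo sketch (not the paper's analytical derivation; the regression function, noise level, sample sizes, and all function names are illustrative assumptions) that estimates the bias and variance components of kNN regression when the explanatory feature is itself random:

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_predict(x_train, y_train, x_test, k):
    # 1-D kNN regression: average the targets of the k nearest neighbours.
    d = np.abs(x_train[:, None] - x_test[None, :])   # (n_train, n_test) distances
    idx = np.argsort(d, axis=0)[:k]                  # indices of k nearest per test point
    return y_train[idx].mean(axis=0)

def bias_variance(k, n_train=50, n_reps=500, sigma=0.3):
    # Illustrative setup: true regression function f(x) = 2x,
    # explanatory feature x drawn at random for every training sample.
    x_test = np.linspace(-1.0, 1.0, 20)
    f_test = 2.0 * x_test
    preds = np.empty((n_reps, x_test.size))
    for r in range(n_reps):
        x = rng.uniform(-1.0, 1.0, n_train)          # random "explanatory" feature
        y = 2.0 * x + rng.normal(0.0, sigma, n_train)
        preds[r] = knn_predict(x, y, x_test, k)
    # Average squared bias and variance of the predictor over the test grid.
    bias2 = ((preds.mean(axis=0) - f_test) ** 2).mean()
    var = preds.var(axis=0).mean()
    return bias2, var

for k in (1, 5, 15, 40):
    b2, v = bias_variance(k)
    print(f"k={k:2d}  bias^2={b2:.4f}  variance={v:.4f}")
```

Whether the variance component actually grows with $k$ in such a simulation depends on the distribution of the features and the sample size; the sketch only shows how the two components can be measured, not the paper's specific result.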

Keywords: bias-variance decomposition, machine learning, $k$-nearest neighbors algorithm, overfitting.

UDC: 519.246

MSC: 68T10, 62H30

Received: 05.12.2022
Revised: 16.01.2023
Accepted: 23.01.2023

Language: English

DOI: 10.26516/1997-7670.2023.43.110



© Steklov Math. Inst. of RAS, 2024