Abstract:
The paper considers the divergent decision forest method, which aims to achieve greater divergence in the prediction space than the standard random decision forest. At each step a new tree $T_x$ is added to the ensemble, constructed by minimizing
a special functional: the difference between the squared error of $T_x$ and the squared divergence between the predictions of $T_x$ and those of the current ensemble. The method develops similar previously proposed techniques intended for predicting numerical variables.
The paper presents the results of applying the divergent decision forest method to classification problems that arise in building recommender systems. It investigates how prediction quality depends on the tree depth and on a key parameter of
the algorithm that regulates the relative contribution of the two components of the minimized functional. The experiments show that the accuracy of the proposed method significantly exceeds that of the random decision forest and is close to that of CatBoost.
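The greedy construction described above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes depth-1 regression stumps as base trees, a bootstrap candidate pool of fixed size at each step, and a parameter `alpha` standing in for the key parameter that weighs the divergence term against the squared error.

```python
import numpy as np

def fit_stump(X, y):
    """Depth-1 regression tree: pick the single split minimizing squared error."""
    best_sse, best_split = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:
            mask = X[:, j] <= t
            if not mask.any() or mask.all():
                continue
            pred = np.where(mask, y[mask].mean(), y[~mask].mean())
            sse = ((y - pred) ** 2).sum()
            if sse < best_sse:
                best_sse, best_split = sse, (j, t, y[mask].mean(), y[~mask].mean())
    j, t, lv, rv = best_split
    return lambda Z, j=j, t=t, lv=lv, rv=rv: np.where(Z[:, j] <= t, lv, rv)

def divergent_forest(X, y, n_trees=20, alpha=0.5, n_candidates=5, seed=0):
    """Greedy ensemble: at each step, among bootstrap candidates, keep the tree
    minimizing  MSE(T) - alpha * mean((T - ensemble)^2),
    i.e. low error but high divergence from the current ensemble's predictions."""
    rng = np.random.default_rng(seed)
    trees, ens = [], np.zeros(len(y))
    for _ in range(n_trees):
        candidates = []
        for _ in range(n_candidates):  # assumed pool size, for illustration
            idx = rng.integers(0, len(y), len(y))  # bootstrap sample
            tree = fit_stump(X[idx], y[idx])
            p = tree(X)
            mse = ((y - p) ** 2).mean()
            div = ((p - ens) ** 2).mean() if trees else 0.0
            candidates.append((mse - alpha * div, tree))
        _, tree = min(candidates, key=lambda c: c[0])
        trees.append(tree)
        ens = np.mean([t(X) for t in trees], axis=0)
    return lambda Z: np.mean([t(Z) for t in trees], axis=0)
```

For binary classification, the averaged tree outputs can be thresholded at 0.5; `alpha = 0` reduces the selection rule to plain error minimization, so the parameter directly regulates the contribution of the divergence component, as studied in the paper.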