
Avtomat. i Telemekh., 2022 Issue 10, Pages 67–79 (Mi at16052)

Topical issue

Gradient methods for optimizing metaparameters in the knowledge distillation problem

M. Gorpinich^a, O. Yu. Bakhteev^b, V. V. Strijov^b

a Moscow Institute of Physics and Technology, Dolgoprudnyi, Moscow oblast, 141701 Russia
b Dorodnicyn Computing Centre, Russian Academy of Sciences, Moscow, 119333 Russia

Abstract: The paper investigates the knowledge distillation problem for deep learning models. Knowledge distillation is a metaparameter optimization problem in which information from a model with a more complex structure, called the teacher model, is transferred to a model with a simpler structure, called the student model. The paper proposes a generalization of the distillation problem to the case where the metaparameters are optimized by gradient methods; the metaparameters are the parameters of the distillation optimization problem itself. The loss function of this problem is the sum of a classification term and the cross-entropy between the responses of the student model and the teacher model. Selecting optimal metaparameters for the distillation loss function is computationally expensive. The properties of the optimization problem are investigated in order to predict the metaparameter update trajectory: the trajectory of the gradient optimization of the metaparameters is analyzed, and their values are predicted with linear functions. The proposed approach is illustrated by a computational experiment on the CIFAR-10 and Fashion-MNIST datasets as well as on synthetic data.
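To make the setup concrete, below is a minimal sketch of a distillation loss of the kind described in the abstract, with the metaparameters kept as differentiable tensors so they can be updated by gradient steps. The particular parameterization (a weight lam and a temperature, and the names lam_raw, temp_raw, distillation_loss) is an illustrative assumption, not the authors' exact formulation.

```python
# Sketch only: distillation loss with gradient-optimizable metaparameters.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, lam, temperature):
    """Sum of a classification term and the student-teacher cross-entropy."""
    # Classification term: cross-entropy between student predictions and true labels.
    ce_term = F.cross_entropy(student_logits, labels)
    # Distillation term: cross-entropy between softened teacher and student responses.
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    distill_term = -(teacher_probs * student_log_probs).sum(dim=-1).mean()
    # lam in (0, 1) balances the two terms; lam and temperature are metaparameters.
    return lam * ce_term + (1.0 - lam) * distill_term

# Metaparameters as differentiable tensors (hypothetical names), so a gradient step
# on an outer (e.g., validation) objective can update them.
lam_raw = torch.zeros(1, requires_grad=True)   # sigmoid keeps lambda in (0, 1)
temp_raw = torch.zeros(1, requires_grad=True)  # softplus + 1 keeps temperature >= 1
meta_optimizer = torch.optim.SGD([lam_raw, temp_raw], lr=1e-2)

def metaparameters():
    return torch.sigmoid(lam_raw), F.softplus(temp_raw) + 1.0
```

In such a scheme the metaparameters are updated by gradients of an outer objective after the student parameters are optimized; the abstract's trajectory prediction with linear functions would then replace some of these gradient updates by extrapolating the metaparameter values along their observed trajectory.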

Keywords: machine learning, knowledge distillation, metaparameter optimization, gradient optimization, metaparameter assignment.

Presented by a member of the Editorial Board: A. A. Lazarev

Received: 17.02.2022
Revised: 23.06.2022
Accepted: 29.06.2022

DOI: 10.31857/S0005231022100075


English version: Automation and Remote Control, 2022, 83:10, 1544–1554

