
TMF, 2023 Volume 214, Number 3, Pages 517–528 (Mi tmf10418)


Machine learning of the well-known things

V. V. Dolotin^{a,b,c}, A. Yu. Morozov^{a,b,c}, A. V. Popolitov^{a,b,c}

a Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, Russia
b Alikhanov Institute for Theoretical and Experimental Physics, National Research Centre "Kurchatov Institute", Moscow, Russia
c Kharkevich Institute for Information Transmission Problems of the Russian Academy of Sciences, Moscow, Russia

Abstract: Machine learning (ML) in its current form implies that the answer to any problem can be well approximated by a function of a very peculiar form: a specially adjusted iteration of Heaviside theta-functions. It is natural to ask whether the answers to questions that we already know can be naturally represented in this form. We provide elementary and yet nonevident examples showing that this is indeed possible, and we suggest looking for a systematic reformulation of existing knowledge in an ML-consistent way. The success or failure of these attempts can shed light on a variety of problems, both scientific and epistemological.
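The abstract's premise can be illustrated by a minimal sketch (not taken from the paper): a function whose values we already know, here sin(x), represented as a linear combination of Heaviside theta-functions, i.e. the building block of a one-hidden-layer network with step activations. The thresholds and weights below are set by hand in a simple "staircase" construction rather than adjusted by training.

import numpy as np

def theta(x):
    """Heaviside theta (step) function."""
    return np.where(x >= 0.0, 1.0, 0.0)

def staircase_approx(x, f, a, b, n):
    """Approximate f on [a, b] by f(a) + sum_k w_k * theta(x - t_k).

    The t_k are n equally spaced thresholds; each weight w_k is the
    increment of f between consecutive thresholds, so the sum reproduces
    f as a piecewise-constant staircase.
    """
    t = np.linspace(a, b, n + 1)
    vals = f(t)
    w = np.diff(vals)                    # increments f(t_{k+1}) - f(t_k)
    approx = vals[0] * np.ones_like(x)   # starting level f(a)
    for tk, wk in zip(t[1:], w):
        approx += wk * theta(x - tk)
    return approx

if __name__ == "__main__":
    xs = np.linspace(0.0, 2.0 * np.pi, 1000)
    approx = staircase_approx(xs, np.sin, 0.0, 2.0 * np.pi, n=50)
    err = np.max(np.abs(approx - np.sin(xs)))
    print(f"max |sin(x) - staircase(x)| with 50 thetas: {err:.3f}")

With 50 thetas the maximal error is of order the grid spacing times max|f'|, roughly 0.13 here; refining the thresholds (or adjusting them by training, as in ML proper) reduces it further.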

Keywords: exact approaches to QFT, nonlinear algebra, machine learning, steepest descent method.

Received: 02.12.2022
Revised: 06.12.2022

DOI: 10.4213/tmf10418


 English version:
Theoretical and Mathematical Physics, 2023, 214:3, 446–455


© Steklov Math. Inst. of RAS, 2024