
Dokl. RAN. Math. Inf. Proc. Upr., 2025, Volume 527, Pages 485–494 (Mi danma703)

SPECIAL ISSUE: ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING TECHNOLOGIES

Towards foundation time series model: to synthesize or not to synthesize?

A. A. Temirkhanov (a,b), A. M. Kostromina (a,b), O. A. Tsymboi (a,c), K. A. Kuvshinova (a,d), E. Yu. Kovtun (a), D. E. Simakov (a)

a Sber AI Lab, Moscow, Russia
b National Research University Higher School of Economics, Moscow, Russia
c Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, Russia
d Skolkovo Institute of Science and Technology, Moscow, Russia

Abstract: Industrial applications are rich in cases where forecasts must be produced for a large number of time series at once, yet training a separate model for each of them may be unaffordable. This issue in time series modeling has not received due attention. A remedy for this setting is a foundation model expected to forecast in zero-shot and few-shot regimes. In this work, we consider the essential question of whether it is advantageous to train a foundation model on synthetic data or better to rely on only a limited number of real-life examples. Our experiments, conducted on regular time series only, speak in favor of leveraging real time series exclusively. Moreover, the choice of a proper source dataset strongly influences performance at inference. Even when only a limited quantity of short time series is available, employing it within a supervised framework yields more favorable results than training on a larger volume of synthetic data.

Keywords: time series forecasting, foundation models, synthetic data, zero-shot, few-shot, transfer learning, Fourier seasonality, trend modeling.
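To make the synthetic-data setting concrete, the following minimal Python/NumPy sketch illustrates one common way to generate synthetic series with a linear trend and Fourier seasonality, the components named in the keywords. The function synthetic_series, its parameters, and all numeric choices are illustrative assumptions for exposition only, not the generator used by the authors.

# Hypothetical sketch: additive synthetic series = linear trend + Fourier seasonality + noise.
# All parameter values below are illustrative assumptions.
import numpy as np

def synthetic_series(length=512, n_harmonics=3, period=24, rng=None):
    """Generate one synthetic time series as trend + Fourier seasonality + noise."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(length, dtype=float)

    # Linear trend with a randomly drawn slope and intercept.
    trend = rng.normal(0.0, 0.05) * t + rng.normal(0.0, 1.0)

    # Fourier seasonality: sum of harmonics of a base period with random amplitudes.
    season = np.zeros(length)
    for k in range(1, n_harmonics + 1):
        a, b = rng.normal(0.0, 1.0 / k, size=2)
        season += a * np.sin(2 * np.pi * k * t / period) + b * np.cos(2 * np.pi * k * t / period)

    # Additive Gaussian observation noise.
    noise = rng.normal(0.0, 0.1, size=length)
    return trend + season + noise

# Example: assemble a small synthetic pretraining corpus.
corpus = [synthetic_series(rng=np.random.default_rng(seed)) for seed in range(1000)]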

UDC: 517.54

Received: 21.08.2025
Accepted: 22.09.2025

DOI: 10.7868/S2686954325070410

© Steklov Math. Inst. of RAS, 2025