Mat. Sb., 2023, Volume 214, Number 4, Pages 38–75 (Mi sm9791)

Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs

Dinh Dũng

Information Technology Institute, Vietnam National University, Hanoi, Vietnam

Abstract: We establish the convergence rates of the collocation approximation by deep ReLU neural networks of solutions to elliptic PDEs with lognormal inputs, parametrized by $\boldsymbol{y}$ from the noncompact set ${\mathbb R}^\infty$. The approximation error is measured in the norm of the Bochner space $L_2({\mathbb R}^\infty, V, \gamma)$, where $\gamma$ is the infinite tensor-product standard Gaussian probability measure on ${\mathbb R}^\infty$ and $V$ is the energy space. We also obtain similar dimension-independent results when the lognormal inputs are parametrized by ${\mathbb R}^M$ of very large dimension $M$ and the approximation error is measured in the $\sqrt{g_M}$-weighted uniform norm of the Bochner space $L_\infty^{\sqrt{g}}({\mathbb R}^M, V)$, where $g_M$ is the density function of the standard Gaussian probability measure on ${\mathbb R}^M$.
Bibliography: 62 titles.
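
For orientation, the two error norms named in the abstract have the standard Bochner-space form displayed below; this is a reconstruction from the abstract's wording (with $u$ denoting the parametric solution), not a quotation from the paper:
$$
\|u\|_{L_2({\mathbb R}^\infty, V, \gamma)}=\biggl(\int_{{\mathbb R}^\infty}\|u(\boldsymbol{y})\|_V^2\,d\gamma(\boldsymbol{y})\biggr)^{1/2},
\qquad
\|u\|_{L_\infty^{\sqrt{g}}({\mathbb R}^M, V)}=\operatorname*{ess\,sup}_{\boldsymbol{y}\in{\mathbb R}^M}\sqrt{g_M(\boldsymbol{y})}\,\|u(\boldsymbol{y})\|_V.
$$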

Keywords: high-dimensional approximation, collocation approximation, deep ReLU neural networks, parametric elliptic PDEs, lognormal inputs.

MSC: 65C30, 65D05, 65D32, 65N15, 65N30, 65N35

Received: 09.05.2022 and 15.12.2022

DOI: 10.4213/sm9791


English version: Sbornik: Mathematics, 2023, 214:4, 479–515
