Abstract:
This paper considers an optimal control problem for a time-invariant, discrete-time linear stochastic
system with scalar unbounded control, additive noise, and a probabilistic criterion for keeping
its trajectories within a given neighborhood of zero. Using dynamic programming and
two-sided estimates of the Bellman function, we derive analytical expressions for the optimal control
over two time steps and for a suboptimal control over an arbitrary control horizon. The effectiveness
of these controls is illustrated with a numerical example.