
Trudy Inst. Mat. i Mekh. UrO RAN, 1998 Volume 5, Pages 301–318 (Mi timm482)

This article is cited in 7 papers

Mathematical theory of optimal control and differential games

Hamilton–Jacobi–Bellman equation for a nonlinear impulse control problem

A. V. Stefanova


Abstract: We consider the problem of minimizing a Bolza-type functional along trajectories of nonlinear systems of differential equations governed by impulse controls with integral constraints. A solution to such a system is defined via the closure of the set of absolutely continuous trajectories in the topology of pointwise convergence. It is shown that the value function of such a system is Lipschitz continuous and is the unique viscosity solution of a first-order partial differential equation (a Hamilton–Jacobi–Bellman equation). Boundary conditions satisfied by this solution are obtained.
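For orientation, a Hamilton–Jacobi–Bellman equation of the kind referred to in the abstract can be sketched schematically as follows. This is our illustrative notation, not taken from the paper: $V(t,x)$ denotes the value function, $f$ the system dynamics, $g$ the running cost, $\sigma$ the terminal cost of the Bolza functional, and $U$ the set of ordinary control values; the impulsive component with its integral constraint would enter through an additional gradient term or a quasi-variational inequality, the precise form of which is established in the article itself.

```latex
% Schematic (non-impulsive) HJB equation for a Bolza problem,
% in the viscosity-solution sense; all symbols are illustrative.
\begin{equation*}
  \frac{\partial V}{\partial t}(t,x)
  + \min_{u \in U}
    \Bigl[ \bigl\langle \nabla_x V(t,x),\, f(t,x,u) \bigr\rangle
           + g(t,x,u) \Bigr] = 0,
  \qquad
  V(T,x) = \sigma(x).
\end{equation*}
```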

UDC: 517.977.54, 517.518.24

MSC: 49C20, 49E15

Received: 08.09.1997





© Steklov Math. Inst. of RAS, 2024