Abstract:
We consider a zero-sum differential game on a finite interval in which the players not only control the system's trajectory but also influence the terminal time of the game. It is assumed that the early termination time is an absolutely continuous random variable whose density is determined by bounded measurable functions of time assigned by the two players (the intensities of each player's influence on the termination of the game). The payoff may depend on the terminal time of the game, on the position of the system at this time, and on the player who initiates the termination. Strategies are formalized by means of nonanticipating càdlàg processes. The existence of the game value is shown under the Isaacs condition. To this end, the original game is approximated by an auxiliary game based on a continuous-time Markov chain that depends on the players' controls and intensities. Based on strategies that are optimal in this Markov game, a control procedure with a stochastic guide is proposed for the original game. It is shown that, as the number of states in the Markov game increases without bound, this procedure yields a near-optimal strategy in the original game.