
Program Systems: Theory and Applications, 2011 Volume 2, Issue 1, Pages 63–70 (Mi ps28)

This article is cited in 1 paper

Optimization Methods and Control Theory

Sufficient conditions of optimality for optimal control problems of logic-dynamic systems

N. S. Maltugueva (a, b)

a Institute for System Dynamics and Control Theory SB RAS, Irkutsk
b A. K. Ailamazyan Program Systems Institute of the Russian Academy of Sciences, Pereslavl-Zalessky

Abstract: This article deals with logic-dynamic systems, a special class of discrete-continuous control systems. The discrete component of such a system is an integer-valued function with a finite number of discontinuity points. An optimal control problem is formulated for this class of systems. It differs from the classical optimal control problem in that the right-hand sides of the differential equations and the cost functional depend on the discrete variables. In papers by A. S. Bortakovskii, sufficient conditions of optimality were proved using the Bellman function; the author shows that the theorem remains valid for an arbitrary Krotov function. The article also describes an approach to constructing computational procedures for this problem.
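
A minimal sketch of the problem class described in the abstract, in illustrative notation (the symbols f, F, Phi, u, and v are assumptions for exposition, not the paper's own):

```latex
\begin{aligned}
&\dot x(t) = f\bigl(x(t),\, u(t),\, v(t)\bigr), \qquad t \in [t_0, T],\\[2pt]
&v(\cdot)\ \text{integer-valued, piecewise constant, with finitely many jumps},\\[2pt]
&J(u, v) = \Phi\bigl(x(T)\bigr) + \int_{t_0}^{T} F\bigl(x(t),\, u(t),\, v(t)\bigr)\,dt \;\to\; \min.
\end{aligned}
```

Here x is the continuous state and u the continuous control, while the discrete variable v enters both the dynamics f and the cost F, which is what distinguishes the problem from the classical setting.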

Key words and phrases: control systems, nonlocal improvement.

UDC: 517.977

Received: 13.02.2011
Accepted: 11.03.2011


