- Preface.- Part I: Introduction.- Subject of optimal control.- Mathematical model for controlled object.- Part II: Control of Linear Systems.- Reachability set.- Controllability of linear systems.- Minimum time problem.- Synthesis of optimal system performance.- The observability problem.- Identification problem.- Part III: Control of Nonlinear Systems.- Types of optimal control problems.- Small increments of a trajectory.- The simplest problem of optimal control.- General optimal control problem.- Problems with intermediate states.- Extremals field theory.- Sufficient optimality conditions.- Conclusion.- Appendix.- Examples of tasks and solutions.- Bibliography.
Leonid T. Ashchepkov is a Professor in the Department of Mathematics of the Institute of Mathematics and Computer Technologies at Far Eastern Federal University, Vladivostok, Russia.
Dmitriy V. Dolgy is a Professor in the Kwangwoon Glocal Education Center at Kwangwoon University, Seoul, South Korea, and in the Department of Mathematics of the Institute of Mathematics and Computer Technologies at Far Eastern Federal University, Vladivostok, Russia.
Taekyun Kim is a Professor in the Department of Mathematics, College of Natural Science, at Kwangwoon University, Seoul, South Korea.
Ravi P. Agarwal is a Professor and Chair of the Department of Mathematics at Texas A&M University-Kingsville, USA.
This textbook, now in its second edition, results from lectures, practical problems, and workshops on Optimal Control given by the authors at Irkutsk State University (Irkutsk, Russia), Far Eastern Federal University (Vladivostok, Russia), and Kwangwoon University (Seoul, South Korea).
In this work, the authors cover the theory of linear and nonlinear systems, addressing the basic problem of establishing necessary and sufficient conditions for optimal processes. Readers will find two new chapters, with results of potential interest to researchers focused on the theory of optimal control, as well as to those interested in applications in engineering and related sciences. In addition, several improvements have been made throughout the text.
This book is structured in three parts. Part I starts with a gentle introduction to the basic concepts of Optimal Control. In Part II, the theory of linear control systems is constructed on the basis of the separation theorem and the concept of a reachability set. The authors prove that the reachability set is closed in the class of piecewise continuous controls and address the problems of controllability, observability, identification, performance, and terminal control. Part III, in turn, is devoted to nonlinear control systems. Using the method of variations and the Lagrange multiplier rule for nonlinear problems, the authors prove the Pontryagin maximum principle for problems with moving endpoints of trajectories.
Problem sets at the end of each chapter, together with a list of additional tasks provided in the appendix, are offered for students seeking to master the subject. The exercises have been chosen not only to help readers assimilate the theory but also to encourage the application of that knowledge to more advanced problems.