1. Calculus of Variations.- 2. Minimum Principle.- 3. Dynamic Programming.- 4. Linear Quadratic Control.- 5. Glimpses of Related Topics.- A. Background Material.- B. Differential Equations and Lyapunov Functions.- Solutions to Odd Numbered Exercises.- Bibliography.- Index.
Gjerrit Meinsma received his graduate degree in Applied Mathematics from the University of Twente (Enschede, the Netherlands) in 1989. Four years later, at the same university, he obtained his Ph.D. degree. His postdoctoral work was carried out in the Electrical Engineering Department at the University of Newcastle. In 1997 he returned to the University of Twente, where he now holds the position of senior lecturer. His research interests are in mathematical systems and control theory, and in mathematics in general. He has written around 75 papers. Over the last ten years his focus has shifted increasingly towards teaching.
Arjan van der Schaft received his undergraduate and Ph.D. degrees in Mathematics from the University of Groningen, the Netherlands, in 1979 and 1983, respectively. In 1982 he joined the Department of Applied Mathematics, University of Twente, Enschede, where he was appointed full professor in Mathematical Systems and Control Theory in 2000. In 2005 he returned to Groningen as a full professor in Mathematics. He is a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) and a Fellow of the International Federation of Automatic Control (IFAC), and was an invited semi-plenary speaker at the International Congress of Mathematicians (2006). His research interests include mathematical systems theory, nonlinear control, geometric modeling of multi-physics systems, and hybrid systems. His present focus is on network modeling and control of physical systems and their formulation as port-Hamiltonian systems, with applications to complex engineering systems, power networks, and systems biology.
This text provides a detailed and self-contained introduction to the core topics of optimal control for finite-dimensional deterministic dynamical systems. Skillfully designed to guide the student through the development of the subject, the book provides a rich collection of examples, exercises, illustrations, and applications to support comprehension of the material. Solutions to odd-numbered exercises are included, while a complete set of solutions is available to instructors who adopt the text for their class. The book is suitable as coursework for final-year undergraduates in (applied) mathematics or beginning graduate students in engineering. The required mathematical background includes calculus, linear algebra, and a basic knowledge of differential equations, as well as a rudimentary acquaintance with control systems.
The book developed out of lecture notes that were tested, adapted, and expanded over many years of teaching. Chapters 1-4 constitute the material for a basic course on optimal control, covering successively the calculus of variations, the minimum principle, dynamic programming, and linear quadratic control. The additional Chapter 5 provides brief glimpses of a number of selected topics related to optimal control, which are meant to pique the reader's interest. Some mathematical background is summarized in Appendix A for easy review. Appendix B recalls the basics of differential equations and also provides a detailed treatment of Lyapunov stability theory, including LaSalle's invariance principle, as occasionally used in Chapters 3 and 4.