Optimal Control: Linear Quadratic Methods

This augmented edition of a respected text teaches the reader how to use linear quadratic Gaussian methods effectively for the design of control systems. It explores linear optimal control theory from an engineering viewpoint, with step-by-step explanations that show clearly how to make practical use of the material.
The three-part treatment begins with the basic theory of the linear regulator/tracker for time-invariant and time-varying systems. The Hamilton-Jacobi equation is introduced using the Principle of Optimality, and the infinite-time problem is considered. The second part outlines the engineering properties of the regulator. Topics include degree of stability, phase and gain margin, tolerance of time delay, effect of nonlinearities, asymptotic properties, and various sensitivity problems. The third section explores state estimation and robust controller design using state-estimate feedback.
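
For orientation, the regulator problem that the first part develops is the standard infinite-time linear quadratic problem; the following summary is a conventional statement of it, not a quotation from the book:

    \dot{x}(t) = A x(t) + B u(t), \qquad
    J(u) = \int_{0}^{\infty} \bigl( x^{\top} Q x + u^{\top} R u \bigr)\, dt,
    \qquad Q \succeq 0,\ R \succ 0,

    u^{*}(t) = -R^{-1} B^{\top} P\, x(t), \qquad
    A^{\top} P + P A - P B R^{-1} B^{\top} P + Q = 0,

where P is the stabilizing solution of the algebraic Riccati equation and the resulting state feedback is the optimal regulator.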
Numerous examples emphasize the issues related to consistent and accurate system design. Key topics include loop-recovery techniques, frequency shaping, and controller reduction for both scalar and multivariable systems. Self-contained appendixes cover matrix theory, linear systems, the Pontryagin minimum principle, Lyapunov stability, and the Riccati equation. Newly added to this Dover edition is a complete solutions manual for the problems appearing at the conclusion of each section.
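
As a purely illustrative sketch (not taken from the book's text), the regulator gain described above can be computed numerically by solving the algebraic Riccati equation; the plant and weighting matrices used here are arbitrary placeholders:

    # Illustrative LQR sketch: solve the algebraic Riccati equation and form
    # the state-feedback gain u = -K x. A, B, Q, R are made-up placeholders.
    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[0.0, 1.0],
                  [0.0, -0.5]])      # simple second-order plant
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([1.0, 0.1])          # state weighting (positive semidefinite)
    R = np.array([[0.01]])           # control weighting (positive definite)

    # P solves A'P + PA - P B R^{-1} B' P + Q = 0
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)  # optimal gain K = R^{-1} B' P

    # The closed-loop matrix A - B K should be Hurwitz.
    print("K =", K)
    print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))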

464 pages, Paperback

First published January 1, 1990

About the author

Brian D.O. Anderson

7 books · 2 followers


Community Reviews

5 stars: 0 (0%)
4 stars: 6 (85%)
3 stars: 1 (14%)
2 stars: 0 (0%)
1 star: 0 (0%)
No one has reviewed this book yet.
