
Optimal Control Theory: An Introduction


ISBN-10: 0486434842

ISBN-13: 9780486434841

Edition: 2004

Authors: Donald E. Kirk


Description:

Geared toward upper-level undergraduates, this text introduces three aspects of optimal control theory: dynamic programming, Pontryagin’s minimum principle, and numerical techniques for trajectory optimization. The text focuses first on describing systems and evaluating their performance; it then treats dynamic programming, followed by the calculus of variations and Pontryagin’s minimum principle. Finally, it examines iterative numerical techniques for finding optimal controls and trajectories. Numerous problems, which introduce additional topics and illustrate basic concepts, appear throughout the text. 1970 ed. 131 figures. 14 tables. Index. Solution guide available upon…
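
For orientation, the basic problem the book treats can be stated compactly. The sketch below is a generic statement of the continuous-time optimal control problem, not an excerpt from the book; the symbols (state x, control u, cost terms h and g, dynamics a, final time t_f) follow common textbook convention.

```latex
% Generic continuous-time optimal control problem (standard textbook
% form; symbols are conventional, not quoted from this listing).
% Find an admissible control u(t) that minimizes the cost J:
\[
  J = h\bigl(x(t_f), t_f\bigr)
      + \int_{t_0}^{t_f} g\bigl(x(t), u(t), t\bigr)\, dt
\]
% subject to the state equation and initial condition:
\[
  \dot{x}(t) = a\bigl(x(t), u(t), t\bigr), \qquad x(t_0) = x_0 .
\]
```

The three approaches named in the description attack this same problem from different angles: dynamic programming characterizes the optimal cost through the Hamilton-Jacobi-Bellman equation, the minimum principle yields necessary conditions on an optimal control, and the iterative numerical techniques solve the resulting two-point boundary-value problems.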

Book details

List price: $35.00
Copyright year: 2004
Publisher: Dover Publications, Incorporated
Publication date: 4/30/2004
Binding: Paperback
Pages: 480
Size: 5.35" wide x 8.46" long x 0.98" tall
Weight: 1.034 lbs
Language: English

Table of contents

Describing the System and Evaluating Its Performance
    Introduction
        Problem Formulation
        State Variable Representation of Systems
        Concluding Remarks
        References
        Problems
    The Performance Measure
        Performance Measures for Optimal Control Problems
        Selecting a Performance Measure
        Selection of a Performance Measure: The Carrier Landing of a Jet Aircraft
        References
        Problems
Dynamic Programming
    Dynamic Programming
        The Optimal Control Law
        The Principle of Optimality
        Application of the Principle of Optimality to Decision-Making
        Dynamic Programming Applied to a Routing Problem
        An Optimal Control System
        Interpolation
        A Recurrence Relation of Dynamic Programming
        Computational Procedure for Solving Control Problems
        Characteristics of Dynamic Programming Solution
        Analytical Results--Discrete Linear Regulator Problems
        The Hamilton-Jacobi-Bellman Equation
        Continuous Linear Regulator Problems
        The Hamilton-Jacobi-Bellman Equation--Some Observations
        Summary
        References
        Problems
The Calculus of Variations and Pontryagin's Minimum Principle
    The Calculus of Variations
        Fundamental Concepts
        Functionals of a Single Function
        Functionals Involving Several Independent Functions
        Piecewise-Smooth Extremals
        Constrained Extrema
        Summary
        References
        Problems
    The Variational Approach to Optimal Control Problems
        Necessary Conditions for Optimal Control
        Linear Regulator Problems
        Pontryagin's Minimum Principle and State Inequality Constraints
        Minimum-Time Problems
        Minimum Control-Effort Problems
        Singular Intervals in Optimal Control Problems
        Summary and Conclusions
        References
        Problems
Iterative Numerical Techniques for Finding Optimal Controls and Trajectories
    Numerical Determination of Optimal Trajectories
        Two-Point Boundary-Value Problems
        The Method of Steepest Descent
        Variation of Extremals
        Quasilinearization
        Summary of Iterative Techniques for Solving Two-Point Boundary-Value Problems
        Gradient Projection
        References
        Problems
Conclusion
    Summation
        The Relationship Between Dynamic Programming and the Minimum Principle
        Summary
        Controller Design
        Conclusion
        References
Appendices
    Useful Matrix Properties and Definitions
    Difference Equation Representation of Linear Sampled-Data Systems
    Special Types of Euler Equations
    Answers to Selected Problems
Index