Optimal Design of Control Systems: Stochastic and Deterministic Problems
This groundbreaking reference/text covers design methods for optimal (or quasi-optimal) control algorithms in the form of synthesis for deterministic and stochastic dynamical systems, with applications to biological, radio engineering, mechanical, and servomechanical technologies.
| Main Author: | |
|---|---|
| Format: | Book |
| Language: | English |
| Published: | New York : Marcel Dekker Inc., 1999 |
| Series: | Monographs and textbooks in pure and applied mathematics; 221 |
| Subjects: | |
Table of Contents:
- Synthesis problems for control systems and the dynamic programming approach
- Exact methods for synthesis problems
- Approximate synthesis of stochastic control systems with small control actions
- Synthesis of quasioptimal systems in the case of small diffusion terms in the Bellman equation
- Control of oscillatory systems
- Some special applications of asymptotic synthesis methods
- Numerical synthesis methods


