Dynamic programming and its application to optimal control
In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrang...
Classification: | Electronic book
---|---
Format: | Electronic eBook
Language: | English; French
Published: | New York : Academic Press, 1971.
Series: | Mathematics in science and engineering ; v. 81.
Online access: | Full text
Table of Contents:
- Pt. I. Discrete deterministic processes. The principles of dynamic programming
- Processes with bounded horizon
- Processes with infinite or unspecified horizon
- Practical solution of the optimal recurrence relation (see the illustrative sketch after this contents list)
- Pt. II. Discrete random processes. General theory
- Processes with discrete states
- Pt. III. Numerical synthesis of the optimal controller for a linear process. General discussion of the problem
- Numerical optimal control of a measurable deterministic process
- Numerical optimal control of a stochastic process
- Pt. IV. Continuous processes. Continuous deterministic processes
- Continuous stochastic processes
- Pt. V. Applications. Introductory example
- Minimum use of control effort in a first-order system
- Optimal tabulation of functions
- Regulation of angular position with minimization of a quadratic criterion
- Control of a stochastic system
- Minimum-time depth change of a submersible vehicle
- Optimal interception
- Control of a continuous process
- Filtering.
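
The recurring theme of Parts I and V above is the optimal recurrence relation of dynamic programming, evaluated backward over a bounded horizon and tabulated on a discretized state space. The sketch below is an illustration of that idea only, not code from the book: the scalar process model x[k+1] = a*x[k] + b*u[k], the quadratic stage cost, and all parameter values are assumptions chosen for the example.

```python
# Illustrative sketch (not from the book): finite-horizon dynamic programming
# for a scalar linear process x[k+1] = a*x[k] + b*u[k] with a quadratic cost,
# solved backward on a discretized state/control grid.
import numpy as np

a, b = 0.9, 1.0          # assumed process parameters
q, r = 1.0, 0.1          # assumed state and control weights in the stage cost
N = 20                   # bounded horizon

x_grid = np.linspace(-2.0, 2.0, 201)   # discretized state space
u_grid = np.linspace(-1.0, 1.0, 41)    # discretized control space

V = np.zeros_like(x_grid)              # terminal cost V_N(x) = 0
policy = np.zeros((N, x_grid.size))    # tabulated optimal control per stage/state

for k in range(N - 1, -1, -1):
    V_new = np.empty_like(V)
    for i, x in enumerate(x_grid):
        # Stage cost plus interpolated cost-to-go at the successor state,
        # evaluated for every admissible control (the optimal recurrence relation).
        x_next = a * x + b * u_grid
        cost_to_go = np.interp(x_next, x_grid, V)
        total = q * x**2 + r * u_grid**2 + cost_to_go
        j = np.argmin(total)
        V_new[i] = total[j]
        policy[k, i] = u_grid[j]
    V = V_new

# Simulate the closed loop from an initial state using the tabulated policy.
x = 1.5
for k in range(N):
    u = np.interp(x, x_grid, policy[k])
    x = a * x + b * u
print("final state:", x)
```

Tabulating the cost-to-go on a grid and interpolating between grid points is the simplest practical route to the kind of numerical synthesis discussed in Part III; a finer grid trades computation time for accuracy.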