Markov Chains: Theory and Applications
Markov chains are a fundamental class of stochastic processes. They are widely used to model problems in many domains, such as operational research, computer science, communication networks, and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use,...
| Classification: | eBook |
|---|---|
| Main author: | |
| Format: | Electronic eBook |
| Language: | English |
| Published: | Wiley-ISTE, 2013 |
| Topics: | |
| Online access: | Full text |
Table of Contents:
- Cover; Title Page; Contents; Preface; Chapter 1. Discrete-Time Markov Chains; 1.1. Definitions and properties; 1.2. Strong Markov property; 1.3. Recurrent and transient states; 1.4. State classification; 1.5. Visits to a state; 1.6. State space decomposition; 1.7. Irreducible and recurrent Markov chains; 1.8. Aperiodic Markov chains; 1.9. Convergence to equilibrium; 1.10. Ergodic theorem; 1.11. First passage times and number of visits; 1.11.1. First passage time to a state; 1.11.2. First passage time to a subset of states; 1.11.3. Expected number of visits; 1.12. Finite Markov chains.
- 1.13. Absorbing Markov chains; 1.14. Examples; 1.14.1. Two-state chain; 1.14.2. Gambler's ruin; 1.14.3. Success runs; 1.15. Bibliographical notes; Chapter 2. Continuous-Time Markov Chains; 2.1. Definitions and properties; 2.2. Transition functions and infinitesimal generator; 2.3. Kolmogorov's backward equation; 2.4. Kolmogorov's forward equation; 2.5. Existence and uniqueness of the solutions; 2.6. Recurrent and transient states; 2.7. State classification; 2.8. Explosion; 2.9. Irreducible and recurrent Markov chains; 2.10. Convergence to equilibrium; 2.11. Ergodic theorem.
- 2.12. First passage times; 2.12.1. First passage time to a state; 2.12.2. First passage time to a subset of states; 2.13. Absorbing Markov chains; 2.14. Bibliographical notes; Chapter 3. Birth-and-Death Processes; 3.1. Discrete-time birth-and-death processes; 3.2. Absorbing discrete-time birth-and-death processes; 3.2.1. Passage times and convergence to equilibrium; 3.2.2. Expected number of visits; 3.3. Periodic discrete-time birth-and-death processes; 3.4. Continuous-time pure birth processes; 3.5. Continuous-time birth-and-death processes; 3.5.1. Explosion; 3.5.2. Positive recurrence.
- 3.5.3. First passage time; 3.5.4. Explosive chain having an invariant probability; 3.5.5. Explosive chain without invariant probability; 3.5.6. Positive or null recurrent embedded chain; 3.6. Absorbing continuous-time birth-and-death processes; 3.6.1. Passage times and convergence to equilibrium; 3.6.2. Explosion; 3.7. Bibliographical notes; Chapter 4. Uniformization; 4.1. Introduction; 4.2. Banach spaces and algebra; 4.3. Infinite matrices and vectors; 4.4. Poisson process; 4.4.1. Order statistics; 4.4.2. Weighted Poisson distribution computation; 4.4.3. Truncation threshold computation.
- 4.5. Uniformizable Markov chains; 4.6. First passage time to a subset of states; 4.7. Finite Markov chains; 4.8. Transient regime; 4.8.1. State probabilities computation; 4.8.2. First passage time distribution computation; 4.8.3. Application to birth-and-death processes; 4.9. Bibliographical notes; Chapter 5. Queues; 5.1. The M/M/1 queue; 5.1.1. State probabilities; 5.1.2. Busy period distribution; 5.2. The M/M/c queue; 5.3. The M/M/∞ queue; 5.4. Phase-type distributions; 5.5. Markovian arrival processes; 5.5.1. Definition and transient regime.
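The two-state chain of section 1.14.1 is the simplest illustration of convergence to equilibrium (sections 1.9-1.10): for transition matrix P = [[1-a, a], [b, 1-b]] with 0 < a, b < 1, the stationary distribution is π = (b/(a+b), a/(a+b)), and iterating any initial distribution converges to it. A minimal Python sketch of this (illustrative only, not taken from the book; the parameter values a = 0.3, b = 0.1 are arbitrary):

```python
# Two-state discrete-time Markov chain with transition matrix
#   P = [[1-a, a],
#        [ b, 1-b]],   0 < a, b < 1  (irreducible and aperiodic).

def two_state_stationary(a, b):
    """Closed-form stationary distribution pi = (b/(a+b), a/(a+b))."""
    return (b / (a + b), a / (a + b))

def iterate_distribution(a, b, pi0, n):
    """Push an initial distribution pi0 = (p0, p1) through n steps: pi_{k+1} = pi_k P."""
    p0, p1 = pi0
    for _ in range(n):
        p0, p1 = p0 * (1 - a) + p1 * b, p0 * a + p1 * (1 - b)
    return (p0, p1)

if __name__ == "__main__":
    a, b = 0.3, 0.1                                   # arbitrary example rates
    pi = two_state_stationary(a, b)                   # (0.25, 0.75)
    approx = iterate_distribution(a, b, (1.0, 0.0), 200)
    print(pi, approx)
```

Starting from the point mass (1, 0), the iterates approach π geometrically, with rate governed by the second eigenvalue 1 - a - b of P.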