
Markov Chains: Theory and Applications

Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use,...


Bibliographic Details
Classification: Electronic Book
Main Author: Sericola, Bruno
Format: Electronic eBook
Language: English
Published: Wiley-ISTE, 2013.
Subjects: Markov processes
Online Access: Full text

MARC

LEADER 00000cam a2200000Ma 4500
001 EBOOKCENTRAL_ocn856625417
003 OCoLC
005 20240329122006.0
006 m o d
007 cr |n|||||||||
008 130823s2013 xx o 000 0 eng d
040 |a IDEBK  |b eng  |e pn  |c IDEBK  |d CDX  |d OCLCQ  |d EBLCP  |d MHW  |d AZU  |d OCLCO  |d OCLCF  |d OCLCQ  |d DEBSZ  |d DEBBG  |d OCLCQ  |d ZCU  |d MERUC  |d OCLCQ  |d ICG  |d AU@  |d OCLCQ  |d DKC  |d OCLCQ  |d OCLCO  |d OCLCQ  |d OCLCO 
020 |a 1299805175  |q (ebk) 
020 |a 9781299805170  |q (ebk) 
020 |a 9781118731444 
020 |a 1118731441 
029 1 |a AU@  |b 000052901295 
029 1 |a DEBBG  |b BV041910011 
029 1 |a DEBBG  |b BV044177907 
029 1 |a DEBSZ  |b 431523371 
035 |a (OCoLC)856625417 
037 |a 511768  |b MIL 
050 4 |a QA 
082 0 4 |a 519.233 
049 |a UAMI 
100 1 |a Sericola, Bruno. 
245 1 0 |a Markov Chains :  |b Theory and Applications. 
260 |b Wiley-ISTE,  |c 2013. 
300 |a 1 online resource 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
588 0 |a Print version record. 
520 |a Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of many metrics of interest. The author presents the theory of both discrete-time and continuous-time homogeneous Markov chains. He carefully examines the explosion phenomenon. 
505 0 |a Cover; Title Page; Contents; Preface; Chapter 1. Discrete-Time Markov Chains; 1.1. Definitions and properties; 1.2. Strong Markov property; 1.3. Recurrent and transient states; 1.4. State classification; 1.5. Visits to a state; 1.6. State space decomposition; 1.7. Irreducible and recurrent Markov chains; 1.8. Aperiodic Markov chains; 1.9. Convergence to equilibrium; 1.10. Ergodic theorem; 1.11. First passage times and number of visits; 1.11.1. First passage time to a state; 1.11.2. First passage time to a subset of states; 1.11.3. Expected number of visits; 1.12. Finite Markov chains. 
505 8 |a 1.13. Absorbing Markov chains; 1.14. Examples; 1.14.1. Two-state chain; 1.14.2. Gambler's ruin; 1.14.3. Success runs; 1.15. Bibliographical notes; Chapter 2. Continuous-Time Markov Chains; 2.1. Definitions and properties; 2.2. Transition functions and infinitesimal generator; 2.3. Kolmogorov's backward equation; 2.4. Kolmogorov's forward equation; 2.5. Existence and uniqueness of the solutions; 2.6. Recurrent and transient states; 2.7. State classification; 2.8. Explosion; 2.9. Irreducible and recurrent Markov chains; 2.10. Convergence to equilibrium; 2.11. Ergodic theorem. 
505 8 |a 2.12. First passage times; 2.12.1. First passage time to a state; 2.12.2. First passage time to a subset of states; 2.13. Absorbing Markov chains; 2.14. Bibliographical notes; Chapter 3. Birth-and-Death Processes; 3.1. Discrete-time birth-and-death processes; 3.2. Absorbing discrete-time birth-and-death processes; 3.2.1. Passage times and convergence to equilibrium; 3.2.2. Expected number of visits; 3.3. Periodic discrete-time birth-and-death processes; 3.4. Continuous-time pure birth processes; 3.5. Continuous-time birth-and-death processes; 3.5.1. Explosion; 3.5.2. Positive recurrence. 
505 8 |a 3.5.3. First passage time; 3.5.4. Explosive chain having an invariant probability; 3.5.5. Explosive chain without invariant probability; 3.5.6. Positive or null recurrent embedded chain; 3.6. Absorbing continuous-time birth-and-death processes; 3.6.1. Passage times and convergence to equilibrium; 3.6.2. Explosion; 3.7. Bibliographical notes; Chapter 4. Uniformization; 4.1. Introduction; 4.2. Banach spaces and algebra; 4.3. Infinite matrices and vectors; 4.4. Poisson process; 4.4.1. Order statistics; 4.4.2. Weighted Poisson distribution computation; 4.4.3. Truncation threshold computation. 
505 8 |a 4.5. Uniformizable Markov chains; 4.6. First passage time to a subset of states; 4.7. Finite Markov chains; 4.8. Transient regime; 4.8.1. State probabilities computation; 4.8.2. First passage time distribution computation; 4.8.3. Application to birth-and-death processes; 4.9. Bibliographical notes; Chapter 5. Queues; 5.1. The M/M/1 queue; 5.1.1. State probabilities; 5.1.2. Busy period distribution; 5.2. The M/M/c queue; 5.3. The M/M/∞ queue; 5.4. Phase-type distributions; 5.5. Markovian arrival processes; 5.5.1. Definition and transient regime. 
590 |a ProQuest Ebook Central  |b Ebook Central Academic Complete 
650 0 |a Markov processes. 
650 6 |a Processus de Markov. 
650 7 |a Markov processes  |2 fast 
776 0 8 |i Print version:  |z 9781299805170 
856 4 0 |u https://ebookcentral.uam.elogim.com/lib/uam-ebooks/detail.action?docID=1441764  |z Texto completo 
938 |a Coutts Information Services  |b COUT  |n 26034272 
938 |a EBL - Ebook Library  |b EBLB  |n EBL1441764 
938 |a ProQuest MyiLibrary Digital eBook Collection  |b IDEB  |n cis26034272 
994 |a 92  |b IZTAP