Probability, Random Processes, and Statistical Analysis: Applications to Communications, Signal Processing, Queueing Theory and Mathematical Finance.
Covers the fundamental topics together with advanced theories, including the EM algorithm, hidden Markov models, and queueing and loss systems.
| Classification: | Electronic Book |
|---|---|
| Main Author: | |
| Other Authors: | |
| Format: | Electronic eBook |
| Language: | English |
| Published: | Cambridge : Cambridge University Press, 2011. |
| Subjects: | |
| Online Access: | Full text |
Table of Contents:
- Cover; Probability, Random Processes, and Statistical Analysis; Title; Copyright; Contents; Abbreviations and Acronyms; Preface; Organization of the book; Suggested course plans; Supplementary materials; Solution manuals; Lecture slides; Matlab exercises and programs; Acknowledgments; 1 Introduction; 1.1 Why study probability, random processes, and statistical analysis?; 1.1.1 Communications, information, and control systems; 1.1.2 Signal processing; 1.1.3 Machine learning; 1.1.4 Biostatistics, bioinformatics, and related fields; 1.1.5 Econometrics and mathematical finance.
- 1.1.6 Queueing and loss systems; 1.1.7 Other application domains; 1.2 History and overview; 1.2.1 Classical probability theory; 1.2.2 Modern probability theory; 1.2.3 Random processes; 1.2.3.1 Poisson process to Markov process; 1.2.3.2 Brownian motion to Itô process; 1.2.4 Statistical analysis and inference; 1.2.4.1 Frequentist statistics versus Bayesian statistics; 1.3 Discussion and further reading; Part I Probability, random variables, and statistics; 2 Probability; 2.1 Randomness in the real world; 2.1.1 Repeated experiments and statistical regularity.
- 2.1.2 Random experiments and relative frequencies; 2.2 Axioms of probability; 2.2.1 Sample space; 2.2.2 Event; 2.2.3 Probability measure; 2.2.4 Properties of probability measure; 2.3 Bernoulli trials and Bernoulli's theorem; 2.4 Conditional probability, Bayes' theorem, and statistical independence; 2.4.1 Joint probability and conditional probability; 2.4.2 Bayes' theorem; 2.4.2.1 Frequentist probabilities and Bayesian probabilities; 2.4.3 Statistical independence of events; 2.5 Summary of Chapter 2; 2.6 Discussion and further reading; 2.7 Problems; 3 Discrete random variables.
- 3.1 Random variables; 3.1.1 Distribution function; 3.1.2 Two random variables and joint distribution function; 3.2 Discrete random variables and probability distributions; 3.2.1 Joint and conditional probability distributions; 3.2.2 Moments, central moments, and variance; 3.2.3 Covariance and correlation coefficient; 3.3 Important probability distributions; 3.3.1 Bernoulli distribution and binomial distribution; 3.3.2 Geometric distribution; 3.3.3 Poisson distribution; 3.3.4 Negative binomial (or Pascal) distribution; 3.3.4.1 Shifted negative binomial distribution.
- 3.3.5 Zipf's law and zeta distribution; 3.3.5.1 Euler and Riemann zeta functions; 3.4 Summary of Chapter 3; 3.5 Discussion and further reading; 3.6 Problems; 4 Continuous random variables; 4.1 Continuous random variables; 4.1.1 Distribution function and probability density function; 4.1.2 Expectation, moments, central moments, and variance; 4.2 Important continuous random variables and their distributions; 4.2.1 Uniform distribution; 4.2.2 Exponential distribution; 4.2.3 Gamma distribution; 4.2.4 Normal (or Gaussian) distribution; 4.2.4.1 Moments of the unit normal distribution.
- 4.2.4.2 The normal approximation to the binomial distribution and the De Moivre-Laplace limit theorem.