Mastering probabilistic graphical models using Python : master probabilistic graphical models by learning through real-world problems and illustrative code examples in Python /

If you are a researcher, a machine learning enthusiast, or a data science practitioner with a basic understanding of Bayesian learning or probabilistic graphical models, this book will help you understand the details of graphical models and apply them to your data science problems.

Bibliographic Details
Classification: Electronic Book
Main Authors: Ankan, Ankur (Author), Panda, Abinash (Author)
Format: Electronic eBook
Language: English
Published: Birmingham, UK : Packt Publishing, 2015.
Series: Community experience distilled.
Subjects:
Online Access: Full text (requires prior registration with an institutional email)
Table of Contents:
  • Cover; Copyright; Credits; About the Authors; About the Reviewers; www.PacktPub.com; Table of Contents; Preface
  • Chapter 1: Bayesian Network Fundamentals; Probability theory; Random variable; Independence and conditional independence; Installing tools; IPython; pgmpy; Representing independencies using pgmpy; Representing joint probability distributions using pgmpy; Conditional probability distribution; Representing CPDs using pgmpy; Graph theory; Nodes and edges; Walk, paths, and trails; Bayesian models; Representation; Factorization of a distribution over a network; Implementing Bayesian networks using pgmpy; Bayesian model representation; Reasoning pattern in Bayesian networks; D-separation; Direct connection; Indirect connection; Relating graphs and distributions; IMAP; IMAP to factorization; CPD representations; Deterministic CPDs; Context-specific CPDs; Tree CPD; Rule CPD; Summary
  • Chapter 2: Markov Network Fundamentals; Introducing the Markov network; Parameterizing a Markov network; Factor; Factor operations; Gibbs distributions and Markov networks; The factor graph; Independencies in Markov networks; Constructing graphs from distributions; Bayesian networks and Markov networks; Converting Bayesian models into Markov models; Converting Markov models into Bayesian models; Chordal graphs; Summary
  • Chapter 3: Inference – Asking Questions to Models; Inference; Complexity of inference; Variable elimination; Analysis of variable elimination; Finding elimination ordering; Using the chordal graph property of induced graphs; Minimum fill/size/weight/search; Belief propagation; Clique tree; Constructing a clique tree; Message passing; Clique tree calibration; Message passing with division; Factor division; Querying variables that are not in the same cluster; MAP using variable elimination; Factor maximization; MAP using belief propagation; Finding the most probable assignment; Predictions from the model using pgmpy; A comparison of variable elimination and belief propagation; Summary
  • Chapter 4: Approximate Inference; The optimization problem; The energy function; Exact inference as an optimization; The propagation-based approximation algorithm; Cluster graph belief propagation; Constructing cluster graphs; Pairwise Markov networks; Bethe cluster graph; Propagation with approximate messages; Message creation; Inference with approximate messages; Sum-product expectation propagation; Belief update propagation; Sampling-based approximate methods; Forward sampling; Conditional probability distribution; Likelihood weighting and importance sampling; Importance sampling; Importance sampling in Bayesian networks; Computing marginal probabilities; Ratio likelihood weighting; Normalized likelihood weighting; Markov chain Monte Carlo methods; Gibbs sampling; Markov chains; The multiple transitioning model; Using a Markov chain; Collapsed particles; Collapsed importance sampling; Summary
  • Chapter 5: Model Learning – Parameter Estimation in Bayesian Networks
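To give a flavor of the material covered in Chapter 1 (factorization of a distribution over a network, and inference on it), here is a minimal sketch in plain Python. The network structure, variable names, and CPD numbers are illustrative inventions, not taken from the book, and the book itself works with the pgmpy library rather than hand-rolled dictionaries: the classic "wet grass" Bayesian network, where the joint distribution factorizes as P(R, S, G) = P(R) · P(S | R) · P(G | S, R), queried by simple enumeration.

```python
from itertools import product

# Toy Bayesian network: Rain -> Sprinkler, Rain -> Grass, Sprinkler -> Grass.
# All CPD values below are made up for illustration.
p_rain = {0: 0.8, 1: 0.2}                                   # P(R)
p_sprinkler = {0: {0: 0.6, 1: 0.4},                          # P(S | R), keyed by rain
               1: {0: 0.99, 1: 0.01}}
p_grass = {(0, 0): {0: 1.0, 1: 0.0},                         # P(G | S, R), keyed by
           (0, 1): {0: 0.2, 1: 0.8},                         # (sprinkler, rain)
           (1, 0): {0: 0.1, 1: 0.9},
           (1, 1): {0: 0.01, 1: 0.99}}

def joint(r, s, g):
    """Probability of one full assignment via the network factorization."""
    return p_rain[r] * p_sprinkler[r][s] * p_grass[(s, r)][g]

def posterior_rain_given_wet_grass():
    """P(Rain = 1 | Grass = 1), summing out Sprinkler (inference by enumeration)."""
    numerator = sum(joint(1, s, 1) for s in (0, 1))
    evidence = sum(joint(r, s, 1) for r, s in product((0, 1), repeat=2))
    return numerator / evidence

print(round(posterior_rain_given_wet_grass(), 3))  # ≈ 0.358
```

Enumeration like this is exponential in the number of variables; the variable elimination and belief propagation algorithms of Chapter 3 exploit the same factorization to avoid that blow-up.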