LEADER 00000cam a2200000Mi 4500
001    EBOOKCENTRAL_on1080997625
003    OCoLC
005    20240329122006.0
006    m o d
007    cr |n|---|||||
008    190105s2018 enk o 000 0 eng d
040 __ |a EBLCP |b eng |e pn |c EBLCP |d MERUC |d YDX |d CHVBK |d OCLCQ |d LOA |d K6U |d OCLCQ |d OCLCL
019 __ |a 1080585692
020 __ |a 9781789951721
020 __ |a 1789951720
029 1_ |a AU@ |b 000065067258
029 1_ |a CHNEW |b 001039967
029 1_ |a CHVBK |b 559036310
035 __ |a (OCoLC)1080997625 |z (OCoLC)1080585692
050 _4 |a QA76.73.P98 |b .B663 2018
082 04 |a 005.133 |2 23
049 __ |a UAMI
100 1_ |a Bonaccorso, Giuseppe.
245 10 |a Python : |b Expert Machine Learning Systems and Intelligent Agents Using Python.
260 __ |a Birmingham : |b Packt Publishing Ltd, |c 2018.
300 __ |a 1 online resource (748 pages)
336 __ |a text |b txt |2 rdacontent
337 __ |a computer |b c |2 rdamedia
338 __ |a online resource |b cr |2 rdacarrier
588 0_ |a Print version record.
505 0_ |a Cover; Title Page; Copyright and Credits; About Packt; Contributors; Table of Contents; Preface; Chapter 1: Machine Learning Model Fundamentals; Models and data; Zero-centering and whitening; Training and validation sets; Cross-validation; Features of a machine learning model; Capacity of a model; Vapnik-Chervonenkis capacity; Bias of an estimator; Underfitting; Variance of an estimator; Overfitting; The Cramér-Rao bound; Loss and cost functions; Examples of cost functions; Mean squared error; Huber cost function; Hinge cost function; Categorical cross-entropy; Regularization; Ridge; Lasso
505 8_ |a ElasticNet; Early stopping; Summary; Chapter 2: Introduction to Semi-Supervised Learning; Semi-supervised scenario; Transductive learning; Inductive learning; Semi-supervised assumptions; Smoothness assumption; Cluster assumption; Manifold assumption; Generative Gaussian mixtures; Example of a generative Gaussian mixture; Weighted log-likelihood; Contrastive pessimistic likelihood estimation; Example of contrastive pessimistic likelihood estimation; Semi-supervised Support Vector Machines (S3VM); Example of S3VM; Transductive Support Vector Machines (TSVM); Example of TSVM; Summary
505 8_ |a Chapter 3: Graph-Based Semi-Supervised Learning; Label propagation; Example of label propagation; Label propagation in Scikit-Learn; Label spreading; Example of label spreading; Label propagation based on Markov random walks; Example of label propagation based on Markov random walks; Manifold learning; Isomap; Example of Isomap; Locally linear embedding; Example of locally linear embedding; Laplacian Spectral Embedding; Example of Laplacian Spectral Embedding; t-SNE; Example of t-distributed stochastic neighbor embedding; Summary; Chapter 4: Bayesian Networks and Hidden Markov Models
505 8_ |a Conditional probabilities and Bayes' theorem; Bayesian networks; Sampling from a Bayesian network; Direct sampling; Example of direct sampling; A gentle introduction to Markov chains; Gibbs sampling; Metropolis-Hastings sampling; Example of Metropolis-Hastings sampling; Sampling example using PyMC3; Hidden Markov Models (HMMs); Forward-backward algorithm; Forward phase; Backward phase; HMM parameter estimation; Example of HMM training with hmmlearn; Viterbi algorithm; Finding the most likely hidden state sequence with hmmlearn; Summary; Chapter 5: EM Algorithm and Applications
505 8_ |a MLE and MAP learning; EM algorithm; An example of parameter estimation; Gaussian mixture; An example of Gaussian Mixtures using Scikit-Learn; Factor analysis; An example of factor analysis with Scikit-Learn; Principal Component Analysis; An example of PCA with Scikit-Learn; Independent component analysis; An example of FastICA with Scikit-Learn; Addendum to HMMs; Summary; Chapter 6: Hebbian Learning and Self-Organizing Maps; Hebb's rule; Analysis of the covariance rule; Example of covariance rule application; Weight vector stabilization and Oja's rule; Sanger's network
500 __ |a Example of Sanger's network
520 __ |a This Learning Path is your complete guide to quickly getting to grips with popular machine learning algorithms. You'll be introduced to the most widely used algorithms in supervised, unsupervised, and semi-supervised machine learning, and learn how to use them in the best possible manner. Ranging from Bayesian models to the MCMC algorithm to ...
590 __ |a ProQuest Ebook Central |b Ebook Central Academic Complete
650 _0 |a Python.
700 1_ |a Fandango, Armando.
700 1_ |a Shanmugamani, Rajalingappaa.
758 __ |i has work: |a Python (Work) |1 https://id.oclc.org/worldcat/entity/E39PCXmGTm4h9JYDpW6Dkj9TBP |4 https://id.oclc.org/worldcat/ontology/hasWork
776 08 |i Print version: |a Bonaccorso, Giuseppe. |t Python: Advanced Guide to Artificial Intelligence. |d Birmingham : Packt Publishing Ltd, ©2018 |z 9781789957211
856 40 |u https://ebookcentral.uam.elogim.com/lib/uam-ebooks/detail.action?docID=5626921 |z Full text
938 __ |a ProQuest Ebook Central |b EBLB |n EBL5626921
938 __ |a YBP Library Services |b YANK |n 15914148
994 __ |a 92 |b IZTAP