Hands-on ensemble learning with Python : build highly optimized ensemble machine learning models using scikit-learn and Keras

Ensemble learning provides methods for improving the accuracy and performance of existing models. In this book, you'll learn how to combine different machine learning algorithms to produce more accurate results from your models.
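As a concrete illustration of that idea, here is a minimal sketch, assuming scikit-learn is installed: three different classifiers are combined into a soft-voting ensemble that averages their predicted class probabilities. The dataset and estimator choices are illustrative, not taken from the book.

```python
# Minimal sketch of combining different algorithms into one ensemble
# (illustrative choices, not code from the book).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Soft voting averages the base learners' predicted class probabilities.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=5000)),
        ("knn", KNeighborsClassifier()),
        ("tree", DecisionTreeClassifier(random_state=0)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```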

Bibliographic Details
Classification: Electronic book
Main authors: Kyriakides, George (Author), Margaritis, Konstantinos G. (Author)
Format: Electronic eBook
Language: English
Published: Birmingham, UK : Packt Publishing, [2019]
Subjects:
Online access: Full text
Table of Contents (illustrative Python sketches for selected chapters follow the list):
  • Cover; Title Page; Copyright and Credits; About Packt; Contributors; Table of Contents; Preface; Section 1: Introduction and Required Software Tools; Chapter 1: A Machine Learning Refresher; Technical requirements; Learning from data; Popular machine learning datasets; Diabetes; Breast cancer; Handwritten digits; Supervised and unsupervised learning; Supervised learning; Unsupervised learning; Dimensionality reduction; Performance measures; Cost functions; Mean absolute error; Mean squared error; Cross entropy loss; Metrics; Classification accuracy; Confusion matrix
  • Sensitivity, specificity, and area under the curve; Precision, recall, and the F1 score; Evaluating models; Machine learning algorithms; Python packages; Supervised learning algorithms; Regression; Support vector machines; Neural networks; Decision trees; K-Nearest Neighbors; K-means; Summary; Chapter 2: Getting Started with Ensemble Learning; Technical requirements; Bias, variance, and the trade-off; What is bias?; What is variance?; Trade-off; Ensemble learning; Motivation; Identifying bias and variance; Validation curves; Learning curves; Ensemble methods; Difficulties in ensemble learning
  • Weak or noisy data; Understanding interpretability; Computational cost; Choosing the right models; Summary; Section 2: Non-Generative Methods; Chapter 3: Voting; Technical requirements; Hard and soft voting; Hard voting; Soft voting; Python implementation; Custom hard voting implementation; Analyzing our results using Python; Using scikit-learn; Hard voting implementation; Soft voting implementation; Analyzing our results; Summary; Chapter 4: Stacking; Technical requirements; Meta-learning; Stacking; Creating metadata; Deciding on an ensemble's composition; Selecting base learners
  • Selecting the meta-learner; Python implementation; Stacking for regression; Stacking for classification; Creating a stacking regressor class for scikit-learn; Summary; Section 3: Generative Methods; Chapter 5: Bagging; Technical requirements; Bootstrapping; Creating bootstrap samples; Bagging; Creating base learners; Strengths and weaknesses; Python implementation; Implementation; Parallelizing the implementation; Using scikit-learn; Bagging for classification; Bagging for regression; Summary; Chapter 6: Boosting; Technical requirements; AdaBoost; Weighted sampling; Creating the ensemble
  • Implementing AdaBoost in Python; Strengths and weaknesses; Gradient boosting; Creating the ensemble; Further reading; Implementing gradient boosting in Python; Using scikit-learn; Using AdaBoost; Using gradient boosting; XGBoost; Using XGBoost for regression; Using XGBoost for classification; Other boosting libraries; Summary; Chapter 7: Random Forests; Technical requirements; Understanding random forest trees; Building trees; Illustrative example; Extra trees; Creating forests; Analyzing forests; Strengths and weaknesses; Using scikit-learn; Random forests for classification
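The sketches below illustrate a few of the chapter topics; none of them reproduce the book's own code. First, the Chapter 1 performance measures (accuracy, the confusion matrix, cross-entropy loss, and the MAE/MSE cost functions), assuming scikit-learn's metrics module and invented toy predictions:

```python
# Chapter 1 refresher: common metrics and cost functions on toy values
# (the numbers are invented for illustration).
import numpy as np
from sklearn.metrics import (accuracy_score, confusion_matrix, log_loss,
                             mean_absolute_error, mean_squared_error)

# Classification metrics compare predicted labels against true labels.
y_true = np.array([0, 0, 1, 1, 1, 0])
y_pred = np.array([0, 1, 1, 1, 0, 0])
print("accuracy:", accuracy_score(y_true, y_pred))
print("confusion matrix:\n", confusion_matrix(y_true, y_pred))

# Cross-entropy (log loss) scores predicted probabilities, not labels.
y_prob = np.array([0.1, 0.6, 0.8, 0.9, 0.4, 0.2])
print("cross entropy:", log_loss(y_true, y_prob))

# Regression cost functions.
t_true = np.array([2.0, 3.5, 4.0])
t_pred = np.array([2.5, 3.0, 4.5])
print("MAE:", mean_absolute_error(t_true, t_pred))
print("MSE:", mean_squared_error(t_true, t_pred))
```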
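Chapter 2 uses validation curves to identify bias and variance: uniformly low training and validation scores suggest high bias, while a wide gap between them suggests high variance. A sketch using scikit-learn's validation_curve; the dataset and parameter grid are assumptions:

```python
# Chapter 2: diagnosing bias and variance with a validation curve.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import validation_curve
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
param_range = np.array([1, 3, 5, 9, 15, 31])
train_scores, val_scores = validation_curve(
    KNeighborsClassifier(), X, y,
    param_name="n_neighbors", param_range=param_range, cv=5)

# A large train/validation gap points to variance; uniformly low scores
# point to bias.
for k, tr, va in zip(param_range, train_scores.mean(axis=1),
                     val_scores.mean(axis=1)):
    print(f"k={k:2d}  train={tr:.3f}  validation={va:.3f}")
```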
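Chapter 3 builds a custom hard-voting implementation before turning to scikit-learn's. A sketch of the same idea under my own assumptions (not the book's implementation): each base learner predicts a label, and the ensemble returns the per-sample majority vote.

```python
# Chapter 3: custom hard voting via a per-sample majority vote
# (illustrative sketch, not the book's implementation).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

learners = [LogisticRegression(max_iter=5000),
            KNeighborsClassifier(),
            DecisionTreeClassifier(random_state=1)]
for learner in learners:
    learner.fit(X_train, y_train)

# Rows are learners, columns are samples; vote column by column.
predictions = np.array([learner.predict(X_test) for learner in learners])
majority = np.apply_along_axis(
    lambda votes: np.bincount(votes).argmax(), axis=0, arr=predictions)
print("hard-voting accuracy:", (majority == y_test).mean())
```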
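Chapter 5 covers bootstrapping and bagging: draw samples with replacement, train a base learner on each, and aggregate the predictions. A sketch showing one manual bootstrap sample alongside scikit-learn's BaggingClassifier; the dataset and sizes are assumptions:

```python
# Chapter 5: bootstrap sampling by hand, then bagging with scikit-learn.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

# A bootstrap sample draws row indices with replacement, so roughly 63%
# of the original rows appear in it.
rng = np.random.default_rng(2)
indices = rng.integers(0, len(X_train), size=len(X_train))
X_boot, y_boot = X_train[indices], y_train[indices]
print("unique rows in bootstrap sample: %.0f%%"
      % (100 * len(np.unique(indices)) / len(indices)))

# The library version: 50 trees, each trained on its own bootstrap sample.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=2)
bagging.fit(X_train, y_train)
print("bagging accuracy:", bagging.score(X_test, y_test))
```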
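Finally, Chapters 6 and 7 cover boosting and random forests; scikit-learn ships stock implementations of all three, compared below with illustrative hyperparameters. XGBoost is a separate library whose XGBClassifier and XGBRegressor estimators follow the same fit/predict API.

```python
# Chapters 6-7: AdaBoost, gradient boosting, and random forests side by
# side (hyperparameters are illustrative, not the book's).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=3)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=3),
    "Gradient boosting": GradientBoostingClassifier(random_state=3),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=3),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: {model.score(X_test, y_test):.3f}")
```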