Hands-On Ensemble Learning with R : a Beginner's Guide to Combining the Power of Machine Learning Algorithms Using Ensemble Techniques.
Chapter 8: Ensemble Diagnostics; Technical requirements; What is ensemble diagnostics?; Ensemble diversity; Numeric prediction; Class prediction; Pairwise measure; Disagreement measure; Yule's or Q-statistic; Correlation coefficient measure; Cohen's statistic; Double-fault measure; Interra...
| Classification: | eBook |
|---|---|
| Main author: | |
| Format: | Electronic eBook |
| Language: | English |
| Published: | Birmingham : Packt Publishing Ltd, 2018. |
| Subjects: | |
| Online access: | Full text |
Table of Contents:
- Cover; Copyright; Contributors; Table of Contents; Preface; Chapter 1: Introduction to Ensemble Techniques; Datasets; Hypothyroid; Waveform; German Credit; Iris; Pima Indians Diabetes; US Crime; Overseas visitors; Primary Biliary Cirrhosis; Multishapes; Board Stiffness; Statistical/machine learning models; Logistic regression model; Logistic regression for hypothyroid classification; Neural networks; Neural network for hypothyroid classification; Naïve Bayes classifier; Naïve Bayes for hypothyroid classification; Decision tree; Decision tree for hypothyroid classification.
- Support vector machines; SVM for hypothyroid classification; The right model dilemma!; An ensemble purview; Complementary statistical tests; Permutation test; Chi-square and McNemar test; ROC test; Summary; Chapter 2: Bootstrapping; Technical requirements; The jackknife technique; The jackknife method for mean and variance; Pseudovalues method for survival data; Bootstrap - a statistical method; The standard error of correlation coefficient; The parametric bootstrap; Eigen values; Rule of thumb; The boot package; Bootstrap and testing hypotheses; Bootstrapping regression models.
- Bootstrapping survival models*; Bootstrapping time series models*; Summary; Chapter 3: Bagging; Technical requirements; Classification trees and pruning; Bagging; k-NN classifier; Analyzing waveform data; k-NN bagging; Summary; Chapter 4: Random Forests; Technical requirements; Random Forests; Variable importance; Proximity plots; Random Forest nuances; Comparisons with bagging; Missing data imputation; Clustering with Random Forest; Summary; Chapter 5: The Bare Bones Boosting Algorithms; Technical requirements; The general boosting algorithm; Adaptive boosting; Gradient boosting.
- Building it from scratch; Squared-error loss function; Using the adabag and gbm packages; Variable importance; Comparing bagging, random forests, and boosting; Summary; Chapter 6: Boosting Refinements; Technical requirements; Why does boosting work?; The gbm package; Boosting for count data; Boosting for survival data; The xgboost package; The h2o package; Summary; Chapter 7: The General Ensemble Technique; Technical requirements; Why does ensembling work?; Ensembling by voting; Majority voting; Weighted voting; Ensembling by averaging; Simple averaging; Weight averaging; Stack ensembling.