
Hands-On Gradient Boosting with XGBoost and scikit-learn: Perform accessible machine learning and extreme gradient boosting with Python

This practical XGBoost guide will put your Python and scikit-learn knowledge to work by showing you how to build powerful, fine-tuned XGBoost models with impressive speed and accuracy. This book will help you to apply XGBoost's alternative base learners, use unique transformers for model deployment, discover tips from Kaggle masters, and much more!


Bibliographic Details
Classification: Electronic book
Main author: Wade, Corey
Format: Electronic eBook
Language: English
Published: Packt Publishing, 2020.
Subjects: Machine learning; Python (Computer program language)
Online access: Full text (requires prior registration with an institutional email)

MARC

LEADER 00000cam a2200000 a 4500
001 OR_on1203953302
003 OCoLC
005 20231017213018.0
006 m o d
007 cr |||||||||||
008 201016s2020 enk o 000 0 eng d
040 |a UKAHL  |b eng  |e pn  |c UKAHL  |d YDX  |d N$T  |d OCLCO  |d OCLCF  |d UKMGB  |d EBLCP  |d OCLCO  |d OCLCQ  |d KSU  |d OCLCQ  |d OCLCO 
015 |a GBC0G8283  |2 bnb 
016 7 |a 019991769  |2 Uk 
019 |a 1201298605 
020 |a 1839213809 
020 |a 9781839213809  |q (electronic bk.) 
020 |z 9781839218354  |q (pbk.) 
029 1 |a UKMGB  |b 019991769 
029 1 |a AU@  |b 000068339102 
029 1 |a AU@  |b 000068846288 
035 |a (OCoLC)1203953302  |z (OCoLC)1201298605 
037 |a 9781839213809  |b Packt Publishing 
050 4 |a Q325.5 
082 0 4 |a 006.31  |2 23 
049 |a UAMI 
100 1 |a Wade, Corey. 
245 1 0 |a Hands-On Gradient Boosting with XGBoost and scikit-learn :  |b Perform accessible machine learning and extreme gradient boosting with Python /  |c Corey Wade. 
260 |b Packt Publishing,  |c 2020. 
300 |a 1 online resource 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
520 |a This practical XGBoost guide will put your Python and scikit-learn knowledge to work by showing you how to build powerful, fine-tuned XGBoost models with impressive speed and accuracy. This book will help you to apply XGBoost's alternative base learners, use unique transformers for model deployment, discover tips from Kaggle masters, and much more! 
505 0 |a Cover -- Copyright -- About PACKT -- Contributors -- Table of Contents -- Preface -- Section 1: Bagging and Boosting -- Chapter 1: Machine Learning Landscape -- Previewing XGBoost -- What is machine learning? -- Data wrangling -- Dataset 1 -- Bike rentals -- Understanding the data -- Correcting null values -- Predicting regression -- Predicting bike rentals -- Saving data for future use -- Declaring predictor and target columns -- Understanding regression -- Accessing scikit-learn -- Silencing warnings -- Modeling linear regression -- XGBoost -- XGBRegressor -- Cross-validation 
505 8 |a Predicting classification -- What is classification? -- Dataset 2 -- The census -- Data wrangling -- Logistic regression -- The XGBoost classifier -- Summary -- Chapter 2: Decision Trees in Depth -- Introducing decision trees with XGBoost -- Exploring decision trees -- First decision tree model -- Inside a decision tree -- Contrasting variance and bias -- Tuning decision tree hyperparameters -- Decision Tree regressor -- Hyperparameters in general -- Putting it all together -- Predicting heart disease -- a case study -- Heart Disease dataset -- Decision Tree classifier -- Choosing hyperparameters 
505 8 |a Narrowing the range -- feature_importances_ -- Summary -- Chapter 3: Bagging with Random Forests -- Technical requirements -- Bagging ensembles -- Ensemble methods -- Bootstrap aggregation -- Exploring random forests -- Random forest classifiers -- Random forest regressors -- Random forest hyperparameters -- oob_score -- n_estimators -- warm_start -- bootstrap -- Verbose -- Decision Tree hyperparameters -- Pushing random forest boundaries -- case study -- Preparing the dataset -- n_estimators -- cross_val_score -- Fine-tuning hyperparameters -- Random forest drawbacks -- Summary 
505 8 |a Chapter 4: From Gradient Boosting to XGBoost -- Technical requirements -- From bagging to boosting -- Introducing AdaBoost -- Distinguishing gradient boosting -- How gradient boosting works -- Residuals -- Learning how to build gradient boosting models from scratch -- Building a gradient boosting model in scikit-learn -- Modifying gradient boosting hyperparameters -- learning_rate -- Base learner -- subsample -- RandomizedSearchCV -- XGBoost -- Approaching big data -- gradient boosting versus XGBoost -- Introducing the exoplanet dataset -- Preprocessing the exoplanet dataset 
505 8 |a Building gradient boosting classifiers -- Timing models -- Comparing speed -- Summary -- Section 2: XGBoost -- Chapter 5: XGBoost Unveiled -- Designing XGBoost -- Historical narrative -- Design features -- Analyzing XGBoost parameters -- Learning objective -- Building XGBoost models -- The Iris dataset -- The Diabetes dataset -- Finding the Higgs boson -- case study -- Physics background -- Kaggle competitions -- XGBoost and the Higgs challenge -- Data -- Scoring -- Weights -- The model -- Summary -- Chapter 6: XGBoost Hyperparameters -- Technical requirements -- Preparing data and base models 
590 |a O'Reilly  |b O'Reilly Online Learning: Academic/Public Library Edition 
650 0 |a Machine learning. 
650 0 |a Python (Computer program language) 
650 6 |a Apprentissage automatique. 
650 6 |a Python (Langage de programmation) 
650 7 |a Machine learning  |2 fast 
650 7 |a Python (Computer program language)  |2 fast 
776 0 8 |i Print version :  |z 9781839218354 
856 4 0 |u https://learning.oreilly.com/library/view/~/9781839218354/?ar  |z Full text (requires prior registration with an institutional email) 
938 |a Askews and Holts Library Services  |b ASKH  |n AH37757154 
938 |a ProQuest Ebook Central  |b EBLB  |n EBL6414484 
938 |a EBSCOhost  |b EBSC  |n 2655459 
938 |a YBP Library Services  |b YANK  |n 301634247 
994 |a 92  |b IZTAP