
Machine Learning and Deep Learning Using Python and TensorFlow

This book provides you with an in-depth treatment of some advanced machine learning methods such as random forests, boosting, and neural networks.

Bibliographic Details
Classification: Electronic Book
Main Authors: Kadre, Shailendra (Author), Reddy Konasani, Venkata (Author)
Format: Electronic eBook
Language: English
Published: New York, N.Y. : McGraw-Hill Education, [2021]
Edition: First edition.
Series: McGraw-Hill's AccessEngineering.
Subjects: Machine learning
Online Access: Full text

MARC

LEADER 00000nam a2200000 i 4500
001 MGH_AE9781260462302
003 IN-ChSCO
005 20210416124538.0
006 m|||||||||||||||||
007 cr |n||||||||n
008 210416s2021||||nyu|||||o|||||||||||eng||
010 |z  2020949936 
020 |a 9781260462302 (e-ISBN) 
020 |a 1260462307 (e-ISBN) 
020 |a 9781260462296 (print-ISBN) 
020 |a 1260462293 (print-ISBN) 
035 |a (OCoLC)1245575418 
040 |a IN-ChSCO  |b eng  |e rda 
041 0 |a eng 
050 4 |a QA76.73.P98 
072 7 |a TEC  |x 007000  |2 bisacsh 
082 0 4 |a 005.133  |2 23 
100 1 |a Kadre, Shailendra,  |e author. 
245 1 0 |a Machine Learning and Deep Learning Using Python and TensorFlow /  |c Shailendra Kadre, Venkata Reddy Konasani. 
250 |a First edition. 
264 1 |a New York, N.Y. :  |b McGraw-Hill Education,  |c [2021] 
264 4 |c ©2021 
300 |a 1 online resource (600 pages) :  |b 50 illustrations. 
336 |a text  |2 rdacontent 
337 |a computer  |2 rdamedia 
338 |a online resource  |2 rdacarrier 
490 1 |a McGraw-Hill's AccessEngineering 
504 |a Includes bibliographical references and index. 
505 0 |a Cover -- Title Page -- Copyright Page -- Dedication -- About the Authors -- Contents -- Acknowledgments -- Preface -- Chapter 1. Introduction to Machine Learning and Deep Learning -- 1.1 A Brief History of AI and Machine Learning -- 1.2 Building Blocks of a Machine Learning Project -- 1.3 Machine Learning Algorithms vs. Traditional Computer Programs -- 1.4 How Deep Learning Works -- 1.5 Machine Learning and Deep Learning Applications -- 1.6 The Organization of This Book -- 1.7 Prerequisites–Essential Mathematics -- 1.8 The Terminology You Should Know -- 1.9 Machine Learning–A Wider Outlook Will Certainly Help -- 1.10 Python and Its Potential as the Language of Machine Learning -- 1.11 About TensorFlow -- 1.12 Conclusion -- 1.13 References -- Chapter 2. Basics of Python Programming and Statistics -- 2.1 Introduction to Python -- 2.2 Getting Started with Python Coding -- 2.3 Types of Objects in Python -- 2.4 Python Packages -- 2.5 Conditions and Loops in Python -- 2.6 Data Handling and Pandas Deep Dive -- 2.7 Basic Descriptive Statistics -- 2.8 Data Exploration -- 2.9 Conclusion -- 2.10 Practice Problems -- 2.11 References -- Chapter 3. Regression and Logistic Regression -- 3.1 What Is Regression? -- 3.2 Regression Model Building -- 3.3 R-Squared -- 3.4 Multiple Regression -- 3.5 Multicollinearity in Regression -- 3.6 Individual Impact of the Variables in Regression -- 3.7 Steps Needed in Building a Regression Model -- 3.8 Logistic Regression Model -- 3.9 Logistic Regression Model Building -- 3.10 Accuracy of Logistic Regression Line -- 3.11 Multiple Logistic Regression Line -- 3.12 Multicollinearity in Logistic Regression -- 3.13 Individual Impact of the Variables -- 3.14 Steps in Building a Logistic Regression Model -- 3.15 Linear vs. Logistic Regression Comparison -- 3.16 Conclusion -- 3.17 Practice Problems -- 3.18 Reference -- Chapter 4. Decision Trees -- 4.1 What Are Decision Trees? 
-- 4.2 Splitting Criterion Metrics: Entropy and Information Gain -- 4.3 Decision Tree Algorithm -- 4.4 Case Study: Contact Center Customer Segmentation -- 4.5 The Problem of Overfitting -- 4.6 Pruning of Decision Trees -- 4.7 The Challenge of Underfitting -- 4.8 Binary Search on Pruning Parameters -- 4.9 More Pruning Parameters -- 4.10 Steps in Building a Decision Tree Model -- 4.11 Conclusion -- 4.12 Practice Problems -- Chapter 5. Model Selection and Cross-Validation -- 5.1 Steps in Building a Model -- 5.2 Model Validation Measures: Regression -- 5.3 Case Study: House Sales in King County, Washington -- 5.4 Model Validation Measures: Classification -- 5.5 Bias-Variance Trade-Off -- 5.6 Cross-Validation -- 5.7 Feature Engineering Tips and Tricks -- 5.8 Dealing with Class Imbalance -- 5.9 Conclusion -- 5.10 Practice Problems -- 5.11 References -- Chapter 6. Cluster Analysis -- 6.1 Unsupervised Learning -- 6.2 Distance Measure -- 6.3 K-Means Clustering Algorithm -- 6.4 Building K-Means Clusters -- 6.5 Deciding the Number of Clusters -- 6.6 Conclusion -- 6.7 Practice Problems -- 6.8 References -- Chapter 7. Random Forests and Boosting -- 7.1 Ensemble Models -- 7.2 Bagging -- 7.3 Random Forest -- 7.4 Case Study: Car Accidents Prediction -- 7.5 Boosting -- 7.6 AdaBoosting Algorithm -- 7.7 Gradient Boosting Algorithm -- 7.8 Case Study: Income Prediction from Census Data -- 7.9 Conclusion -- 7.10 Practice Problems -- 7.11 References -- Chapter 8. Artificial Neural Networks -- 8.1 Network Diagram for Logistic Regression -- 8.2 Concept of Decision Boundary -- 8.3 Multiple Decision Boundaries Problem -- 8.4 Multiple Decision Boundaries Solution -- 8.5 Neural Network Intuition -- 8.6 Neural Network Algorithm -- 8.7 The Concept of Gradient Descent -- 8.8 Case Study: Recognizing Handwritten Digits -- 8.9 Deep Neural Networks -- 8.10 Conclusion -- 8.11 Practice Problems -- 8.12 References -- Chapter 9. 
TensorFlow and Keras -- 9.1 Deep Neural Networks -- 9.2 Deep Learning Frameworks -- 9.3 Key Terms in TensorFlow -- 9.4 Model Building with TensorFlow -- 9.5 Keras -- 9.6 Conclusion -- 9.7 References -- Chapter 10. Deep Learning Hyperparameters -- 10.1 Regularization -- 10.2 Dropout Regularization -- 10.3 Early Stopping Method -- 10.4 Loss Functions -- 10.5 Activation Functions -- 10.6 Learning Rate -- 10.7 Optimizers -- 10.8 Conclusion -- Chapter 11. Convolutional Neural Networks -- 11.1 ANNs for Images -- 11.2 Filters -- 11.3 The Convolution Layer -- 11.4 Pooling Layer -- 11.5 CNN Architecture -- 11.6 Case Study: Sign Language Reading from Images -- 11.7 Scheming the Ideal CNN Architecture -- 11.8 Steps in Building a CNN Model -- 11.9 Conclusion -- 11.10 Practice Problems -- 11.11 References -- Chapter 12. Recurrent Neural Networks and Long Short-Term Memory -- 12.1 Cross-Sectional Data vs. Sequential Data -- 12.2 Models for Sequential Data -- 12.3 Case Study: Word Prediction -- 12.4 Recurrent Neural Networks -- 12.5 RNN for Long Sequences -- 12.6 Long Short-Term Memory -- 12.7 Sequence to Sequence Models -- 12.8 Case Study: Language Translation -- 12.9 Conclusion -- 12.10 Practice Problems -- 12.11 References -- Index. 
520 0 |a This book provides you with an in-depth treatment of some advanced machine learning methods such as random forests, boosting, and neural networks. 
530 |a Also available in print edition. 
533 |a Electronic reproduction.  |b New York, N.Y. :  |c McGraw Hill,  |d 2021.  |n Mode of access: World Wide Web.  |n System requirements: Web browser.  |n Access may be restricted to users at subscribing institutions. 
538 |a Mode of access: Internet via World Wide Web. 
546 |a In English. 
588 |a Description based on e-Publication PDF. 
650 0 |a Machine learning. 
655 0 |a Electronic books. 
700 1 |a Reddy Konasani, Venkata,  |e author. 
776 0 |i Print version:   |t Machine Learning and Deep Learning Using Python and TensorFlow.  |b First edition.  |d New York, N.Y. : McGraw-Hill Education, 2021  |w (OCoLC)1245422280 
830 0 |a McGraw-Hill's AccessEngineering. 
856 4 0 |u https://accessengineeringlibrary.uam.elogim.com/content/book/9781260462296  |z Full text