F# for machine learning essentials: get up and running with machine learning with F# in a fun and functional way
Classification: Electronic book (eBook)
Main author:
Other authors:
Format: Electronic eBook
Language: English
Published: Birmingham, England ; Mumbai [India] : Packt Publishing, 2016.
Series: Community experience distilled.
Subjects:
Online access: Full text
Table of Contents:
- Cover
- Copyright
- Credits
- Foreword
- About the Author
- Acknowledgments
- About the Reviewers
- www.PacktPub.com
- Table of Contents
- Preface
- Chapter 1: Introduction to Machine Learning
- Objective
- Getting in touch
- Different areas where machine learning is being used
- Why use F#?
- Supervised machine learning
- Training and test dataset/corpus
- Some motivating real life examples of supervised learning
- Nearest Neighbour algorithm (a.k.a. k-NN algorithm)
- Distance metrics
- Decision tree algorithms
- Unsupervised learning
- Machine learning frameworks
- Machine learning for fun and profit
- Recognizing handwritten digits: your "Hello World" ML program
- How does this work?
- Summary
- Chapter 2: Linear Regression
- Objective
- Different types of linear regression algorithms
- APIs used
- Math.NET Numerics for F# 3.7.0
- Getting Math.NET
- Experimenting with Math.NET
- The basics of matrices and vectors (a short and sweet refresher)
- Creating a vector
- Creating a matrix
- Finding the transpose of a matrix
- Finding the inverse of a matrix
- Trace of a matrix
- QR decomposition of a matrix
- SVD of a matrix
- Linear regression method of least square
- Finding linear regression coefficients using F#
- Finding the linear regression coefficients using Math.NET
- Putting it together with Math.NET and FsPlot
- Multiple linear regression
- Multiple linear regression and variations using Math.NET
- Weighted linear regression
- Plotting the result of multiple linear regression
- Ridge regression
- Multivariate multiple linear regression
- Feature scaling
- Summary
- Chapter 3: Classification Techniques
- Objective
- Different classification algorithms you will learn
- Some interesting things you can do
- Binary classification using k-NN
- How does it work?
- Finding cancerous cells using k-NN: a case study
- Understanding logistic regression
- The sigmoid function chart
- Binary classification using logistic regression (using Accord.NET)
- Multiclass classification using logistic regression
- How does it work?
- Multiclass classification using decision trees
- Obtaining and using WekaSharp
- How does it work?
- Predicting a traffic jam using a decision tree: a case study
- Challenge yourself!
- Summary
- Chapter 4: Information Retrieval
- Objective
- Different IR algorithms you will learn
- What interesting things can you do?
- Information retrieval using tf-idf
- Measures of similarity
- Generating a PDF from a histogram
- Minkowski family
- L1 family
- Intersection family
- Inner Product family
- Fidelity family or squared-chord family
- Squared L2 family
- Shannon's Entropy family
- Similarity of asymmetric binary attributes
- Some example usages of distance metrics
- Finding similar cookies using asymmetric binary similarity measures
- Grouping/clustering color images based on Canberra distance
- Summary
- Chapter 5: Collaborative Filtering
- Objective
- Different classification algorithms you will learn
- Vocabulary of collaborative filtering
- Baseline predictors
- Basis of User-User collaborative filtering
- Implementing basic user-user collaborative filtering using F#
- Code walkthrough
- Variations of gap calculations and similarity measures
- Item-item collaborative filtering
- Top-N recommendations
- Evaluating recommendations
- Prediction accuracy
- Confusion matrix (decision support)
- Ranking accuracy metrics
- Prediction-rating correlation
- Working with real movie review data (Movie Lens)
- Summary
- Chapter 6: Sentiment Analysis
- Objective
- What you will learn
- A baseline algorithm for SA using SentiWordNet lexicons
- Handling negations
- Identifying praise or criticism with sentiment orientation
- Pointwise Mutual Information
- Using SO-PMI to find sentiment analysis
- Summary
- Chapter 7: Anomaly Detection
- Objective
- Different classification algorithms
- Some cool things you will do
- The different types of anomalies
- Detecting point anomalies using IQR (Interquartile Range)
- Detecting point anomalies using Grubb's test
- Grubb's test for multivariate data using Mahalanobis distance
- Code walkthrough
- Chi-squared statistic to determine anomalies
- Detecting anomalies using density estimation
- Strategy to convert a collective anomaly to a point anomaly problem
- Dealing with categorical data in collective anomalies
- Summary
- Index.