
Hands-on mathematics for deep learning : build a solid mathematical foundation for training efficient deep neural networks

The main aim of this book is to make advanced mathematics accessible to readers with a programming background. It equips readers not only with deep learning architectures but also with the mathematics behind them. With this book, you will understand the relevant mathematics that...


Bibliographic Details
Classification: Electronic book
Main author: Dawani, Jay (Author)
Format: Electronic eBook
Language: English
Published: Birmingham : Packt Publishing, 2020.
Subjects:
Online access: Full text (requires prior registration with an institutional email address)
Table of Contents:
  • Intro
  • Title Page
  • Copyright and Credits
  • About Packt
  • Contributors
  • Table of Contents
  • Preface
  • Section 1: Essential Mathematics for Deep Learning
  • Linear Algebra
  • Comparing scalars and vectors
  • Linear equations
  • Solving linear equations in n-dimensions
  • Solving linear equations using elimination
  • Matrix operations
  • Adding matrices
  • Multiplying matrices
  • Inverse matrices
  • Matrix transpose
  • Permutations
  • Vector spaces and subspaces
  • Spaces
  • Subspaces
  • Linear maps
  • Image and kernel
  • Metric space and normed space
  • Inner product space
  • Matrix decompositions
  • Determinant
  • Eigenvalues and eigenvectors
  • Trace
  • Orthogonal matrices
  • Diagonalization and symmetric matrices
  • Singular value decomposition
  • Cholesky decomposition
  • Summary
  • Vector Calculus
  • Single variable calculus
  • Derivatives
  • Sum rule
  • Power rule
  • Trigonometric functions
  • First and second derivatives
  • Product rule
  • Quotient rule
  • Chain rule
  • Antiderivative
  • Integrals
  • The fundamental theorem of calculus
  • Substitution rule
  • Areas between curves
  • Integration by parts
  • Multivariable calculus
  • Partial derivatives
  • Chain rule
  • Integrals
  • Vector calculus
  • Derivatives
  • Vector fields
  • Inverse functions
  • Summary
  • Probability and Statistics
  • Understanding the concepts in probability
  • Classical probability
  • Sampling with or without replacement
  • Multinomial coefficient
  • Stirling's formula
  • Independence
  • Discrete distributions
  • Conditional probability
  • Random variables
  • Variance
  • Multiple random variables
  • Continuous random variables
  • Joint distributions
  • More probability distributions
  • Normal distribution
  • Multivariate normal distribution
  • Bivariate normal distribution
  • Gamma distribution
  • Essential concepts in statistics
  • Estimation
  • Mean squared error
  • Sufficiency
  • Likelihood
  • Confidence intervals
  • Bayesian estimation
  • Hypothesis testing
  • Simple hypotheses
  • Composite hypothesis
  • The multivariate normal theory
  • Linear models
  • Hypothesis testing
  • Summary
  • Optimization
  • Understanding optimization and its different types
  • Constrained optimization
  • Unconstrained optimization
  • Convex optimization
  • Convex sets
  • Affine sets
  • Convex functions
  • Optimization problems
  • Non-convex optimization
  • Exploring the various optimization methods
  • Least squares
  • Lagrange multipliers
  • Newton's method
  • The secant method
  • The quasi-Newton method
  • Game theory
  • Descent methods
  • Gradient descent
  • Stochastic gradient descent
  • Loss functions
  • Gradient descent with momentum
  • Nesterov's accelerated gradient
  • Adaptive gradient descent
  • Simulated annealing
  • Natural evolution
  • Exploring population methods
  • Genetic algorithms
  • Particle swarm optimization
  • Summary
  • Graph Theory
  • Understanding the basic concepts and terminology