Hands-on deep learning algorithms with Python : master deep learning algorithms with extensive math by implementing them using TensorFlow / Sudharsan Ravichandiran.

This book introduces basic-to-advanced deep learning algorithms used in a production environment by AI researchers and principal data scientists; it explains algorithms intuitively, including the underlying math, and shows how to implement them using popular Python-based deep learning libraries such...


Bibliographic Details
Classification: Electronic Book
Main Author: Ravichandiran, Sudharsan (Author)
Format: Electronic eBook
Language: English
Published: Birmingham : Packt Publishing Ltd, 2019.
Subjects:
Online Access: Full text
Table of Contents:
  • Cover; Title Page; Copyright and Credits; Dedication; About Packt; Contributors; Table of Contents; Preface; Section 1: Getting Started with Deep Learning; Chapter 1: Introduction to Deep Learning; What is deep learning?; Biological and artificial neurons; ANN and its layers; Input layer; Hidden layer; Output layer; Exploring activation functions; The sigmoid function; The tanh function; The Rectified Linear Unit function; The leaky ReLU function; The Exponential linear unit function; The Swish function; The softmax function; Forward propagation in ANN; How does ANN learn?
  • Debugging gradient descent with gradient checking; Putting it all together; Building a neural network from scratch; Summary; Questions; Further reading; Chapter 2: Getting to Know TensorFlow; What is TensorFlow?; Understanding computational graphs and sessions; Sessions; Variables, constants, and placeholders; Variables; Constants; Placeholders and feed dictionaries; Introducing TensorBoard; Creating a name scope; Handwritten digit classification using TensorFlow; Importing the required libraries; Loading the dataset; Defining the number of neurons in each layer; Defining placeholders
  • Forward propagation; Computing loss and backpropagation; Computing accuracy; Creating summary; Training the model; Visualizing graphs in TensorBoard; Introducing eager execution; Math operations in TensorFlow; TensorFlow 2.0 and Keras; Bonjour Keras; Defining the model; Defining a sequential model; Defining a functional model; Compiling the model; Training the model; Evaluating the model; MNIST digit classification using TensorFlow 2.0; Should we use Keras or TensorFlow?; Summary; Questions; Further reading; Section 2: Fundamental Deep Learning Algorithms
  • Chapter 3: Gradient Descent and Its Variants; Demystifying gradient descent; Performing gradient descent in regression; Importing the libraries; Preparing the dataset; Defining the loss function; Computing the gradients of the loss function; Updating the model parameters; Gradient descent versus stochastic gradient descent; Momentum-based gradient descent; Gradient descent with momentum; Nesterov accelerated gradient; Adaptive methods of gradient descent; Setting a learning rate adaptively using Adagrad; Doing away with the learning rate using Adadelta
  • Overcoming the limitations of Adagrad using RMSProp; Adaptive moment estimation; Adamax - Adam based on infinity-norm; Adaptive moment estimation with AMSGrad; Nadam - adding NAG to ADAM; Summary; Questions; Further reading; Chapter 4: Generating Song Lyrics Using RNN; Introducing RNNs; The difference between feedforward networks and RNNs; Forward propagation in RNNs; Backpropagating through time; Gradients with respect to the hidden to output weight, V; Gradients with respect to hidden to hidden layer weights, W; Gradients with respect to input to the hidden layer weight, U