LEADER 00000cam a22000007i 4500
001    OR_on1348103143
003    OCoLC
005    20231017213018.0
006    m o d
007    cr cnu---unuuu
008    221018s2022 nyua o 001 0 eng d
040 __ |a ORMDA |b eng |e rda |e pn |c ORMDA |d EBLCP |d OCLCQ |d N$T |d OCLCF |d OCLCO
019 __ |a 1353219282
020 __ |a 9781617298349 |q (electronic bk.)
020 __ |a 1617298344 |q (electronic bk.)
020 __ |z 1617298344
020 __ |a 9781638356738 |q (electronic bk.)
020 __ |a 1638356734 |q (electronic bk.)
029 1_ |a AU@ |b 000072848722
035 __ |a (OCoLC)1348103143 |z (OCoLC)1353219282
037 __ |a 9781617298349 |b O'Reilly Media
050 _4 |a Q325.5
082 04 |a 006.3/1 |2 23/eng/20221018
049 __ |a UAMI
100 1_ |a Ganegedara, Thushan, |e author.
245 10 |a TensorFlow in action / |c Thushan Ganegedara.
264 _1 |a Shelter Island : |b Manning Publications, |c [2022]
300 __ |a 1 online resource (1 volume) : |b illustrations
336 __ |a text |b txt |2 rdacontent
337 __ |a computer |b c |2 rdamedia
338 __ |a online resource |b cr |2 rdacarrier
500 __ |a Includes index.
500 __ |a Covers TensorFlow version 2.9.
520 __ |a In TensorFlow in Action, you'll dig into the newest version of Google's amazing TensorFlow framework as you learn to create incredible deep learning applications. Author Thushan Ganegedara uses quirky stories, practical examples, and behind-the-scenes explanations to demystify concepts otherwise trapped in dense academic papers. As you dive into modern deep learning techniques like transformer and attention models, you'll benefit from the unique insights of a top StackOverflow contributor for deep learning and NLP.
588 __ |a Description based on print version record.
505 0_ |a Intro -- TensorFlow in Action -- Copyright -- dedication -- contents -- front matter -- preface -- acknowledgments -- about this book -- Who should read this book? -- How this book is organized: A roadmap -- About the code -- liveBook discussion forum -- about the author -- about the cover illustration -- Part 1 Foundations of TensorFlow 2 and deep learning -- 1 The amazing world of TensorFlow -- 1.1 What is TensorFlow? -- 1.1.1 An overview of popular components of TensorFlow -- 1.1.2 Building and deploying a machine learning model -- 1.2 GPU vs. CPU -- 1.3 When and when not to use TensorFlow
505 8_ |a 1.3.1 When to use TensorFlow -- 1.3.2 When not to use TensorFlow -- 1.4 What will this book teach you? -- 1.4.1 TensorFlow fundamentals -- 1.4.2 Deep learning algorithms -- 1.4.3 Monitoring and optimization -- 1.5 Who is this book for? -- 1.6 Should we really care about Python and TensorFlow 2? -- Summary -- 2 TensorFlow 2 -- 2.1 First steps with TensorFlow 2 -- 2.1.1 How does TensorFlow operate under the hood? -- 2.2 TensorFlow building blocks -- 2.2.1 Understanding tf.Variable -- 2.2.2 Understanding tf.Tensor -- 2.2.3 Understanding tf.Operation
505 8_ |a 2.3 Neural network-related computations in TensorFlow -- 2.3.1 Matrix multiplication -- 2.3.2 Convolution operation -- 2.3.3 Pooling operation -- Summary -- Answers to exercises -- 3 Keras and data retrieval in TensorFlow 2 -- 3.1 Keras model-building APIs -- 3.1.1 Introducing the data set -- 3.1.2 The Sequential API -- 3.1.3 The functional API -- 3.1.4 The sub-classing API -- 3.2 Retrieving data for TensorFlow/Keras models -- 3.2.1 tf.data API -- 3.2.2 Keras DataGenerators -- 3.2.3 tensorflow-datasets package -- Summary -- Answers to exercises -- 4 Dipping toes in deep learning
505 8_ |a 4.1 Fully connected networks -- 4.1.1 Understanding the data -- 4.1.2 Autoencoder model -- 4.2 Convolutional neural networks -- 4.2.1 Understanding the data -- 4.2.2 Implementing the network -- 4.3 One step at a time: Recurrent neural networks (RNNs) -- 4.3.1 Understanding the data -- 4.3.2 Implementing the model -- 4.3.3 Predicting future CO2 values with the trained model -- Summary -- Answers to exercises -- 5 State-of-the-art in deep learning: Transformers -- 5.1 Representing text as numbers -- 5.2 Understanding the Transformer model -- 5.2.1 The encoder-decoder view of the Transformer
505 8_ |a 5.2.2 Diving deeper -- 5.2.3 Self-attention layer -- 5.2.4 Understanding self-attention using scalars -- 5.2.5 Self-attention as a cooking competition -- 5.2.6 Masked self-attention layers -- 5.2.7 Multi-head attention -- 5.2.8 Fully connected layer -- 5.2.9 Putting everything together -- Summary -- Answers to exercises -- Part 2 Look ma, no hands! Deep networks in the real world -- 6 Teaching machines to see: Image classification with CNNs -- 6.1 Putting the data under the microscope: Exploratory data analysis -- 6.1.1 The folder/file structure -- 6.1.2 Understanding the classes in the data set
590 __ |a O'Reilly |b O'Reilly Online Learning: Academic/Public Library Edition
650 _0 |a Machine learning.
650 _0 |a Artificial intelligence.
650 _2 |a Artificial Intelligence
650 _2 |a Machine Learning
650 _6 |a Apprentissage automatique.
650 _6 |a Intelligence artificielle.
650 _7 |a artificial intelligence. |2 aat
650 _7 |a Artificial intelligence |2 fast
650 _7 |a Machine learning |2 fast
776 08 |i Print version: |a Ganegedara, Thushan. |t Tensorflow 2.0 in action. |d Shelter Island : Manning Publications, 2022 |z 9781617298349 |w (OCoLC)1289279869
856 40 |u https://learning.oreilly.com/library/view/~/9781617298349/?ar |z Full text (requires prior registration with an institutional email address)
938 __ |a ProQuest Ebook Central |b EBLB |n EBL7114298
938 __ |a EBSCOhost |b EBSC |n 3376799
994 __ |a 92 |b IZTAP