How to implement a fully connected network with Keras
Multi-class classification with the Zalando dataset in Keras
Gradient descent variations in practice with a real dataset
Weight initialization
How to compare the complexity of neural networks
How to estimate memory used by neural networks in Keras
Chapter 4: Regularization
Subtopics:
An introduction to regularization
l_p norm
l_2 regularization
Weight decay when using regularization
Dropout
Early Stopping
Chapter 5: Advanced Optimizers
Subtopics:
Exponentially weighted averages
Momentum
RMSProp
Adam
Comparison of optimizers
Chapter 6
Chapter Title: Hyper-Parameter Tuning
Subtopics:
Introduction to Hyper-parameter tuning
Black box optimization
Grid Search
Random Search
Coarse to fine optimization
Sampling on logarithmic scale
Bayesian optimization
Chapter 7
Chapter Title: Convolutional Neural Networks
Subtopics:
Theory of convolution
Pooling and padding
Building blocks of a CNN
Implementation of a CNN with Keras
Chapter 8
Chapter Title: Brief Introduction to Recurrent Neural Networks
Subtopics:
Introduction to recurrent neural networks
Implementation of an RNN with Keras
Chapter 9: Autoencoders
Subtopics:
Feed Forward Autoencoders
Loss function in autoencoders
Reconstruction error
Application of autoencoders: dimensionality reduction
Application of autoencoders: Classification with latent features
Curse of dimensionality
Denoising autoencoders
Autoencoders with CNN
Chapter 10: Metric Analysis
Subtopics:
Human level performance and Bayes error
Bias
Metric analysis diagram
Training set overfitting
How to split your dataset
Unbalanced dataset: what can happen
K-fold cross validation
Manual metric analysis: an example
Chapter 11
Chapter Title: Generative Adversarial Networks (GANs)
Subtopics:
Introduction to GANs
The building blocks of GANs
An example of implementation of GANs in Keras
APPENDIX 1: Introduction to Keras
Subtopics:
Sequential model
Keras Layers
Functional APIs
Specifying loss functions
Putting it all together and training a model
Callback functions
Save and load models
APPENDIX 2: Customizing Keras
Subtopics:
Custom callback functions
Custom training loops
Custom loss functions
APPENDIX 3: Symbols and Abbreviations
Umberto Michelucci is the founder and chief AI scientist of TOELT – Advanced AI LAB LLC. He is an expert in numerical simulation, statistics, data science, and machine learning, with 15 years of practical experience in the fields of data warehousing, data science, and machine learning. His first book, Applied Deep Learning: A Case-Based Approach to Understanding Deep Neural Networks, was published in 2018; his second, Convolutional and Recurrent Neural Networks: Theory and Applications, was published in 2019. He publishes his research regularly and gives lectures on machine learning and statistics at various universities. He holds a PhD in machine learning and is a Google Developer Expert in Machine Learning based in Switzerland.
Understand how neural networks work and learn how to implement them using TensorFlow 2.0 and Keras. This new edition focuses both on the fundamental concepts and on the practical aspects of implementing neural networks and deep learning for your research projects.
This book is designed so that you can focus on the parts you are interested in. You will explore topics such as regularization, optimizers, optimization, metric analysis, and hyper-parameter tuning. In addition, you will learn the fundamental ideas behind autoencoders and generative adversarial networks.
All the code presented in the book is available in the form of Jupyter notebooks, allowing you to try out the examples and extend them in interesting ways. A companion online book provides the complete code for all the examples discussed in the book, along with additional material on TensorFlow and Keras. The notebooks can be opened directly in Google Colab (no local installation needed) or downloaded and run on your own machine.
You will:
• Understand the fundamental concepts of how neural networks work
• Learn the fundamental ideas behind autoencoders and generative adversarial networks
• Be able to try all the examples with complete code examples that you can expand for your own projects
• Have available a complete online companion book with examples and tutorials
This book is for:
Readers with an intermediate understanding of machine learning, linear algebra, calculus, and basic Python programming.