"The author keeps a firm grasp on the subject, going from a detailed description of what hyperparameter tuning is to the effective ways to use it. ... this book would be most useful to scholars and professionals working on machine learning models. Readers looking for implementational assistance with the performance of their models will be the best fit ... ." (Niraj Singh, Computing Reviews, December 2, 2022)
Chapter 1: Hyperparameters
Chapter Goal: To introduce what hyperparameters are and how they can affect model training. The chapter also gives an intuition of how hyperparameters affect common machine learning algorithms and which values to choose, depending on the training dataset; a brief illustrative sketch follows the topic list below.
Sub-Topics:
1. Introduction to hyperparameters
2. Why we need to tune hyperparameters
3. Specific algorithms and their hyperparameters
4. Cheat sheet for choosing hyperparameters of specific algorithms
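As a minimal sketch of the idea behind this chapter (not an example from the book; the dataset and hyperparameter values are assumptions for illustration), varying a single hyperparameter such as a decision tree's max_depth already changes how well the model fits and generalizes:

```python
# Illustrative sketch: how one hyperparameter (max_depth) changes a
# decision tree's training and test accuracy on a small benchmark dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for max_depth in (1, 3, 10, None):
    clf = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
    clf.fit(X_train, y_train)
    print(max_depth, clf.score(X_train, y_train), clf.score(X_test, y_test))
```

Shallow trees underfit while very deep trees tend to overfit, which is exactly the trade-off that hyperparameter tuning tries to navigate.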
Chapter 2: Brute Force Hyperparameter Tuning
Chapter Goal: To understand commonly used classical hyperparameter tuning methods and implement them both from scratch and with the Scikit-Learn library (a brief illustrative sketch follows the topic list below).
Sub-Topics:
1. Hyperparameter tuning
2. Exhaustive hyperparameter tuning methods
3. Grid search
4. Random search
5. Evaluation of models while tuning hyperparameters
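For orientation, here is a minimal sketch of the two exhaustive methods named above using scikit-learn's GridSearchCV and RandomizedSearchCV (the dataset and parameter ranges are illustrative assumptions, not the book's own example):

```python
# Minimal sketch of grid search vs. random search with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}

# Grid search: evaluates every combination in the grid with cross-validation.
grid = GridSearchCV(SVC(), param_grid, cv=5)
grid.fit(X, y)

# Random search: samples a fixed number of combinations from the same space.
rand = RandomizedSearchCV(SVC(), param_grid, n_iter=8, cv=5, random_state=0)
rand.fit(X, y)

print("grid best:", grid.best_params_, grid.best_score_)
print("random best:", rand.best_params_, rand.best_score_)
```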
Chapter Goal: Based on the hypothesis that, given certain properties of a dataset, one can train neural networks on metadata and generate hyperparameters for new datasets. The chapter also summarizes how these newer hyperparameter tuning methods can help AI develop further.
Sub-Topics:
1. Generating Metadata
2. Training HG-cGANs
3. AI and hyperparameter tuning
Tanay is a deep learning engineer and researcher who graduated in 2019 with a Bachelor of Technology from SMVDU, J&K. He is currently working at Curl Hg on SARA, an OCR platform. He is also an advisor to Witooth Dental Services and Technologies. He started his career at MateLabs, working on Mateverse, an AutoML platform. He has worked extensively on hyperparameter optimization and has delivered talks on the topic at conferences including PyData Delhi and PyCon India.
Dive into hyperparameter tuning of machine learning models and focus on what hyperparameters are and how they work. This book discusses different techniques of hyperparameter tuning, from the basics to advanced methods.
This is a step-by-step guide to hyperparameter optimization, starting with what hyperparameters are and how they affect different aspects of machine learning models. It then goes through some basic (brute force) algorithms for hyperparameter optimization. Further, the author addresses the problem of time and memory constraints using distributed optimization methods. Next, you’ll learn about Bayesian optimization for hyperparameter search, which learns from its search history.
The book discusses different frameworks, such as Hyperopt and Optuna, which implement sequential model-based global optimization (SMBO) algorithms. During these discussions, you’ll focus on different aspects, such as the creation of search spaces and distributed optimization with these libraries.
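As a rough illustration of the kind of interface these libraries expose, here is a minimal Optuna sketch (the objective, search space, and trial count are illustrative assumptions, not the book's example); Hyperopt's fmin/hp interface follows a similar pattern:

```python
# Minimal Optuna sketch: the search space is defined inside the objective,
# and the library's default sampler (TPE, an SMBO-style method) proposes trials.
import optuna
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

def objective(trial):
    n_estimators = trial.suggest_int("n_estimators", 10, 200)
    max_depth = trial.suggest_int("max_depth", 2, 16)
    clf = RandomForestClassifier(n_estimators=n_estimators,
                                 max_depth=max_depth, random_state=0)
    return cross_val_score(clf, X, y, cv=3).mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=25)
print(study.best_params, study.best_value)
```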
Hyperparameter Optimization in Machine Learning creates an understanding of how these algorithms work and how you can use them in real-life data science problems. The final chapter summarizes the role of hyperparameter optimization in automated machine learning and ends with a tutorial to create your own AutoML script.
Hyperparameter optimization is a tedious task, so sit back and let these algorithms do your work.
You will:
Discover how changes in hyperparameters affect the model’s performance
Apply different hyperparameter tuning algorithms to data science problems
Work with Bayesian optimization methods to create efficient machine learning and deep learning models
Distribute hyperparameter optimization using a cluster of machines
Approach automated machine learning using hyperparameter optimization