Hyperparameter Tuning with Python: Boost your machine learning model's performance via hyperparameter tuning
by Louis Owen

Table of Contents

Cover
Title Page
Copyright and Credits
Contributors
Table of Contents
Preface

Section 1: The Methods

Chapter 1: Evaluating Machine Learning Models
    Technical requirements
    Understanding the concept of overfitting
    Creating training, validation, and test sets
    Exploring random and stratified splits
    Discovering repeated k-fold cross-validation
    Discovering Leave-One-Out cross-validation
    Discovering LPO cross-validation
    Discovering time-series cross-validation
    Summary
    Further reading

Chapter 2: Introducing Hyperparameter Tuning
    What is hyperparameter tuning?
    Demystifying hyperparameters versus parameters
    Understanding hyperparameter space and distributions
    Summary

Chapter 3: Exploring Exhaustive Search
    Understanding manual search
    Understanding grid search
    Understanding random search
    Summary

Chapter 4: Exploring Bayesian Optimization
    Introducing BO
    Understanding BO GP
    Understanding SMAC
    Understanding TPE
    Understanding Metis
    Summary

Chapter 5: Exploring Heuristic Search
    Understanding simulated annealing
    Understanding genetic algorithms
    Understanding particle swarm optimization
    Understanding Population-Based Training
    Summary

Chapter 6: Exploring Multi-Fidelity Optimization
    Introducing MFO
    Understanding coarse-to-fine search
    Understanding successive halving
    Understanding Hyper Band
    Understanding BOHB
    Summary

Section 2: The Implementation

Chapter 7: Hyperparameter Tuning via Scikit
    Technical requirements
    Introducing Scikit
    Implementing Grid Search
    Implementing Random Search
    Implementing Coarse-to-Fine Search
    Implementing Successive Halving
    Implementing Hyper Band
    Implementing Bayesian Optimization Gaussian Process
    Implementing Bayesian Optimization Random Forest
    Implementing Bayesian Optimization Gradient Boosted Trees
    Summary

Chapter 8: Hyperparameter Tuning via Hyperopt
    Technical requirements
    Introducing Hyperopt
    Implementing Random Search
    Implementing Tree-structured Parzen Estimators
    Implementing Adaptive TPE
    Implementing simulated annealing
    Summary

Chapter 9: Hyperparameter Tuning via Optuna
    Technical requirements
    Introducing Optuna
    Implementing TPE
    Implementing Random Search
    Implementing Grid Search
    Implementing Simulated Annealing
    Implementing Successive Halving
    Implementing Hyperband
    Summary

Chapter 10: Advanced Hyperparameter Tuning with DEAP and Microsoft NNI
    Technical requirements
    Introducing DEAP
    Implementing the Genetic Algorithm
    Implementing Particle Swarm Optimization
    Introducing Microsoft NNI
    Implementing Grid Search
    Implementing Random Search
    Implementing Tree-structured Parzen Estimators
    Implementing Sequential Model Algorithm Configuration
    Implementing Bayesian Optimization Gaussian Process
    Implementing Metis
    Implementing Simulated Annealing
    Implementing Hyper Band
    Implementing Bayesian Optimization Hyper Band
    Implementing Population-Based Training
    Summary

Section 3: Putting Things into Practice

Chapter 11: Understanding the Hyperparameters of Popular Algorithms
    Exploring Random Forest hyperparameters
    Exploring XGBoost hyperparameters
    Exploring LightGBM hyperparameters
    Exploring CatBoost hyperparameters
    Exploring SVM hyperparameters
    Exploring artificial neural network hyperparameters
    Summary

Chapter 12: Introducing Hyperparameter Tuning Decision Map
    Getting familiar with HTDM
    Case study 1 – using HTDM with a CatBoost classifier
    Case study 2 – using HTDM with a conditional hyperparameter space
    Case study 3 – using HTDM with prior knowledge of the hyperparameter values
    Summary

Chapter 13: Tracking Hyperparameter Tuning Experiments
    Technical requirements
    Revisiting the usual practices
    Using a built-in Python dictionary
    Using a configuration file
    Using additional modules
    Exploring Neptune
    Exploring scikit-optimize
    Exploring Optuna
    Exploring Microsoft NNI
    Exploring MLflow
    Summary

Chapter 14: Conclusions and Next Steps
    Revisiting hyperparameter tuning methods and packages
    Revisiting HTDM
    What's next?
    Summary

Index
Other Books You May Enjoy