Hyperparameter Optimization for Machine Learning
Introduction
Introduction (3:29)
Course curriculum (6:37)
Course aim and knowledge requirements (2:24)
Course material (1:45)
Jupyter notebooks
Presentations
Datasets
Set up your computer - required packages
FAQ
Hyperparameter Tuning - Overview
Parameters and Hyperparameters (11:14)
Hyperparameter Optimization (8:52)
Refer a friend program
Performance metrics
Performance Metrics - Introduction (1:17)
Classification Metrics (Optional) (8:08)
Regression Metrics (Optional) (3:41)
Scikit-learn metrics (6:29)
Creating your own metrics (9:05)
Using Scikit-learn metrics (1:56)
How are we doing?
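The metrics lessons above show how to turn any scoring function into something scikit-learn's tools can consume. Below is a minimal sketch of that idea, assuming scikit-learn's make_scorer; the RMSE metric, the diabetes dataset, and the Ridge model are illustrative choices, not part of the course material.

```python
# Minimal sketch: wrapping a custom metric with scikit-learn's make_scorer.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import make_scorer
from sklearn.model_selection import cross_val_score

def rmse(y_true, y_pred):
    """Root mean squared error: lower values are better."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# greater_is_better=False tells scikit-learn to negate the score internally,
# so that higher reported scores still mean better models.
rmse_scorer = make_scorer(rmse, greater_is_better=False)

X, y = load_diabetes(return_X_y=True)
scores = cross_val_score(Ridge(alpha=1.0), X, y, scoring=rmse_scorer, cv=5)
print(scores.mean())
```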
Cross-Validation
Cross-Validation (9:15)
Bias vs Variance (Optional)
Cross-Validation schemes (13:55)
Estimating the model generalization error with CV - Demo (8:35)
Cross-Validation for Hyperparameter Tuning - Demo (7:33)
Special Cross-Validation schemes (7:07)
Group Cross-Validation - Demo (5:03)
Nested Cross-Validation (7:19)
Nested Cross-Validation - Demo (6:43)
How are we doing?
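The cross-validation section ends with nested cross-validation, where an inner loop tunes the hyperparameters and an outer loop estimates the generalization error of the whole tuning procedure. A minimal sketch of that setup, assuming scikit-learn; the dataset, model, and grid are illustrative only.

```python
# Minimal sketch: nested cross-validation with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

X, y = load_breast_cancer(return_X_y=True)

inner_cv = KFold(n_splits=3, shuffle=True, random_state=0)  # tunes C
outer_cv = KFold(n_splits=5, shuffle=True, random_state=0)  # estimates error

search = GridSearchCV(
    LogisticRegression(max_iter=5000),
    param_grid={"C": [0.01, 0.1, 1, 10]},
    cv=inner_cv,
)

# Each outer fold refits the inner search on its training split.
outer_scores = cross_val_score(search, X, y, cv=outer_cv)
print(outer_scores.mean(), outer_scores.std())
```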
Basic Search Algorithms
Basic Search Algorithms - Introduction (5:10)
Manual Search (6:35)
Grid Search (3:21)
Grid Search - Demo (7:50)
Grid Search with different hyperparameter spaces (2:18)
Random Search (7:34)
Random Search with Scikit-learn (5:37)
Random Search with Scikit-Optimize (7:30)
Random Search with Hyperopt (11:06)
More examples
How are we doing?
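To contrast the two main basic search strategies covered above, here is a minimal sketch assuming scikit-learn's GridSearchCV and RandomizedSearchCV; the random forest and the hyperparameter ranges are illustrative, not recommendations from the course.

```python
# Minimal sketch: grid search vs random search with scikit-learn.
from scipy.stats import randint, uniform
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_breast_cancer(return_X_y=True)
model = RandomForestClassifier(random_state=0)

# Grid search: evaluates every combination in a small, discrete grid.
grid = GridSearchCV(
    model,
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 4, 8]},
    cv=3,
    scoring="roc_auc",
).fit(X, y)

# Random search: a fixed budget of draws from distributions.
rand = RandomizedSearchCV(
    model,
    param_distributions={
        "n_estimators": randint(50, 500),
        "min_samples_split": uniform(0.01, 0.3),
    },
    n_iter=20,
    cv=3,
    scoring="roc_auc",
    random_state=0,
).fit(X, y)

print(grid.best_params_, grid.best_score_)
print(rand.best_params_, rand.best_score_)
```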
Bayesian Optimization
Sequential Search (5:49)
Bayesian Optimization (5:10)
Bayesian Inference - Introduction (7:11)
Joint and Conditional Probabilities (7:40)
Bayes Rule (12:02)
Sequential Model-Based Optimization (15:54)
Gaussian Distribution (7:28)
Multivariate Gaussian Distribution (16:22)
Gaussian Process (14:47)
Kernels (6:41)
Acquisition Functions (13:44)
Additional Reading Resources
Scikit-Optimize - 1-Dimension (14:11)
Scikit-Optimize - Manual Search (5:20)
Scikit-Optimize - Automatic Search (4:03)
Scikit-Optimize - Alternative Kernel (3:24)
Scikit-Optimize - Neural Networks (14:17)
Scikit-Optimize - CNN - Search Analysis (6:00)
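The Scikit-Optimize lessons in this section start from a 1-dimensional function. A minimal sketch of that kind of Gaussian-process-based search, assuming scikit-optimize's gp_minimize; the toy objective and the settings shown are illustrative only.

```python
# Minimal sketch: Bayesian optimization of a 1-D function with a Gaussian process.
import numpy as np
from skopt import gp_minimize

def objective(params):
    x = params[0]
    return (x - 2) ** 2 + np.sin(3 * x)  # toy function to minimize

result = gp_minimize(
    objective,
    dimensions=[(-5.0, 5.0)],   # search space for x
    n_calls=25,                 # total evaluations of the objective
    n_initial_points=10,        # random points before fitting the GP
    acq_func="EI",              # expected improvement acquisition function
    random_state=0,
)

print(result.x, result.fun)
```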
Other SMBO Algorithms
Other SMBO Algorithms (4:11)
SMAC (6:14)
SMAC Demo (11:04)
Tree-structured Parzen Estimators - TPE (4:00)
TPE Procedure (8:08)
TPE hyperparameters (4:39)
TPE - why tree-structured? (4:29)
TPE with Hyperopt (6:02)
Discussion: Bayesian Optimization and Basic Search (13:30)
How are we doing?
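TPE, as covered in the lessons above, is exposed through Hyperopt's fmin interface. A minimal sketch assuming hyperopt and scikit-learn; the gradient boosting model and the search space are illustrative only.

```python
# Minimal sketch: TPE search with hyperopt's fmin.
from hyperopt import STATUS_OK, Trials, fmin, hp, tpe
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

space = {
    "learning_rate": hp.loguniform("learning_rate", -5, 0),  # ~0.007 to 1.0
    "max_depth": hp.quniform("max_depth", 1, 6, 1),
}

def objective(params):
    model = GradientBoostingClassifier(
        learning_rate=params["learning_rate"],
        max_depth=int(params["max_depth"]),
        random_state=0,
    )
    score = cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()
    return {"loss": -score, "status": STATUS_OK}  # fmin minimizes the loss

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=30, trials=trials)
print(best)
```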
Multi-fidelity Optimization
Multi-fidelity Optimization (10:39)
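One common multi-fidelity scheme is successive halving, where many configurations are first evaluated cheaply and only the most promising ones graduate to larger budgets. A minimal sketch, assuming scikit-learn's experimental HalvingGridSearchCV; the dataset, model, and grid are illustrative only.

```python
# Minimal sketch: successive halving with scikit-learn's experimental API.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingGridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Candidates start with few resources (here, training samples); only the
# best-performing configurations survive to be re-evaluated with more.
search = HalvingGridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"max_depth": [2, 4, 8, None], "n_estimators": [50, 100, 200]},
    resource="n_samples",
    factor=3,
    cv=3,
    random_state=0,
).fit(X, y)

print(search.best_params_)
```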
Scikit-Optimize
Scikit-Optimize (5:45)
Section content (2:10)
Hyperparameter Distributions (4:37)
Defining the hyperparameter space (2:36)
Defining the objective function (1:59)
Random search (5:12)
Bayesian search with Gaussian processes (5:14)
Bayesian search with Random Forests (2:53)
Bayesian search with GBMs (3:03)
Parallelizing a Bayesian search (2:53)
Bayesian search with Scikit-learn wrapper (4:03)
Changing the kernel of a Gaussian Process (3:24)
Optimizing xgboost
Optimizing Hyperparameters of a CNN (14:17)
Analyzing the CNN search (6:00)
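Scikit-Optimize also offers a scikit-learn-style wrapper, BayesSearchCV, covered in the "Bayesian search with Scikit-learn wrapper" lesson. A minimal sketch of how it is typically used; the estimator and the search space below are illustrative only.

```python
# Minimal sketch: Bayesian search with skopt's scikit-learn wrapper.
from skopt import BayesSearchCV
from skopt.space import Integer, Real
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_breast_cancer(return_X_y=True)

search = BayesSearchCV(
    GradientBoostingClassifier(random_state=0),
    search_spaces={
        "learning_rate": Real(1e-3, 1.0, prior="log-uniform"),
        "max_depth": Integer(1, 6),
        "n_estimators": Integer(50, 500),
    },
    n_iter=30,   # number of hyperparameter combinations to evaluate
    cv=3,
    random_state=0,
).fit(X, y)

print(search.best_params_, search.best_score_)
```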
Hyperopt
Hyperopt (8:05)
Section content (1:50)
Search space configuration and distributions (14:48)
Sampling from nested spaces (4:28)
Search algorithms (7:52)
Evaluating the search (8:34)
Optimizing multiple ML models simultaneously (9:31)
Optimizing Hyperparameters of a CNN
References
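Hyperopt's hp.choice makes it possible to nest hyperparameters under each model, which is what allows optimizing multiple ML models simultaneously. A minimal sketch of such a nested space, assuming hyperopt and scikit-learn; the two models and their ranges are illustrative only.

```python
# Minimal sketch: a nested hyperopt search space over two candidate models.
from hyperopt import fmin, hp, tpe
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# hp.choice picks a branch; each branch carries its own hyperparameters.
space = hp.choice("model", [
    {"type": "logreg", "C": hp.loguniform("C", -4, 2)},
    {"type": "rf", "max_depth": hp.quniform("max_depth", 1, 10, 1)},
])

def objective(params):
    if params["type"] == "logreg":
        model = LogisticRegression(C=params["C"], max_iter=5000)
    else:
        model = RandomForestClassifier(max_depth=int(params["max_depth"]),
                                       random_state=0)
    return -cross_val_score(model, X, y, cv=3).mean()  # minimize negative accuracy

best = fmin(objective, space, algo=tpe.suggest, max_evals=30)
print(best)
```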
Optuna
Optuna (4:58)
Optuna main functions (7:45)
Section content (1:00)
Search algorithms (7:38)
Optimizing multiple ML models simultaneously (7:21)
Optimizing hyperparameters of a CNN (9:52)
Optimizing a CNN - extended (4:48)
Evaluating the search with Optuna's built-in functions (9:41)
References
More examples
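Optuna's define-by-run interface samples hyperparameters inside the objective through the trial object. A minimal sketch assuming optuna and scikit-learn; the model, dataset, and ranges are illustrative only.

```python
# Minimal sketch: an Optuna study with TPE (Optuna's default sampler).
import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    # Hyperparameters are sampled from the trial object at run time.
    params = {
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 1.0, log=True),
        "max_depth": trial.suggest_int("max_depth", 1, 6),
        "n_estimators": trial.suggest_int("n_estimators", 50, 500),
    }
    model = GradientBoostingClassifier(**params, random_state=0)
    return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print(study.best_params, study.best_value)
```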
Moving Forward
Congratulations
Next steps
How did we do?