Feature Selection for Machine Learning
Welcome
Introduction (4:03)
Course curriculum overview (3:33)
Course requirements (3:00)
Course aim (1:44)
Optional: How to approach this course
Course material (2:01)
The code | Jupyter notebooks
Download the data sets
FAQ: Data Science and Python programming
Feature selection
What is feature selection? (6:15)
Feature selection methods | Overview (6:19)
Filter methods (3:33)
Wrapper methods (5:42)
Embedded methods (3:52)
Moving Forward (4:05)
Open-source packages for feature selection (3:00)
Refer a friend program
Filter methods | Basics
Constant, quasi-constant, and duplicated features – Intro (4:02)
Constant features (7:53)
Quasi-constant features (7:07)
Duplicated features (5:23)
Install Feature-engine
Drop constant and quasi-constant features with Feature-engine (4:20)
Drop duplicates with Feature-engine (5:23)
How are we doing?
Filter methods | Correlation
Correlation – Intro (2:41)
Correlation Feature Selection (5:32)
Correlation procedures to select features (3:37)
Correlation | Notebook demo (11:49)
Basic methods + Correlation pipeline
Correlation with Feature-engine (8:01)
Feature Selection Pipeline with Feature-engine (2:19)
Additional reading resources
How are we doing?
🎉 Bonus! An eye-opening movie experience! 🍿
Filter methods | Statistical measures
Statistical methods – Intro (3:25)
Mutual information (6:11)
Mutual information | Demo (4:39)
Chi-square test (16:15)
Chi-square | Demo (5:54)
Chi-square considerations (9:19)
Chi-square | Calculating the expected frequencies (optional) (3:51)
Chi-square quiz
ANOVA (5:54)
ANOVA | Demo (6:10)
Select features based on p-values (10:32)
Basic methods + Correlation + Filter with stats pipeline
Filter methods | Other methods and metrics
Filter Methods with other metrics (3:04)
Univariate model performance metrics (5:52)
Univariate model performance metrics | Demo (4:23)
KDD 2009: Select features by target mean encoding (6:39)
KDD 2009: Select features by target mean encoding | Demo (6:59)
Univariate model performance with Feature-engine (4:54)
Target Mean Encoding Selection with Feature-engine (5:20)
How are we doing?
🔥 Unveiling the Dark Side of Algorithms: A Captivating Book Recommendation!
Wrapper methods
Wrapper methods – Intro (6:39)
MLXtend
Step forward feature selection (3:14)
SFS - MLXtend vs Sklearn (4:06)
Step forward feature selection | MLXtend (6:00)
Step forward feature selection | sklearn
Step backward feature selection (3:13)
Step backward feature selection | MLXtend (5:50)
Step backward feature selection | sklearn
Exhaustive search (2:45)
Exhaustive search | Demo (3:37)
How are we doing?
Embedded methods | Linear models
Regression Coefficients – Intro (4:05)
Selection by Logistic Regression Coefficients (6:41)
Selection by Linear Regression Coefficients (2:44)
Coefficients change with penalty (5:26)
Basic methods + Correlation + Embedded method using coefficients
Embedded methods | Lasso regularisation
Regularisation – Intro (5:39)
Lasso (6:39)
A note on SelectFromModel
Basic filter methods + LASSO pipeline
Embedded methods | Trees
Feature Selection by Tree importance | Intro (6:46)
Feature Selection by Tree importance | Demo (3:40)
Feature Selection by Tree importance | Recursively (5:04)
Feature selection with decision trees | review
Hybrid feature selection methods
Introduction to hybrid methods (1:50)
Feature Shuffling – Intro (2:41)
Shuffling features | Demo (8:41)
Recursive feature elimination – Intro (2:21)
Recursive feature elimination | Demo (5:42)
Recursive feature addition – Intro (2:06)
Recursive feature addition | Demo (2:55)
Feature Shuffling with Feature-engine (5:39)
Recursive feature elimination with Feature-engine (4:53)
Recursive feature addition with Feature-engine (3:22)
Final section | Next steps
Additional reading resources
Congratulations
Next steps
How did we do?