Overfitting is one of the main problems we face when building neural networks. Before jumping into fixes for over- or underfitting, it is important to understand what it means, why it happens, and what problems it causes for our neural networks. In this video, we will see how to implement all the regularization techniques we learned about hands-on. This includes L1/L2 regularization and how to set up its parameters, Dropout regularization, Data Augmentation, and Early Stopping.
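As a taste of what the video covers, here is a minimal NumPy sketch of the core ideas behind three of the techniques. Note this is an illustrative sketch only: the function names and the regularization strength `lam` are made up for this example, and the video's actual hands-on code (in the GitHub repo linked below) presumably uses a deep learning framework instead.

```python
import numpy as np

# L1/L2 regularization: a penalty on weight size is added to the loss,
# scaled by a strength parameter (lam, an illustrative value here).
def l2_penalty(weights, lam=0.01):
    return lam * np.sum(weights ** 2)

# Dropout: randomly zero out activations at the given rate during
# training, scaling the survivors so the expected activation is unchanged.
def dropout(x, rate=0.5, rng=np.random.default_rng(0)):
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

# Early stopping: stop once the validation loss has not improved
# for `patience` consecutive epochs; return the epoch we stop at.
def early_stop_epoch(val_losses, patience=2):
    best, waited = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, waited = loss, 0
        else:
            waited += 1
            if waited > patience:
                return epoch
    return len(val_losses) - 1

w = np.array([0.5, -1.0, 2.0])
print(l2_penalty(w))                                  # small penalty for small weights
print(early_stop_epoch([1.0, 0.8, 0.9, 0.85, 0.95]))  # stops once val loss stalls
```

Data augmentation has no single-function sketch like these, since it depends on the data type (e.g. flipping or rotating images); the video walks through it hands-on.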
Previous lesson: • Regularization with Data Augmentation...
Next lesson: • What is Vanishing/Exploding Gradients...
📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 https://misraturp.gumroad.com/l/fdl
📕 NNs hyperparameters cheat sheet: https://www.soyouwanttobeadatascienti...
👩💻 You can get access to all the code I develop in this course here: https://github.com/misraturp/Deep-lea...
❓To get the most out of the course, don't forget to answer the end of module questions:
https://fishy-dessert-4fc.notion.site...
👉 You can find the answers here:
https://fishy-dessert-4fc.notion.site...
RESOURCES:
🏃♀️ Data Science Kick-starter mini-course: https://www.misraturp.com/courses/dat...
🐼 Pandas cheat sheet: https://misraturp.gumroad.com/l/pandascs
📥 Streamlit template (updated in 2023, now for $5): https://misraturp.gumroad.com/l/stemp
📝 NNs hyperparameters cheat sheet: https://www.misraturp.com/nn-hyperpar...
📙 Fundamentals of Deep Learning in 25 pages: https://misraturp.gumroad.com/l/fdl
COURSES:
👩💻 Hands-on Data Science: Complete your first portfolio project: https://www.misraturp.com/hods
🌎 Website - https://misraturp.com/
🐥 Twitter - / misraturp