When Should You Use L1/L2 Regularization

Published: 01 November 2022
on channel: Mısra Turp

Overfitting is one of the main problems we face when building neural networks. Before jumping into fixes for over- or underfitting, it is important to understand what it means, why it happens, and what problems it causes for our neural networks. In this video, we look into L1 and L2 regularization: how these techniques work and when to use each of them.
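The video's code lives in the linked GitHub repo; as a quick illustration of the idea itself, here is a minimal NumPy sketch (not the course's actual code) of how the L1 and L2 penalty terms are added to a loss, with an assumed penalty strength `lam`:

```python
import numpy as np

def l1_penalty(weights, lam=0.01):
    # L1 adds lam * sum(|w|) to the loss; it tends to push
    # individual weights to exactly zero (sparse networks).
    return lam * np.sum(np.abs(weights))

def l2_penalty(weights, lam=0.01):
    # L2 adds lam * sum(w^2) to the loss; it shrinks all
    # weights smoothly toward zero without zeroing them out.
    return lam * np.sum(weights ** 2)

w = np.array([0.5, -1.0, 2.0])  # toy weight vector
base_loss = 0.3                 # toy data loss

print(base_loss + l1_penalty(w))  # 0.3 + 0.01 * 3.5  = 0.335
print(base_loss + l2_penalty(w))  # 0.3 + 0.01 * 5.25 = 0.3525
```

Because the penalty grows with the weights, minimizing the total loss trades a little training fit for smaller (L2) or sparser (L1) weights, which is what reduces overfitting.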

Previous lesson:    • Why Regularization Lowers Overfitting  
Next lesson:    • What is Dropout Regularization | How ...  

📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 https://misraturp.gumroad.com/l/fdl

📕 NNs hyperparameters cheat sheet: https://www.soyouwanttobeadatascienti...

👩‍💻 You can get access to all the code I develop in this course here: https://github.com/misraturp/Deep-lea...

❓To get the most out of the course, don't forget to answer the end of module questions:
https://fishy-dessert-4fc.notion.site...

👉 You can find the answers here:
https://fishy-dessert-4fc.notion.site...

RESOURCES:
🏃‍♀️ Data Science Kick-starter mini-course: https://www.misraturp.com/courses/dat...
🐼 Pandas cheat sheet: https://misraturp.gumroad.com/l/pandascs
📥 Streamlit template (updated in 2023, now for $5): https://misraturp.gumroad.com/l/stemp
📝 NNs hyperparameters cheat sheet: https://www.misraturp.com/nn-hyperpar...
📙 Fundamentals of Deep Learning in 25 pages: https://misraturp.gumroad.com/l/fdl

COURSES:
👩‍💻 Hands-on Data Science: Complete your first portfolio project: https://www.misraturp.com/hods

🌎 Website - https://misraturp.com/
🐥 Twitter -   / misraturp