In this lesson we'll simplify the binary Log Loss / Cross Entropy error function and break it down into its most basic details.
I'll show you all kinds of illustrated and fun examples and even implement everything with code! 🤩
Before you watch this tutorial 🛑please make sure you watch the first episode in this series🛑 where we discuss the Perceptron model:
https://youtu.be/-KLnurhX-Pg
⭐ Clone the complete code via Wayscript ⭐
https://wayscript.com/script/Vr5xqHfR
⭐ Clone the Perceptron code from the previous episode ⭐
https://wayscript.com/script/VlHE0uL8
🤖 Watch my Introduction to AI & Machine Learning 🤖
https://youtu.be/mMc_PIemSnU
➰ Watch my Python "For" loop tutorial ➰
https://youtu.be/dHANJ4l6fwA
****************************
⏰ TIMESTAMPS ⏰
****************************
00:00 - Perceptron Recap
00:33 - Target vs Prediction
01:16 - Error Function
01:52 - Cross Entropy Loss / Log Loss Data Preprocessing
03:19 - Cross Entropy Loss on a single data entry
04:31 - Cross Entropy Loss Properties
05:19 - Cross Entropy Loss on all data entries
06:17 - Coding Cross Entropy Loss with Python
****************************
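For reference, the loss we code in the last chapter can be sketched like this (a minimal NumPy sketch, not the exact code from the video — the function name and the sample arrays are just illustrative):

```python
import numpy as np

def binary_cross_entropy(y, p, eps=1e-15):
    """Average binary cross entropy over all data entries.

    y : array of targets (0 or 1)
    p : array of predicted probabilities in (0, 1)
    """
    # Clip predictions so log(0) never occurs
    p = np.clip(p, eps, 1 - eps)
    # Negative log-likelihood, averaged over the dataset
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Toy example: confident, mostly-correct predictions give a small loss
y = np.array([1, 0, 1, 1])
p = np.array([0.9, 0.1, 0.8, 0.6])
print(binary_cross_entropy(y, p))
```

Note how the two terms split the work: when the target is 1 only `log(p)` contributes, and when it's 0 only `log(1 - p)` does — exactly the property we walk through at 04:31.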
Sorry guys, no subtitles this time as they would cover substantial parts of the graphics 😥
The beautiful icons used in this video are by:
https://www.flaticon.com
https://www.freepik.com
Thank you so much for watching! ❤