Machine Learning Algorithms: Supervised Learning Tip to Tail
Alberta Machine Intelligence Institute
There are many other courses that teach you some aspects of supervised learning, but this one gives you the big picture. Classification and regression are closely related, but it is easy to get lost in the details.
The course begins with Decision Trees and k-Nearest Neighbors. These techniques are easy to understand intuitively, partly because we have all seen trees and we all have neighbors. But these are just the starter dishes; we want to do something more interesting than simply classifying things.
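To make the k-Nearest Neighbors idea concrete, here is a minimal sketch in plain NumPy; the toy data and the choice of k=3 are my own illustrative assumptions, not from the course.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training example
    dists = np.linalg.norm(X_train - x, axis=1)
    # Labels of the k closest neighbors
    nearest = y_train[np.argsort(dists)[:k]]
    # Majority vote
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]

# Toy data: two small clusters with labels 0 and 1
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.95, 1.0])))  # a point near cluster 1 -> prints 1
```

There is no training step at all: the "model" is just the stored data, which is exactly why kNN is such a gentle starter.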
Line fitting and the L2 loss of linear regression are the first main course served. Fitting amounts to narrowing the hypothesis space down to the one hypothesis, or model, that best predicts unseen data. Following this path, the L1 loss, convexity, and an important iterative method, gradient descent, are introduced.
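The L2-loss line fit and gradient descent can be sketched as follows; the toy line y = 2x + 1, the learning rate, and the iteration count are illustrative assumptions on my part.

```python
import numpy as np

# Toy data generated from y = 2x + 1 (assumed for illustration)
X = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * X + 1

# Closed-form least-squares fit: minimize the L2 loss sum((w*x + b - y)^2)
A = np.column_stack([X, np.ones_like(X)])
w, b = np.linalg.lstsq(A, y, rcond=None)[0]

# The same fit found iteratively by gradient descent on the mean squared error
w_gd, b_gd = 0.0, 0.0
lr = 0.05  # learning rate (assumed)
for _ in range(2000):
    pred = w_gd * X + b_gd
    grad_w = 2 * np.mean((pred - y) * X)  # d/dw of the MSE
    grad_b = 2 * np.mean(pred - y)        # d/db of the MSE
    w_gd -= lr * grad_w
    b_gd -= lr * grad_b

print(round(w, 2), round(b, 2))        # 2.0 1.0
print(round(w_gd, 2), round(b_gd, 2))  # 2.0 1.0
```

Both routes recover the same line; the iterative route matters because, for losses without a closed-form solution, gradient descent is often all you have.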
Obviously, a linear model is too simple to describe the world. Nonlinear features take you to the next level; however, increased model complexity also means more problems to tackle: under-fitting, over-fitting, and the bias-variance tradeoff. From here on, not only loss functions are on the table but also regularizers, and you need to minimize both.
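As one concrete example of minimizing loss plus regularizer, here is a ridge (L2-penalized) regression sketch; the toy data, the penalty strength lam, and the plain-vs-penalized comparison are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
# 5 samples, 4 noisy features: few samples relative to features invites over-fitting
X = rng.normal(size=(5, 4))
y = X[:, 0] + 0.1 * rng.normal(size=5)  # only feature 0 actually matters

lam = 1.0  # regularization strength (hypothetical value)
# Ridge closed form: minimize ||Xw - y||^2 + lam * ||w||^2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(4), X.T @ y)
# Unregularized least-squares fit for comparison
w_plain = np.linalg.lstsq(X, y, rcond=None)[0]

# The penalty shrinks the weights toward zero
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_plain))
```

Tuning lam trades bias for variance: lam = 0 gives the plain fit back, while a very large lam drags every weight toward zero and under-fits.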
“Logistic regression is a classification method.” Think again: it is a fine example of why classification and regression are closely related. Logistic regression can even be implemented as a neural network. Another related classification method, the Support Vector Machine, definitely has its own merits, with a special loss function called the hinge loss. You see, they are all related.
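A minimal sketch of that connection: logistic regression literally regresses a probability and then thresholds it to classify. The toy data, learning rate, and iteration count are assumptions; the hinge loss is computed on the same data just for comparison.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Linearly separable 1-D toy data, labels in {0, 1}
X = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 1.0, 1.0])

# Logistic regression: regress a probability, then threshold at 0.5
w, b = 0.0, 0.0
for _ in range(1000):
    p = sigmoid(w * X + b)
    # Gradient of the mean cross-entropy (log) loss
    w -= 0.1 * np.mean((p - y) * X)
    b -= 0.1 * np.mean(p - y)

probs = sigmoid(w * X + b)
print((probs > 0.5).astype(int))  # predicted classes: [0 0 1 1]

# Hinge loss (used by SVMs) on the same data, labels remapped to {-1, +1}
y_pm = 2 * y - 1
hinge = np.mean(np.maximum(0.0, 1 - y_pm * (w * X + b)))
```

Swap the log loss for the hinge loss (plus an L2 regularizer) and the very same linear model becomes a soft-margin SVM, which is the sense in which they are all related.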
The final module is about assessing regression and classification models. The teaching of measures (e.g. MSE, MAE, accuracy, recall, ROC, etc.), validation, and hyper-parameter tuning comes in just the right place at the right time, since the previous modules have built up exactly the context this module needs.
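A few of those measures are simple enough to compute by hand; the toy predictions and labels below are made up purely for illustration.

```python
import numpy as np

# Regression metrics on toy predictions
y_true = np.array([3.0, 5.0, 2.0])
y_pred = np.array([2.5, 5.0, 4.0])
mse = np.mean((y_true - y_pred) ** 2)   # mean squared error
mae = np.mean(np.abs(y_true - y_pred))  # mean absolute error

# Classification metrics on toy binary labels
t = np.array([1, 0, 1, 1, 0])  # ground truth
p = np.array([1, 0, 0, 1, 1])  # predictions
accuracy = np.mean(t == p)
tp = np.sum((t == 1) & (p == 1))
recall = tp / np.sum(t == 1)   # fraction of actual positives found

print(round(mse, 3), round(mae, 3))          # 1.417 0.833
print(round(accuracy, 2), round(recall, 2))  # 0.6 0.67
```

Note how MSE punishes the single large error (the 2.0 vs 4.0 miss) far more than MAE does, which is exactly the L2-vs-L1 distinction from earlier in the course resurfacing as an evaluation choice.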
This is your all-in-one course for supervised learning. Amazing!
I am Kesler Zhu, thank you for visiting my website. Check out more course reviews at https://KZHU.ai
All of your support will be used for maintenance of this site and more great content. I am humbled and grateful for your generosity. Thank you!
Don't forget to sign up for the newsletter, so you don't miss any chance to learn.
Or share what you've learned with friends!