Learnings from FastAI Lesson 1
In my effort to land a co-op, I managed to set up a brief 30-minute meeting with an AI/ML-focused founder. I really wanted to know what I could do to make myself a strong candidate. One of the first things he mentioned was FastAI, and so started my FastAI learning journey.
So, what did I learn from my first lesson?
a) The Origins of Machine Learning:
Contrary to popular belief, Machine Learning is a relatively old topic. “It goes back to 1943 when Warren McCulloch and Walter Pitts teamed up to develop a mathematical model of a neuron.” (Howard & Gugger, 2018) They were able to “represent a simplified model of a neuron using simple addition and thresholding” (Howard & Gugger, 2018). Frank Rosenblatt, a psychologist from Cornell, “further developed the artificial neuron to give it the ability to learn.” (Howard & Gugger, 2018) His contribution was especially influential: in 1957 he built the Mark 1 Perceptron, the first device based on these Machine Learning principles, and it was successfully able to recognize simple shapes.
b) A New Approach to Learning:
The teaching approach in FastAI is a bit unconventional. Instead of covering the fundamentals of a topic thoroughly and only then having you practice them, like most courses, it focuses on practicing the concepts first and then learning how they work. This approach is based on Harvard professor David Perkins’s book “Making Learning Whole” and on “A Mathematician’s Lament,” a paper by Paul Lockhart, a Columbia math Ph.D. The basic idea is to learn a game by playing it first, and to use those play sessions to teach yourself the game, rather than deep-diving into the intricacies of how the game operates before ever touching it.
c) What is Machine Learning?
Machine Learning is a discipline with various specialized areas, such as Deep Learning, Supervised Learning, and Unsupervised Learning, among others. “Machine learning operates on principles very similar to a normal computer program” (Howard & Gugger, 2018): input -> function -> output. The key difference is that Machine Learning lets computers figure out how to solve a problem themselves rather than relying on explicitly programmed human instructions. As per my rough understanding, the program has weights or parameters, produces a result, and has a function that checks how accurate that result is by comparing it with the labels in the dataset. Then, depending on how well the results match the labels, it updates the parameters and repeats the process until a decent accuracy level is reached.
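To make that loop concrete, here is a minimal sketch in plain PyTorch (not the fastai API from the lesson). The toy relationship y = 3x + 1, the learning rate, and the number of steps are all assumptions for illustration: the point is just that the program measures how far its results are from the labels and then updates its own weights.

```python
import torch

# Toy data: inputs x and labels y that roughly follow y = 3x + 1
# (an assumed relationship, just for illustration)
x = torch.linspace(0, 1, 100).unsqueeze(1)
y = 3 * x + 1 + 0.05 * torch.randn(100, 1)

# The "weights or parameters" the program will figure out itself
params = torch.randn(2, requires_grad=True)

def predict(x, params):
    w, b = params
    return w * x + b          # input -> function -> output

for epoch in range(300):
    preds = predict(x, params)            # results from the current weights
    loss = ((preds - y) ** 2).mean()      # how far the results are from the labels
    loss.backward()                       # figure out how to adjust each weight
    with torch.no_grad():
        params -= 0.5 * params.grad       # update the parameters
        params.grad.zero_()               # reset gradients for the next pass

print(params)  # should end up close to w ≈ 3, b ≈ 1
```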
d) What is Overfitting?
Computers are smart machines, but still machines at the end of the day. A good machine learning practice is to always have at least two sets of data: one for training and the other for validating the results. This is primarily because a model may end up memorizing results instead of actually “generalizing.” We thus need unseen data to check whether the model behaves the way we want on new data. The phenomenon where the model starts memorizing results is known as overfitting. When checking the results of an overfit model, the “validation set accuracy will improve for a while, but eventually, start getting worse.” (Howard & Gugger, 2018)
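As a rough illustration of that split (not code from the lesson), here is a sketch that holds out part of a made-up dataset as a validation set; the `data` list and the 80/20 ratio are assumptions for the example.

```python
import random

# Hypothetical dataset of (input, label) pairs, standing in for real data
data = [(i, 2 * i + 1) for i in range(1000)]

# Shuffle, then hold out 20% as a validation set the model never trains on
random.seed(42)
random.shuffle(data)
split = int(len(data) * 0.8)
train_set = data[:split]
valid_set = data[split:]

# While training, track accuracy on both sets: overfitting shows up when
# training accuracy keeps improving while validation accuracy stalls and
# then starts getting worse.
print(len(train_set), len(valid_set))  # 800 200
```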
e) Defining Test Sets
Knowing that we pretty much always need at least two different data sets, one for training and one for validation, knowing how to define a validation set becomes crucial. It is important to keep in mind that a validation set must be “representative of the new data you will see in the future.” (Howard & Gugger, 2018) For instance, in a time-series problem, choosing a random slice of time for the validation set would not be advisable: you want it to closely match the most recent data, not something from 5 years ago. We also have to keep our own biases in mind: knowing what to expect in the training data, we may build certain assumptions into our model, which can cause inaccurate results.
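For the time-series case, a date-based split might look like the sketch below; the `date` and `sales` columns and the 90-day cutoff are hypothetical, chosen only to show training on the past and validating on the most recent period.

```python
import pandas as pd

# Hypothetical daily time series with a "date" column and a "sales" target
df = pd.DataFrame({
    "date": pd.date_range("2016-01-01", periods=1000, freq="D"),
    "sales": range(1000),
})

# Split by date, not at random: train on the past, validate on the most
# recent 90 days, since that is closest to the data the model will see next.
cutoff = df["date"].max() - pd.Timedelta(days=90)
train_df = df[df["date"] <= cutoff]
valid_df = df[df["date"] > cutoff]

print(train_df["date"].max(), valid_df["date"].min())
```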
In conclusion, each lesson is really in-depth, and I would advise anyone learning through FastAI to take their time and soak it all in. I had to come back to Lesson 1 after starting Lesson 2 because I felt I was missing some context. There were a ton of other learnings from Lesson 1 that I will probably collate into another in-depth article, so stay tuned. Cheers