Hello :) Today is Day 1! My adventure begins!
Today I decided to review some machine learning, so I watched a course taught by Professor Yaser Abu-Mostafa of the California Institute of Technology. The course consists of 18 lectures. They were originally recorded in 2012, but I think they will still be helpful for learning the basic knowledge and concepts of machine learning. I came across them by chance, liked the professor from the first lecture, and kept watching. The lectures are very interesting, so I studied lectures 1 through 11 today, and I think I will finish the rest tomorrow.
Each lecture was filled with different math functions and mathematical explanations, and the lecturer explained everything very well, step by step. Looking back, it all seems complicated in the end, but during the lecture the explanations were very clear.
The content was like below:
- The learning problem
- Can we learn?
- Regression model
- Error and noise
- Training vs testing
- Generalization theory
- The VC dimension
- Regression model (2)
If you look at linear models, you can think of them as an economy car. They get you where you want to go and they don't consume a lot of gas. You may not be very proud of them, but they actually do the job. It is remarkable how often they succeed on their own and are sufficient to get you the learning performance that you want. Try to use them first and see if they achieve what you want.
- an interesting quote from the professor. As I am learning, it might be useful to always start with a simple linear model for whatever I am doing, just to see its performance.
- Neural networks
- Overfitting
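To remember the professor's advice about trying a linear model first, here is a small sketch of what that baseline could look like. The data here is synthetic (my own made-up example, not from the lecture), and I just solve the least-squares problem directly with NumPy:

```python
import numpy as np

# Made-up synthetic data: y is roughly linear in x plus some noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w + 0.5 + 0.1 * rng.standard_normal(100)

# Add a bias column and fit the linear model by least squares.
Xb = np.hstack([X, np.ones((100, 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# In-sample mean squared error: a quick check of baseline performance.
mse = np.mean((Xb @ w - y) ** 2)
print(mse)
```

If this simple baseline already performs well enough, there may be no need for a fancier model; if not, its error at least gives me a number that any more complex model should beat.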
I’m really looking forward to studying the remaining lectures! The pictures above are included in the lecture videos, but rather than just reposting those, I think it is better to take my own notes and post them like this.
That is all for today!
See you tomorrow :)