(Day 25) Using neural nets for time series predictions

Ivan Ivanov · January 26, 2024

Hello :) Today is Day 25!

A quick summary of today:

The series has 8 videos.

image image

Today I managed to watch 4 of them. WOW amazing!

Some of the content is below

Firstly, I learned a bit about forward and backward propagation.

image

Simply put, forward prop is the prediction and loss calculation, and backprop uses calculus (the chain rule) to compute the gradient for every neuron; then an optimizer like SGD or Adam uses those gradients to update the parameters.
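The loop described above can be sketched with a toy single-layer model (a hypothetical example, not from the videos): forward prop computes the prediction and loss, backprop applies the chain rule by hand, and a plain SGD step updates the parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # 4 samples, 3 features
y = rng.normal(size=(4, 1))          # regression targets
w = rng.normal(size=(3, 1)) * 0.1    # weights
b = np.zeros((1, 1))                 # bias
lr = 0.1                             # SGD learning rate

def step(w, b):
    # --- forward prop: prediction and loss calculation ---
    pred = x @ w + b                    # linear neuron
    loss = np.mean((pred - y) ** 2)     # mean squared error
    # --- backprop: chain rule gives the gradient of each parameter ---
    dpred = 2 * (pred - y) / len(x)     # dL/dpred
    dw = x.T @ dpred                    # dL/dw
    db = dpred.sum(axis=0, keepdims=True)  # dL/db
    # --- optimization: one SGD update ---
    return w - lr * dw, b - lr * db, loss

losses = []
for _ in range(50):
    w, b, loss = step(w, b)
    losses.append(loss)

print(losses[0], losses[-1])  # the loss should shrink over the 50 steps
```

In a framework like PyTorch the backprop section is replaced by `loss.backward()`, but the chain-rule math it runs is the same as the manual gradients here.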

Then a simple character-level text prediction model was constructed, similar to the one in DeepLearning.AI’s text generation course.
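The idea of character-level prediction can be shown with a much simpler stand-in than the neural model: a bigram counter over a toy corpus that predicts the most frequent next character (my own minimal sketch, not the course's code).

```python
from collections import Counter, defaultdict

# Toy corpus; a real model would train on much more text.
corpus = "hello world hello there hello world"

# Count which character follows which: counts[prev][next] = frequency.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(ch):
    """Return the most frequent character seen after `ch` in the corpus."""
    if ch not in counts:
        return None
    return counts[ch].most_common(1)[0][0]

print(predict_next("h"))  # 'e' — every 'h' in the corpus is followed by 'e'
```

A neural character model replaces the count table with learned weights, but the task is identical: given the context, output a distribution over the next character.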

The fourth video was about a neural network’s loss function. Andrej said the loss curve should not look like a hockey stick (he said that shape is a bad sign). I think the loss of the model I built yesterday looked like that.

image

So I decided to change the model a bit; Andrej said a good fix is to set the bias directly on the last layer.

image
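My understanding of why this fixes the hockey stick, sketched with hypothetical numbers: if the last layer's logits start out large, the initial cross-entropy loss is inflated and the first few steps just squash it, producing the hockey-stick shape. Scaling the last layer down and setting its bias directly makes the initial loss land near the "uniform guess" value, log(vocab_size).

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size = 27                        # e.g. 26 letters + a separator token
h = rng.normal(size=(32, 64))          # hidden activations for a batch of 32
y = rng.integers(0, vocab_size, 32)    # random target characters

def cross_entropy(logits, y):
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return -np.log(probs[np.arange(len(y)), y]).mean()

# naive init: large random logits -> huge initial loss (hockey-stick start)
w_bad = rng.normal(size=(64, vocab_size))
loss_bad = cross_entropy(h @ w_bad, y)

# fixed init: scale the last layer down and set its bias to zero,
# so the initial predictions are close to uniform
w_good = rng.normal(size=(64, vocab_size)) * 0.01
b_good = np.zeros(vocab_size)
loss_good = cross_entropy(h @ w_good + b_good, y)

print(loss_bad, loss_good, np.log(vocab_size))  # loss_good ≈ log(27) ≈ 3.30
```

With the fixed initialization the training curve starts near log(27) instead of plunging from a huge value, which is the flat-start behavior Andrej recommends.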

Resulting loss graph:

image

And test prediction:

image

It still looks a bit like a hockey stick, but it is definitely better than before. I need to rewatch today’s videos at some point.


That is all for today!

See you tomorrow :)

Original post in Korean