
I am trying to predict the Google stock price using an LSTM model in PyTorch.
However, after training the model and plotting the predicted values against the real values, I see periodic sharp downward spikes.
Here is the notebook I wrote on Kaggle: Notebook in kaggle

To summarize the steps I followed for training:

  1. I decided to predict only one feature of the Google stock price, the Open price.

  2. I split the data into a training set and a test set (80% for training, 20% for testing).

  3. I used MinMaxScaler from sklearn, fitting the scaler on the training data only and then transforming (scaling) both the training and test data.

  4. I used a sequence length of 10.

  5. For the LSTM model, I used input_size = 1, hidden_size = 64, and num_layers = 2.

    self.lstm = nn.LSTM(1, hidden_size, num_layers)
    self.fc = nn.Linear(hidden_size, 1)
    
  6. I used batch_size = 64 and did not shuffle the data (I learned that time series training data should not be shuffled).

  7. I set learning_rate = 0.001 and epochs = 3, used torch.optim.Adam as the optimizer, and nn.MSELoss() as the loss function.

  8. After training, when I plot the real test values (in blue) against the predicted values (in purple), I see periodic sharp downward spikes.

    Here is a comparison plot of the predicted and real values (image in original post).
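The steps above can be sketched end to end. The following is a minimal illustration, not the original notebook: it uses a synthetic price series in place of the Google stock data, and the `StockLSTM` name and the `batch_first=True` flag are my assumptions.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for the "Open" price column (hypothetical data).
prices = np.sin(np.linspace(0, 20, 500)) + np.linspace(0, 2, 500)
prices = prices.reshape(-1, 1).astype(np.float32)

# Step 2-3: 80/20 split, then fit the scaler on the training portion only
# and transform both portions with that same scaler.
split = int(len(prices) * 0.8)
scaler = MinMaxScaler()
train = scaler.fit_transform(prices[:split])
test = scaler.transform(prices[split:])

# Step 4: build (window, next-value) pairs with a sequence length of 10.
def make_sequences(data, seq_len=10):
    xs, ys = [], []
    for i in range(len(data) - seq_len):
        xs.append(data[i:i + seq_len])
        ys.append(data[i + seq_len])
    return torch.tensor(np.array(xs)), torch.tensor(np.array(ys))

X_train, y_train = make_sequences(train)  # shape: (n, 10, 1) and (n, 1)

# Step 5: LSTM with input_size=1, hidden_size=64, num_layers=2.
class StockLSTM(nn.Module):
    def __init__(self, hidden_size=64, num_layers=2):
        super().__init__()
        # batch_first=True so inputs are (batch, seq_len, features)
        self.lstm = nn.LSTM(1, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.fc(out[:, -1, :])  # last time step -> one prediction

model = StockLSTM()
print(model(X_train[:64]).shape)  # one prediction per sequence in the batch
```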

I don't know why this happens. I tried batch_size = 1 and also trained with sequence lengths other than 10 (different window sizes), but I still saw the periodic sharp downward spikes.

I am new to deep learning and only recently learned about LSTMs, so I'm not sure why this happens or how to fix it.

1 Answer


I finally understood the issue.
I increased the number of epochs to 100, and the results are much better.
I had no idea how many epochs I should use.
Here are the results after increasing the epochs to 100 (plot in original answer).
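As a sketch of what this change looks like, here is a minimal training loop that runs for 100 epochs and records the loss each epoch; watching the loss curve is one way to judge how many epochs are enough. The synthetic data and the small `Net` model are hypothetical stand-ins, not the original notebook's code.

```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical stand-in for the scaled price series.
data = np.sin(np.linspace(0, 20, 300)).astype(np.float32)
X = torch.tensor(np.stack([data[i:i + 10] for i in range(280)])).unsqueeze(-1)
y = torch.tensor(data[10:290]).unsqueeze(-1)  # next value after each window

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(1, 16, batch_first=True)
        self.fc = nn.Linear(16, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.fc(out[:, -1])

model = Net()
opt = torch.optim.Adam(model.parameters(), lr=0.001)
loss_fn = nn.MSELoss()

# Train for 100 epochs instead of 3, logging the loss per epoch.
losses = []
for epoch in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(f"epoch 1 loss: {losses[0]:.4f}, epoch 100 loss: {losses[-1]:.4f}")
```

If the loss is still falling steadily at the last epoch, the model is likely undertrained; 3 epochs is rarely enough for an LSTM to fit even a simple series.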
