Cats and Dogs in the Stock Market (III)

Now that we have the data ready, the next step is to feed it into a prototype convolutional neural network to check the feasibility of the approach. Since the predictions may turn out to be ineffective, it is usually better to start with a small, well-known model with very short training times, so that any defects in the approach are discarded early on. If a simple convolutional neural network cannot achieve good results, for example accuracy in the range of 80% on the test data, it will be very difficult for a more complex network to perform well. This way, a lack of significant patterns in the data, or a lack of data itself, can be detected very early.


To complete these first predictions using QuantConnect's research environment, we will use the history of the MMM ticker (a dividend champion) for the past 10 years:

# History to train the model on:
n_years = 10
trading_days_per_year = 220
trading_minutes_per_day = 390
minutes_to_request = int(n_years * trading_days_per_year * trading_minutes_per_day)
days_to_request = int(minutes_to_request / trading_minutes_per_day)
history = qb.History(qb.Securities.Keys, minutes_to_request, Resolution.Minute)
# Daily history for the same period (the variable name is illustrative):
daily_history = qb.History(qb.Securities.Keys, days_to_request, Resolution.Daily)

With these lines of code we obtain two pandas dataframes, one with the minute data for the past 10 years and one with the daily data for a similar period. Note that trading days before market holidays are shorter and do not contain 390 minutes of data, so we will eliminate those truncated days from our analysis; we will also skip them during the back-testing of the strategy, and even during live deployment if the model proves effective. Be sure to check the NYSE operating hours for an overview of holiday and pre-holiday sessions: these constitute important exceptions to be managed by any automated trading system.
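
A minimal sketch of the truncated-day filter (an assumption, not the original code): count the minute bars per date and keep only the dates with the full 390 bars.

# Sketch: drop truncated (pre-holiday) sessions from the minute history.
# Assumes 'history' carries the (symbol, time) MultiIndex returned by qb.History.
bars = history.reset_index()
bars['date'] = bars['time'].dt.date
bar_counts = bars.groupby('date')['time'].size()
full_days = bar_counts[bar_counts == trading_minutes_per_day].index
bars = bars[bars['date'].isin(full_days)]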


At this point the minute data contains 858000 entries:

<class 'pandas.core.frame.DataFrame'>
MultiIndex: 858000 entries, (MMM R735QTJ8XC9X, 2011-08-24 11:31:00) to (MMM R735QTJ8XC9X, 2020-06-05 16:00:00)
dtypes: float64(15)
memory usage: 108.8+ MB

These values can be managed without any problem in the research environment.


Initially this data will be split into training and testing sets manually, sending the last 20% of the days into the testing set. Using sklearn's train/test split could arguably be a better approach here, as the set of data samples does not constitute a time series in itself. However, by taking the latest 20% of days as the test set we make the validation stricter and more informative: the model is tested on the most recent days, so the results are closer to what we can expect once the strategy is deployed.

# Data container: generate a data list holding date, symbol, image
# information and target label; this can then be transformed into a
# numpy array. Note that the "date_symbol" element of the tuple is not
# homogeneous with the rest and is used only for inspection purposes.
import numpy as np

data_list = []
for index in scaled_data.index:
    date_symbol = index
    # The "image" data is transposed to read time along the x-axis and
    # truncated by the last 30 minutes:
    image = scaled_data.loc[index]["Image"].transpose()[:, :-30]
    label = daily_class.loc[index]["Class"]
    data_list.append([date_symbol, image, label])
data_list = np.array(data_list, dtype=object)  # ragged rows need object dtype
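
A sketch of the manual chronological split described above (the variable names are illustrative, not from the original): the first 80% of days train the network and the last 20% validate it.

# Chronological split: the last 20% of days become the test set.
split = int(len(data_list) * 0.8)
X_train = np.stack(list(data_list[:split, 1])).reshape(-1, 15, 360, 1).astype('float32')
y_train = data_list[:split, 2].astype(int)
X_test = np.stack(list(data_list[split:, 1])).reshape(-1, 15, 360, 1).astype('float32')
y_test = data_list[split:, 2].astype(int)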

The results are familiar to us now: a bearish day for MMM, with price peaks in blue coincident with volume highs around minute 100 of trading:



And a bullish day for MMM, with price highs spread across the mid-day trading minutes and some checkered patterns in the bid and ask channels:



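To reproduce these inspection plots, a minimal sketch (matplotlib assumed available; this is not the original plotting code):

# Sketch: display one day-image; rows are the 15 data channels,
# columns the 360 trading minutes kept after truncation.
import matplotlib.pyplot as plt

sample_date, sample_image, sample_label = data_list[0]
plt.figure(figsize=(10, 3))
plt.imshow(sample_image, aspect='auto')
plt.title('{} - class {}'.format(sample_date, sample_label))
plt.xlabel('Trading minute')
plt.ylabel('Data channel')
plt.show()
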
Hopefully our convolutional neural network can tell the cats from the dogs in these 2200 images. In any case, the first, simple neural network will be:


#### Create a simple convolutional neural network with keras:
from keras.models import Sequential
from keras.layers import Dense, Conv2D, Flatten, AveragePooling2D

# A LeNet-5-style model, for example:
model = Sequential()
model.add(Conv2D(filters=64, kernel_size=(3, 3), activation='relu', input_shape=(15, 360, 1)))
model.add(AveragePooling2D())
model.add(Conv2D(filters=32, kernel_size=(3, 3), activation='relu'))
model.add(AveragePooling2D())
model.add(Flatten())
model.add(Dense(units=120, activation='relu'))
model.add(Dense(units=84, activation='relu'))
model.add(Dense(units=2, activation='softmax'))

This is a very basic LeNet, with convolutions and kernel sizes selected for speed rather than high accuracy. Of course, more layers, more filters and other kernel sizes can be experimented with once we have initial network results, although that work may yield only minor improvements.
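
The compilation and training settings are not shown here; a hedged sketch that would reproduce a 10-epoch run like the one below (optimizer, loss and batch size are assumptions, reusing the illustrative X_train/y_train names from the split sketch above):

from keras.utils import to_categorical

# One-hot encode the two classes for the softmax output and
# categorical cross-entropy loss (assumed settings):
y_train_oh = to_categorical(y_train, num_classes=2)
y_test_oh = to_categorical(y_test, num_classes=2)

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train_oh, validation_data=(X_test, y_test_oh),
          epochs=10, batch_size=32)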


Training for 10 epochs, the model seems able to predict the category of the day with an acceptable level of accuracy:


Train on 1744 samples, validate on 435 samples
Epoch 1/10
1744/1744 [==============================] - 42s 24ms/step - loss: 0.2792 - accuracy: 0.8916 - val_loss: 0.3065 - val_accuracy: 0.8782
Epoch 2/10
1744/1744 [==============================] - 43s 25ms/step - loss: 0.2613 - accuracy: 0.8911 - val_loss: 0.3022 - val_accuracy: 0.8621
Epoch 3/10
1744/1744 [==============================] - 44s 25ms/step - loss: 0.2327 - accuracy: 0.9048 - val_loss: 0.3499 - val_accuracy: 0.8805
Epoch 4/10
1744/1744 [==============================] - 43s 24ms/step - loss: 0.2114 - accuracy: 0.9037 - val_loss: 0.3078 - val_accuracy: 0.8805
Epoch 5/10
1744/1744 [==============================] - 41s 24ms/step - loss: 0.2083 - accuracy: 0.9151 - val_loss: 0.4589 - val_accuracy: 0.8759
Epoch 6/10
1744/1744 [==============================] - 44s 25ms/step - loss: 0.2071 - accuracy: 0.9128 - val_loss: 0.3430 - val_accuracy: 0.8759
Epoch 7/10
1744/1744 [==============================] - 42s 24ms/step - loss: 0.1811 - accuracy: 0.9220 - val_loss: 0.4283 - val_accuracy: 0.8529
Epoch 8/10
1744/1744 [==============================] - 45s 26ms/step - loss: 0.1657 - accuracy: 0.9289 - val_loss: 0.3636 - val_accuracy: 0.8736
Epoch 9/10
1744/1744 [==============================] - 43s 24ms/step - loss: 0.1428 - accuracy: 0.9432 - val_loss: 0.4239 - val_accuracy: 0.8644
Epoch 10/10
1744/1744 [==============================] - 44s 25ms/step - loss: 0.1199 - accuracy: 0.9507 - val_loss: 0.4139 - val_accuracy: 0.8805

The model seems a good starting point to test on real market data, first by checking that the directionality of the prediction is correct and then by trying to find a profitable use for these predictions. Take into account that what the prediction tells us, given the first 360 minutes of the day, is whether the closing price will end above or below the opening price; by that point in the session many of these predictions will have very little real value. The valuable predictions will be those that point to a last-leg market reversal that could be exploited for gains. The implementation of a strategy using these predictions will be the subject of our next publication.
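
As a first directionality check, a minimal sketch (again building on the hypothetical names introduced above): compare the predicted class with the realized open-to-close direction on the test days.

# Sketch: fraction of test days where the predicted class matches the
# realized open-to-close direction (0 = bearish, 1 = bullish assumed).
pred_class = model.predict(X_test).argmax(axis=1)
directional_accuracy = (pred_class == y_test).mean()
print('Directional accuracy on test days:', directional_accuracy)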



