Stock Market Predictor

Knowledge Square
Prediction is one of the main reasons we use neural networks. Today we are going to talk about how to predict stock market prices.
For this project, we model the neural network using the Keras library. If you want to know more about Keras, see the official website: https://keras.io. Basically, Keras is a Python deep learning library that implements neural network concepts as ready-to-use building blocks. To install Keras on Windows, open Command Prompt as administrator and run the following command.
pip install keras
If there is an error when installing Keras on your system, go to this Git repository and download it:
https://github.com/fchollet/keras.git
then go to the downloaded directory in the command prompt and install it with the following command:
python setup.py install
After that, your environment is ready for creating a neural model. Create a new Python file and import the following libraries:
from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM
from keras.models import Sequential
import lstm, time  # helper libraries
Initially, we need a data set to train on. For this we use S&P 500 stock market data from https://fred.stlouisfed.org/series/SP500; the data set spans January 2000 to August 2016.
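If you prefer to fetch the series programmatically instead of downloading the CSV by hand, here is a minimal sketch using the pandas-datareader package (this is my own addition, not part of the original workflow, and assumes the series is available for the full date range):

# Sketch: download the S&P 500 series from FRED with pandas-datareader
# (assumes: pip install pandas-datareader)
from pandas_datareader import data as web

df = web.DataReader('SP500', 'fred', '2000-01-01', '2016-08-31')
df = df.dropna()  # FRED leaves gaps on non-trading days
df['SP500'].to_csv('sp500.csv', index=False, header=False)  # one price per line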
After that, you have to create the training model. We use a recurrent neural network because the training data is time series data. Using Keras with the TensorFlow backend, a recurrent neural network is created as follows.
As the first step, we define and create the neural network model. Let's say the model's name is model; the model type has to be Sequential:
model = Sequential()
After that, the layers of the model have to be defined. In a recurrent neural network, the LSTM (Long Short-Term Memory) layer is the one that can manipulate data in a time series:
model.add(LSTM(
    input_dim=1,
    output_dim=50,
    return_sequences=True))
In this LSTM layer, input_dim sets the input dimension (one feature per timestep) and output_dim sets the output dimension of the layer. Assigning return_sequences to True makes the layer return the full output sequence rather than only the last output.
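To see what return_sequences changes, here is a small sketch comparing the two settings; the printed shapes are what I would expect from Keras for this configuration:

from keras.models import Sequential
from keras.layers.recurrent import LSTM

# return_sequences=True: one output vector per timestep
seq_model = Sequential()
seq_model.add(LSTM(input_dim=1, output_dim=50, return_sequences=True))
print(seq_model.output_shape)   # expected: (None, None, 50) = (batch, timesteps, units)

# return_sequences=False: only the last timestep's output
last_model = Sequential()
last_model.add(LSTM(input_dim=1, output_dim=50, return_sequences=False))
print(last_model.output_shape)  # expected: (None, 50) = (batch, units)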
model.add(Dropout(0.2))
This will add a dropout layer to avoid overfitting.
model.add(Dense(output_dim=1))
This adds a regular densely-connected neural network layer.
After that, the neural network has to have an activation step. To add this, insert the following code:
model.add(Activation('linear'))
The activation function used here is the linear function; we have talked about activation functions earlier in the introduction to neural networks blog post (https://amilaco2.blogspot.com/2017/09/neural-network.html).
After that, the model has to be compiled. To start the learning process, we need a loss function and an optimizer for accurate results:
model.compile(loss='mse', optimizer='rmsprop')
The loss function is mean squared error (MSE) and the optimizer is called RMSprop; for further study, refer to the RMSprop lecture notes from the University of Toronto.
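Passing the string 'rmsprop' uses the optimizer with its default settings. If you want to tune it, you can build the optimizer object yourself; the learning rate below is illustrative, not a value from this project:

from keras.optimizers import RMSprop

# Illustrative: configure RMSprop explicitly instead of using the string default
model.compile(loss='mse', optimizer=RMSprop(lr=0.001))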
After that, the model has to be fit: the number of training epochs and the training batch size have to be defined. All of this is done using the fit function in Keras:
model.fit(X_train,y_train,batch_size=512,nb_epoch=1,validation_split=0.05)
  • X_train: input data, as a NumPy array or list of NumPy arrays (if the model has multiple inputs).
  • y_train: labels, as a NumPy array.
  • batch_size: integer. Number of samples per gradient update.
  • nb_epoch: integer. Number of epochs to train the model.
  • validation_split: float (0 < x < 1). Fraction of the data to use as held-out validation data.
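fit also returns a History object, which is handy for checking how the loss evolved during training; a small sketch (the variable name is my own):

# fit returns a History object recording losses per epoch
history = model.fit(X_train, y_train, batch_size=512,
                    nb_epoch=1, validation_split=0.05)
print(history.history['loss'])      # training loss per epoch
print(history.history['val_loss'])  # validation loss per epoch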
After these steps, the modelling of the neural network is done. The training data comes from a CSV file and is fed in through this declaration:
X_train, y_train, X_test, y_test = lstm.load_data('sp500.csv', 50, True)
Now the entire code looks like this:
from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM
from keras.models import Sequential
import lstm, time #helper libraries

#Step 1 Load Data
X_train, y_train, X_test, y_test = lstm.load_data('sp500.csv', 50, True)

#Step 2 Build Model
model = Sequential()

model.add(LSTM(
    input_dim=1,
    output_dim=50,
    return_sequences=True))
model.add(Dropout(0.2))

model.add(LSTM(
    100,
    return_sequences=False))
model.add(Dropout(0.2))

model.add(Dense(
    output_dim=1))
model.add(Activation('linear'))

start = time.time()
model.compile(loss='mse', optimizer='rmsprop')
print('compilation time : ', time.time() - start)

model.fit(
    X_train,
    y_train,
    batch_size=512,
    nb_epoch=1,
    validation_split=0.05)
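Note that this code targets the older Keras 1.x API. On Keras 2.x the deprecated arguments were renamed (output_dim becomes units, nb_epoch becomes epochs, and input_dim is replaced by input_shape), so an equivalent model definition would look roughly like this, assuming a window length of 50 to match the load_data('sp500.csv', 50, True) call:

# Equivalent model on Keras 2.x (argument names updated)
model = Sequential()
model.add(LSTM(50, input_shape=(50, 1), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(1))
model.add(Activation('linear'))
model.compile(loss='mse', optimizer='rmsprop')
model.fit(X_train, y_train, batch_size=512, epochs=1, validation_split=0.05)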
In the lstm Python file there are two functions: one to make predictions and one to load the data. To load the data, use the function below (note that it also needs NumPy imported).
import numpy as np

def load_data(filename, seq_len, normalise_window):
    f = open(filename, 'r').read()
    data = f.split('\n')

    # slide a window of seq_len + 1 values over the series
    sequence_length = seq_len + 1
    result = []
    for index in range(len(data) - sequence_length):
        result.append(data[index: index + sequence_length])

    if normalise_window:
        result = normalise_windows(result)

    result = np.array(result)

    # first 90% of the windows are for training, the rest for testing
    row = round(0.9 * result.shape[0])
    train = result[:int(row), :]
    np.random.shuffle(train)
    x_train = train[:, :-1]  # first seq_len values form the input window
    y_train = train[:, -1]   # the last value is the prediction target
    x_test = result[int(row):, :-1]
    y_test = result[int(row):, -1]

    # reshape to (samples, timesteps, features) as the LSTM expects
    x_train = np.reshape(x_train, (x_train.shape[0], x_train.shape[1], 1))
    x_test = np.reshape(x_test, (x_test.shape[0], x_test.shape[1], 1))
    #print(x_train.shape[0])
    return [x_train, y_train, x_test, y_test]
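load_data calls a normalise_windows helper that is not shown in the post. A minimal sketch, assuming each window is normalised relative to its first value (n_i = p_i / p_0 - 1), which is a common choice for this kind of price data:

def normalise_windows(window_data):
    # Assumed implementation: express every value as the fractional
    # change relative to the first value of its window
    normalised_data = []
    for window in window_data:
        normalised_window = [(float(p) / float(window[0])) - 1 for p in window]
        normalised_data.append(normalised_window)
    return normalised_data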
After that, the prediction is carried out. For this, the code below has to be in the lstm Python file.
def predict_point_by_point(model, data):
    # Predict each timestep given the last sequence of true data,
    # in effect only predicting 1 step ahead each time
    predicted = model.predict(data)
    predicted = np.reshape(predicted, (predicted.size,))
    return predicted
To display the predictions, insert the code snippet below in the main Python file.
predictions = lstm.predict_point_by_point(model, X_test)
print(predictions)
After compiling and running the code I got the following results.
Below is a visualization of the actual and predicted data.
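The figure itself is not reproduced here, but a minimal matplotlib sketch (my own addition) that plots the two series against each other would be:

import matplotlib.pyplot as plt

# Compare predicted vs. actual normalised prices on the test set
plt.plot(y_test, label='Actual')
plt.plot(predictions, label='Predicted')
plt.xlabel('Timestep')
plt.ylabel('Normalised price')
plt.legend()
plt.show()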

Predicted Data set
compilation time :  0.026500225067138672
Train on 3523 samples, validate on 186 samples
Epoch 1/1
3523/3523 [==============================] - 3s - loss: 0.0054 - val_loss: 0.0072
[ 5.85206300e-02 5.80762923e-02 4.60842885e-02 4.57523428e-02
5.80612905e-02 3.80649194e-02 5.60437329e-02 6.55997619e-02
8.10309872e-02 7.95803145e-02 8.83336142e-02 8.96523222e-02
7.91760981e-02 7.25092664e-02 5.56853116e-02 6.45641014e-02
5.36803193e-02 4.69004363e-02 4.77261059e-02 3.52381952e-02
3.56821008e-02 2.85532586e-02 1.59102809e-02 1.45966699e-02
1.55791743e-02 8.07799585e-03 2.54244683e-03 6.33677817e-04
-3.37703899e-03 -4.42016404e-03 -3.55060771e-03 -3.67428828e-03
-3.73386638e-03 -4.49962867e-03 -9.94808320e-03 -9.13602114e-03
-1.15186553e-02 -1.67963076e-02 -1.93202998e-02 -1.75452735e-02
-1.93856936e-02 -1.57473329e-02 -7.66137894e-03 -1.20956777e-02
-1.36141768e-02 -1.00281471e-02 -8.98396783e-03 8.14348692e-04
3.79954465e-03 2.26022396e-02 2.08664779e-02 3.93857770e-02
4.79650386e-02 5.83524145e-02 4.05373052e-02 1.89058986e-02
1.56382266e-02 1.27549414e-02 1.13293109e-02 1.12366378e-02
7.07422616e-03 4.59136209e-03 7.39267189e-03 1.53517192e-02
1.36143398e-02 2.99531706e-02 3.74512561e-02 2.52813119e-02
7.65680568e-03 1.62337422e-02 2.47933287e-02 2.78014112e-02
3.36666144e-02 4.25905064e-02 2.92374399e-02 2.73592789e-02
2.22185254e-02 6.76064845e-03 1.16347577e-02 8.71812552e-03
2.17021424e-02 3.50503363e-02 2.60843430e-02 3.95039357e-02
2.76486613e-02 1.45830316e-02 1.97967160e-02 1.08113280e-02
1.51207438e-02 2.01653559e-02 1.06489826e-02 1.16093224e-02
3.26074567e-03 3.37306294e-04 -1.74907851e-04 1.09016069e-03
2.77880486e-03 -2.72525940e-03 -2.03110417e-03 -4.52482793e-03
-3.91075667e-03 -2.87036737e-03 -3.39614693e-04 -6.57092780e-03
-2.33903993e-03 1.77314575e-03 6.09506620e-04 1.48530109e-02
1.15610352e-02 2.91399341e-02 3.19958143e-02 2.08242554e-02
2.79197972e-02 1.56756267e-02 1.96898598e-02 8.48825369e-03
1.36623001e-02 5.06879110e-03 6.76918216e-03 1.26717715e-02
2.67133154e-02 2.84648389e-02 2.52280980e-02 1.20757753e-02
1.95195619e-02 2.23723315e-02 1.79926958e-02 1.06305992e-02
1.18402774e-02 8.54633935e-03 3.86160240e-03 -1.24662684e-03
3.54646659e-03 2.56137364e-03 -1.73314649e-03 -3.05957859e-04
1.13406936e-02 2.34414660e-03 3.19496728e-03 -2.76085781e-03
-6.28816150e-03 -9.88355558e-03 -7.22114090e-03 -1.14918714e-02
-9.57921613e-03 -1.27187010e-03 -1.31206298e-02 -1.65861044e-02
-4.95716464e-03 5.39505854e-05 -2.40000803e-03 -1.37382764e-02
-6.84996415e-03 -2.02878844e-03 5.98398037e-06 -9.19318479e-03
-9.38253384e-03 -1.22446241e-02 -1.16759166e-02 -1.08230682e-02
-1.31203439e-02 -1.09775336e-02 -9.96864750e-04 -1.00363279e-02
-9.07459110e-03 -3.37548787e-03 -5.86470123e-03 -5.39659988e-03
-8.01997911e-03 -1.54497917e-04 6.90601824e-04 6.66039856e-03
6.02910807e-03 -5.89884445e-03 -8.25538579e-03 -3.18671390e-03
-2.38256948e-03 -1.38638103e-02 -2.29315013e-02 -3.96574363e-02
-4.09728996e-02 -5.25074936e-02 -5.84387220e-02 -5.62115014e-02
-5.76485321e-02 -6.12186901e-02 -4.41707782e-02 -4.94348556e-02
-5.82008921e-02 -5.94116971e-02 -5.67227714e-02 -6.29391372e-02
-4.65299524e-02 -4.77623865e-02 -5.83724119e-02 -6.76979795e-02
-7.10541308e-02 -6.99431375e-02 -7.75736645e-02 -7.91585669e-02
-8.12295675e-02 -7.93554857e-02 -7.95655623e-02 -7.66119957e-02
-6.83586895e-02 -6.38552979e-02 -7.55809471e-02 -8.15170556e-02
-7.96390101e-02 -7.47972205e-02 -6.89146817e-02 -6.32879809e-02
-6.28640950e-02 -5.21514378e-02 -4.60681729e-02 -5.48939146e-02
-4.25300039e-02 -4.05350961e-02 -3.65146510e-02 -3.73264588e-02
-3.92325185e-02 -3.35991420e-02 -2.23848205e-02 1.59885804e-03
3.63612585e-02 7.86460117e-02 9.49523523e-02 6.10525273e-02
4.06132340e-02 4.26813215e-02 5.30785806e-02 8.39086920e-02
6.74437732e-02 6.66984096e-02 8.12155679e-02 5.59004620e-02
6.80622533e-02 6.14463054e-02 5.58639094e-02 5.90972118e-02
4.62020449e-02 3.76756489e-02 4.04753424e-02 5.69567271e-02
5.30761741e-02 6.59405068e-02 6.86741844e-02 7.23501220e-02
7.30683729e-02 9.84467566e-02 9.72130969e-02 7.84919485e-02
7.58437291e-02 6.06382415e-02 4.09985334e-02 4.24704589e-02
3.27935852e-02 2.25617290e-02 2.01162845e-02 1.70091446e-02
2.19881795e-02 2.53102425e-02 1.00110369e-02 5.18528512e-03
5.10945311e-03 7.07349181e-03 1.34430639e-02 -2.38931878e-03
-1.32590299e-02 -1.22338440e-02 -1.16434805e-02 -2.62566172e-02
-3.00389249e-02 -3.01891975e-02 -4.71237116e-02 -5.54527938e-02
-5.79890274e-02 -6.29814938e-02 -6.88634366e-02 -6.55703396e-02
-7.25508258e-02 -7.43920580e-02 -6.50478527e-02 -5.76698408e-02
-7.45579302e-02 -7.49447942e-02 -9.06181335e-02 -8.94655958e-02
-9.25291181e-02 -9.06127468e-02 -9.11563188e-02 -9.10307467e-02
-9.22842398e-02 -8.92632902e-02 -1.01178080e-01 -9.31360498e-02
-8.14918652e-02 -1.01602532e-01 -9.52833369e-02 -8.86486322e-02
-7.99728706e-02 -8.01991001e-02 -5.94246797e-02 -6.13645501e-02
-6.85663968e-02 -7.93857202e-02 -6.21280409e-02 -4.17139344e-02
-4.55273204e-02 -5.00627346e-02 -5.78384325e-02 -5.25601991e-02
-4.70313355e-02 -5.40368706e-02 -4.41580005e-02 -3.21475752e-02
-1.44059910e-02 -1.35666216e-02 2.06573354e-03 2.83502638e-02
4.19557914e-02 4.40768078e-02 3.92342620e-02 6.62740022e-02
5.21989465e-02 7.52765164e-02 7.65144229e-02 8.95897821e-02
8.62886906e-02 6.84933737e-02 8.52507278e-02 7.29163960e-02
8.45021531e-02 7.98174813e-02 5.63019998e-02 5.69224618e-02
7.57213533e-02 7.14411214e-02 7.07527623e-02 8.99449214e-02
1.05029181e-01 1.07247449e-01 1.09095953e-01 1.22507848e-01
1.05588041e-01 9.11810026e-02 7.65597150e-02 8.20273608e-02
8.25656205e-02 6.88135773e-02 8.08828622e-02 7.60241374e-02
6.40615895e-02 6.46933988e-02 7.13570938e-02 4.71040308e-02
4.21246886e-02 3.78652103e-02 3.37693729e-02 3.21275331e-02
4.24417742e-02 3.66411619e-02 3.55320759e-02 1.85960811e-02
1.89165752e-02 2.00960040e-02 1.44908484e-02 8.44939891e-03
5.06964372e-03 5.43930056e-03 7.92409386e-03 1.60334464e-02
1.83265675e-02 1.97661128e-02 1.31838024e-02 1.09648686e-02
1.49389599e-02 1.05031673e-02 1.48754297e-02 2.55046822e-02
1.53587703e-02 2.68027242e-02 2.33613495e-02 2.51732673e-02
1.48678813e-02 4.27815039e-03 3.64274159e-03 3.92845925e-03
-3.88749829e-03 -8.89603142e-03 -1.16800368e-02 -8.15896317e-03
-9.08250455e-03 -7.60718621e-03 -9.19696689e-03 -1.01451604e-02
1.18799624e-04 6.87137200e-03 1.66873925e-03 1.30385570e-02
2.21063215e-02 2.58232169e-02 2.63174735e-02 2.91121583e-02
2.04436984e-02 3.29892002e-02 3.61012667e-02 4.70443852e-02
3.98504697e-02 5.10278158e-02 5.24579510e-02 5.74467443e-02]
Actual Data set
[ 0.04581839  0.02885663  0.01104222  0.00403093  0.04019949  0.0468995
0.07387159 0.0904503 0.11064464 0.10873938 0.12149329 0.12229705
0.10260451 0.08134931 0.06022826 0.04850267 0.02654783 0.03121281
0.05123794 0.03010504 0.02318517 0.01422806 -0.00335969 -0.01245906
0.00363802 -0.00050401 0.00044803 0.01536963 0.00665276 0.00853567
-0.00426834 -0.01822654 -0.00910372 -0.02269608 -0.01508435 0.00063945
-0.00547558 -0.00047491 -0.00673619 -0.00981603 -0.00204551 0.00046914
0.01706407 0.01472983 0.01253899 0.01339821 0.01063899 0.02426333
0.02419621 0.04409368 0.0385835 0.05414194 0.05773435 0.07332436
0.04714118 0.01809602 0.01467662 -0.00350247 -0.00131593 -0.01811817
-0.02323378 -0.01177675 -0.01295464 0.01082619 0.00781269 0.03905811
0.04327355 0.0405746 0.02050299 0.02283339 0.01616659 0.01637142
0.02473561 0.04695707 0.02400187 0.0183629 0.01714464 0.00846774
0.0119455 0.01206064 0.03036628 0.04989612 0.0352158 0.05055168
0.04244744 0.02680934 0.01943166 0.01836583 0.02034575 0.02991097
0.02143481 0.02376563 0.00978699 0.00847406 0.0030995 -0.00674861
0.00516822 0.00198547 -0.00957498 -0.01670074 -0.01223359 0.00253944
0.00039443 -0.00862844 -0.00441225 0.01075518 0.01032343 0.02797328
0.02327568 0.03996266 0.04439678 0.02909563 0.02473948 0.02032012
0.02242224 0.00375799 0.01075015 0.00071154 0.00458566 0.00207511
0.01541934 0.01124924 0.00928175 0.0090881 0.01981257 0.01671133
0.00845201 0.00753137 0.01161177 0.01889624 0.00899495 0.00989032
0.01518331 0.0060787 -0.00205059 -0.00166272 -0.01131091 -0.01775366
-0.00947419 -0.01479152 -0.02090458 -0.01716486 -0.02951267 -0.03000338
-0.01434843 0.00675618 0.00031301 -0.0033531 0.01666942 0.02234935
0.0192912 0.00146962 0.00418928 0.00144336 -0.0089732 -0.025204
-0.01388777 -0.00968903 -0.00902337 -0.01035351 -0.01538376 -0.01539937
-0.00207198 -0.01879929 -0.02037918 -0.00152319 -0.01309822 -0.01116328
-0.01451237 -0.00205171 0.0045918 0.00848365 -0.0002595 -0.03299923
-0.06542401 -0.09593581 -0.10401882 -0.07431225 -0.05369347 -0.06240218
-0.06531308 -0.0984526 -0.082544 -0.07467114 -0.08613862 -0.06285062
-0.0561808 -0.05371505 -0.05601654 -0.05958744 -0.04382821 -0.04133396
-0.02759596 -0.04547339 -0.05280222 -0.07470952 -0.0806989 -0.0831166
-0.09083038 -0.11514402 -0.11473587 -0.09398782 -0.09002671 -0.07173129
-0.04452665 -0.04242511 -0.04654009 -0.04512063 -0.04445534 -0.04105831
-0.04497059 -0.04733155 -0.03618376 -0.0242134 -0.02113528 -0.03488766
-0.03125141 -0.01607825 -0.00395509 -0.00973451 -0.01738459 -0.00313308
0.00471233 0.02143218 0.0675634 0.11439834 0.12566868 0.08215362
0.05611619 0.04510606 0.05554255 0.08420201 0.04982912 0.03685558
0.06869071 0.04114425 0.07288214 0.06605061 0.06533228 0.06838608
0.05613998 0.04688998 0.05020106 0.06250154 0.06896898 0.07040058
0.05718093 0.08252078 0.07545544 0.09662183 0.08679523 0.06885306
0.04602824 0.03616962 0.02836365 0.04704737 0.02307815 -0.00391372
0.00310687 0.01066193 0.03024425 0.0334714 0.01612761 0.0222566
0.01460425 0.00648519 -0.0031105 -0.01744208 -0.04090784 -0.06184396
-0.06963578 -0.07973787 -0.07213992 -0.09093186 -0.08659969 -0.10875968
-0.10511299 -0.11457524 -0.10966557 -0.08259487 -0.09830333 -0.08258795
-0.0796786 -0.06410158 -0.05501193 -0.05416395 -0.08665376 -0.08106223
-0.08315263 -0.09898448 -0.11282153 -0.11329577 -0.11398927 -0.1208079
-0.11312017 -0.08844875 -0.05991362 -0.08311939 -0.07668978 -0.05722555
-0.06170577 -0.05965702 -0.03014855 -0.03654406 -0.05440908 -0.04569073
-0.02715135 -0.0060582 -0.0104693 -0.01824939 -0.04119093 -0.03480365
-0.03254561 -0.0270262 -0.02118878 -0.01370387 0.00723418 0.01184107
0.02980519 0.05584411 0.0664766 0.05876263 0.0501681 0.07764459
0.06929299 0.09765307 0.09483187 0.11479946 0.10547937 0.07251037
0.10099727 0.07264018 0.08744259 0.07850066 0.0626108 0.07375549
0.09445463 0.08794631 0.09339327 0.11741709 0.13432319 0.12918083
0.12944828 0.14144274 0.12168723 0.10528173 0.07732436 0.07689425
0.08533299 0.06058603 0.06758555 0.06261262 0.05402467 0.0567952
0.07874834 0.04352616 0.03909495 0.02669307 0.03333513 0.02270499
0.03454321 0.02552709 0.03153954 0.01278322 0.02793569 0.03701021
0.03101791 0.02865352 0.02311199 0.02326476 0.02705628 0.03064743
0.03608651 0.03685714 0.03119698 0.0249667 0.01763819 0.00302976
0.00444802 0.01287421 0.00548231 0.01435418 0.01741064 0.02297265
0.0115098 0.01483858 -0.02178338 -0.03853933 -0.02781312 -0.01429457
-0.00168369 0.00548414 -0.00144868 0.00571894 0.00296407 0.01658593
0.02955466 0.04204708 0.03411117 0.04864851 0.05393145 0.05669478
0.05183903 0.05553536 0.03875476 0.05355883 0.0505641 0.05988925
0.04834863 0.06000855 0.06151995 0.06411641]
