I want to build a neural net model where, say, 100 rows of data are divided into 5 stacks of 20 rows each. Rather than iterating through all of it at once, I want to train the network on the first 20 rows (the first stack), save the model (weights etc.), then pass the next stack (the next 20 rows, i.e. rows 21-40) into the updated model (i.e. with the weights carried over from the previous round), and so on. Can someone tell me what this type of neural network is called? I just tried my first neural net yesterday, where I iterated over all the data in batches (which I believe happens within one epoch rather than across multiple).

Below is the neural net I made in Python using Keras (TensorFlow backend). Can someone suggest edits to make this model fit my requirement?

# Create first network with Keras
from keras.models import Sequential
from keras.layers import Dense
import numpy
import pandas as pd
# from sklearn.cross_validation import train_test_split
from sklearn.model_selection import train_test_split

# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)

# load dataset
raw_data = pd.read_excel('Data.xlsx')  # read_excel takes no 'sep' argument
df = raw_data.iloc[:,0:2]

df = pd.get_dummies(df)
rows,cols = df.shape
output_dim = 7 # No. of Output Dimensions/Categories

#Splitting Data in Training & Testing
X_train,X_test,y_train,y_test = train_test_split(df.iloc[:,0:cols-output_dim],df.iloc[:,cols-output_dim:cols],test_size=0.2,random_state=seed)

X = X_train.values       # .as_matrix() is deprecated/removed; use .values
X_test = X_test.values
Y = y_train.values
Y_test = y_test.values


# create model
model = Sequential()
model.add(Dense(X.shape[1], input_dim=X.shape[1], activation='relu')) #Input Layer
model.add(Dense(X.shape[1], activation='relu')) #Hidden Layer
model.add(Dense(X.shape[1], activation='relu')) #Hidden Layer
model.add(Dense(X.shape[1], activation='relu')) #Hidden Layer
model.add(Dense(output_dim, activation='softmax')) #Output Layer

# Compile model
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])  # categorical, not binary, for a multi-class softmax output

# Fit the model
model.fit(X, Y, epochs=10, validation_data=(X_test, Y_test), batch_size=83, verbose=1)  # 'nb_epoch' was renamed to 'epochs'

# evaluate the model
loss, accuracy = model.evaluate(X_test, Y_test)  # X1/Y1 were undefined
print("\nValidation Data [Loss: %.2f, Accuracy: %.2f%%]" % (loss, accuracy*100))
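To make the intent concrete, the stack-by-stack loop described above could be sketched as follows. This is illustrative only: the toy data, layer sizes, and checkpoint file names are placeholders, and the modern `tensorflow.keras` imports are used in place of the standalone-Keras imports in the question.

```python
# Sketch: train on 5 stacks of 20 rows each, saving weights after each stack.
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

X = np.random.rand(100, 3)                  # toy data: 100 rows, 3 features
Y = np.random.randint(0, 2, size=(100, 1))  # toy binary labels

model = Sequential([
    Input(shape=(3,)),
    Dense(4, activation='relu'),
    Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam')

for i in range(5):  # 5 stacks of 20 rows each
    X_stack = X[i * 20:(i + 1) * 20]
    Y_stack = Y[i * 20:(i + 1) * 20]
    # Run all epochs on this stack; the weights carry over to the next stack.
    model.fit(X_stack, Y_stack, epochs=10, batch_size=20, verbose=0)
    model.save_weights('stack_{}.weights.h5'.format(i))  # checkpoint per stack
```

Because the same `model` object is reused across iterations, each stack continues training from the weights left by the previous one.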

1 Answer

It sounds like you want to train on your data in minibatches of size 20, and save the model after each minibatch. You don't need to change the shape of your input data for this - a matrix of shape (nb_datapoints, nb_features) works. Make sure to specify batch_size=20 when you call model.fit().

In order to save your model after each minibatch, look into Keras callbacks. You'd need to write your own custom callback, but you can model it after the existing ModelCheckpoint callback - that saves the model after each epoch, so it should be relatively simple to customize it for your needs.
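A minimal sketch of such a custom callback (the class name `SaveEveryBatch`, the file-name pattern, and the toy data are my own illustrations, not from Keras; the code targets the `tensorflow.keras` API rather than the older standalone-Keras imports in the question):

```python
# Sketch: a custom callback that saves the model's weights after every
# training minibatch, modeled loosely on ModelCheckpoint.
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import Callback

class SaveEveryBatch(Callback):
    """Save the model's weights at the end of each training batch."""
    def __init__(self, path_pattern):
        super().__init__()
        self.path_pattern = path_pattern  # e.g. 'weights_batch_{:04d}.weights.h5'
        self.batches_seen = 0

    def on_train_batch_end(self, batch, logs=None):
        self.batches_seen += 1
        self.model.save_weights(self.path_pattern.format(self.batches_seen))

# Toy data: 100 rows, so batch_size=20 gives 5 minibatches per epoch.
X = np.random.rand(100, 4)
Y = np.random.randint(0, 2, size=(100, 1))

model = Sequential([
    Input(shape=(4,)),
    Dense(8, activation='relu'),
    Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam')

saver = SaveEveryBatch('weights_batch_{:04d}.weights.h5')
model.fit(X, Y, epochs=1, batch_size=20, callbacks=[saver], verbose=0)
# After one epoch there is one weight file per minibatch.
```

The key point is that `self.model` inside a callback refers to the model being trained, so the callback can checkpoint it at any hook (`on_train_batch_end` here, versus `on_epoch_end` in `ModelCheckpoint`).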

  • Yeah, you got precisely what I wanted; I will try that. Although, can you please specify why I should take batch_size=10? Commented Jan 10, 2017 at 12:36
  • Okay, one more thing: in my case I want it to complete all its epochs for one minibatch, update the weights, and then proceed with the second minibatch using the updated weights. Will the above adjustment take care of that? Commented Jan 10, 2017 at 13:40
  • An epoch is by definition one complete run through all the data, but weights are only updated after each minibatch, so the method above should do what you want. – tao_oat Commented Jan 10, 2017 at 16:06
