This is my training function:
model.fit(train_states, train_plays, epochs=numEpochs,
          validation_data=(test_states, test_plays),
          shuffle=True)
When I don’t define steps_per_epoch, I get this:
Train on 78800 samples, validate on 33780 samples
Epoch 1/100
32/78800 [..............................] - ETA: 6:37 - loss: 4.8805 - acc: 0.0000e+00
640/78800 [..............................] - ETA: 26s - loss: 4.1140 - acc: 0.0844
1280/78800 [..............................] - ETA: 16s - loss: 3.7132 - acc: 0.1172
1920/78800 [..............................] - ETA: 12s - loss: 3.5422 - acc: 0.1354
2560/78800 [..............................] - ETA: 11s - loss: 3.4102 - acc: 0.1582
3200/78800 [>.............................] - ETA: 10s - loss: 3.3105 - acc: 0.1681
3840/78800 [>.............................] - ETA: 9s - loss: 3.2102 - acc: 0.1867
...
But when I do define it:
model.fit(train_states, train_plays, epochs=numEpochs,
          validation_data=(test_states, test_plays),
          steps_per_epoch=78800,
          validation_steps=33780,
          shuffle=True)
The training time for each epoch increases absurdly, even though steps_per_epoch is set to 78800 (the number of training samples):
Epoch 1/100
1/78800 [..............................] - ETA: 35:39:24 - loss: 4.8044 - acc: 0.0172
2/78800 [..............................] - ETA: 34:48:03 - loss: 4.7417 - acc: 0.0114
3/78800 [..............................] - ETA: 34:04:17 - loss: 4.6801 - acc: 0.0369
4/78800 [..............................] - ETA: 33:59:25 - loss: 4.6148 - acc: 0.0528
5/78800 [..............................] - ETA: 33:47:50 - loss: 4.5438 - acc: 0.0622
And even if I define batch_size, it keeps going one step at a time.
So I need help understanding what’s going on and what the solution would be.
I am using Keras.
Here is the model, in case it’s needed:
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(8, 4)),
    keras.layers.Dense(300, activation=tf.nn.relu),
    keras.layers.Dense(300, activation=tf.nn.relu),
    keras.layers.Dense(300, activation=tf.nn.relu),
    keras.layers.Dense(128, activation=tf.nn.softmax)
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
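For what it’s worth, my reading of the Keras docs is that steps_per_epoch counts batches, not samples, so passing 78800 would make each epoch run 78800 full batches. If that’s right, the value should be ceil(samples / batch_size) instead (the sample counts below are mine; the default batch size of 32 is an assumption):

```python
import math

batch_size = 32        # Keras default when batch_size is not given
train_samples = 78800
val_samples = 33780

# steps_per_epoch / validation_steps are numbers of BATCHES per epoch
steps_per_epoch = math.ceil(train_samples / batch_size)    # 2463
validation_steps = math.ceil(val_samples / batch_size)     # 1056

print(steps_per_epoch, validation_steps)
```

Is that the correct interpretation, or is something else going on?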