I’m trying to train an LSTM classifier in TensorFlow. Here is a reproducible example:
```python
import numpy as np
import tensorflow as tf

targets = np.array([1, 0, 1, 1, 0, 0])
# LSTM input must be 3-D: (samples, timesteps, features); illustrative values here
features = np.arange(12, dtype=np.float32).reshape(6, 2, 1)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model.compile(
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    metrics=['BinaryAccuracy']
)

history = model.fit(
    features,
    targets,
    epochs=5,
    verbose=1
)
```
Using `BinaryAccuracy`:

```
Epoch 1/5
1/1 [==============================] - 1s 1s/step - loss: 0.6788 - binary_accuracy: 0.5000
```
Using `Accuracy`:

```
Epoch 1/5
1/1 [==============================] - 1s 1s/step - loss: 0.6794 - accuracy: 0.0000e+00
```
I have used the `Accuracy` metric for binary classification before; can someone explain why this is happening?
Answer
The metric string is `'accuracy'`, not `'Accuracy'`. Keras only special-cases the lowercase strings `'accuracy'` and `'acc'` in `compile()`, resolving them to the appropriate accuracy metric for your loss and output shape (here that would be `BinaryAccuracy`). The capitalized string `'Accuracy'` instead resolves to the `tf.keras.metrics.Accuracy` class, which counts how often predictions are *exactly* equal to the labels; since your sigmoid outputs are floats that never exactly equal 0 or 1, it reports 0.
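To see the difference directly, here is a minimal sketch contrasting the two metric classes on the same labels. The `y_pred` values are hypothetical sigmoid outputs chosen for illustration; they are not from your model.

```python
import numpy as np
import tensorflow as tf

y_true = np.array([1, 0, 1, 1, 0, 0], dtype=np.float32)
# Hypothetical sigmoid outputs: floats in (0, 1), never exactly 0 or 1
y_pred = np.array([0.7, 0.4, 0.6, 0.8, 0.3, 0.35], dtype=np.float32)

# BinaryAccuracy thresholds predictions at 0.5 before comparing to the labels
binary_acc = tf.keras.metrics.BinaryAccuracy()
binary_acc.update_state(y_true, y_pred)
print(binary_acc.result().numpy())  # 1.0 -- every prediction lands on the correct side of 0.5

# Accuracy checks for exact equality, so float probabilities never match the 0/1 labels
plain_acc = tf.keras.metrics.Accuracy()
plain_acc.update_state(y_true, y_pred)
print(plain_acc.result().numpy())   # 0.0

# In compile(), use the lowercase string or pass the metric object explicitly:
# model.compile(..., metrics=['accuracy'])
# model.compile(..., metrics=[tf.keras.metrics.BinaryAccuracy()])
```

With either of the commented `compile()` variants, the training log reports thresholded binary accuracy rather than the `0.0000e+00` you saw with `'Accuracy'`.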