I’m trying to train an LSTM classifier in TensorFlow. Here is a reproducible example:

```python
import numpy as np
import tensorflow as tf

targets = np.array([1, 0, 1, 1, 0, 0])
# 6 samples, 1 timestep, 1 feature: LSTM layers expect 3D input (samples, timesteps, features)
features = np.arange(6, dtype=np.float32).reshape(6, 1, 1)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model.compile(
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    metrics=['BinaryAccuracy']
)

history = model.fit(features, targets, epochs=5, verbose=1)
```
Using BinaryAccuracy:

```
Epoch 1/5
1/1 [==============================] - 1s 1s/step - loss: 0.6788 - binary_accuracy: 0.5000
```
Using Accuracy:

```
Epoch 1/5
1/1 [==============================] - 1s 1s/step - loss: 0.6794 - accuracy: 0.0000e+00
```
I have used the ‘Accuracy’ metric for binary classification before; can someone explain why this is happening?
Answer
The metric is ‘accuracy’, not ‘Accuracy’. The lowercase string ‘accuracy’ is a special identifier: Keras looks at the loss and the model output and substitutes the appropriate variant, which here is binary accuracy. The capitalized ‘Accuracy’ instead resolves to the tf.keras.metrics.Accuracy class, which counts how often predictions exactly equal the labels; since the sigmoid output is a float strictly between 0 and 1, it essentially never equals the integer labels 0 or 1, so the reported accuracy is 0.
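As a minimal sketch, reusing the `features`, `targets`, and `model` from the question, the compile call only needs the metric string changed to lowercase (or to the explicit ‘binary_accuracy’ name) for the expected behaviour:

```python
model.compile(
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    # 'accuracy' (lowercase) lets Keras substitute the binary-accuracy metric;
    # 'Accuracy' would resolve to tf.keras.metrics.Accuracy, an exact-match comparison
    metrics=['accuracy']
)

history = model.fit(features, targets, epochs=5, verbose=1)
```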