I’m trying to run the following code, but I get an error. Did I miss something in the code?
from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM
from keras.models import Sequential
from keras.callbacks import ModelCheckpoint
from keras.models import load_model
from keras.optimizers import Adam
from keras.regularizers import l2
from keras.activations import relu, elu, linear, sigmoid

def build_fc_model(layers):
    fc_model = Sequential()
    for i in range(len(layers)-1):
        fc_model.add( Dense(layers[i],layers[i+1]) )#, W_regularizer=l2(0.1)) )
        fc_model.add( Dropout(0.5) )
        if i < (len(layers) - 2):
            fc_model.add( Activation('relu') )
    fc_model.summary()
    return fc_model

fc_model_1 = build_fc_model([2, 256, 512, 1024, 1])
and here is the error message:
TypeError: Could not interpret activation function identifier: 256
Answer
This error means that Keras could not interpret the value you passed as an activation function. In your Dense layer definition you passed two positional arguments, layers[i] and layers[i+1].
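You can reproduce the failure in isolation. With layers = [2, 256, 512, 1024, 1] and i = 0, the loop effectively makes this call (a minimal sketch, assuming a Keras 2-style API):

from keras.layers import Dense

# The second positional argument of Dense is `activation`, so 256 is
# handed to the activation lookup and rejected at construction time:
Dense(2, 256)  # TypeError: Could not interpret activation function identifier: 256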
According to the Keras documentation for Dense, the first positional argument is the number of units (neurons) and the second is the activation function. Keras therefore tries to interpret layers[i+1] as an activation identifier, which it cannot recognize.
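For reference, these are valid ways to call Dense (a sketch, not part of the original post):

from keras.layers import Dense

Dense(256)                     # 256 units, linear activation by default
Dense(256, activation='relu')  # 256 units with a ReLU activation
# Dense(256, 512)              # wrong: 512 is read as the activation argument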
Fix:
A Dense layer does not need the neuron count of the next layer, so remove the layers[i+1] argument, as shown below.
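Applied to the loop body, the fix is a one-line change:

# before: the second argument is silently treated as an activation
fc_model.add( Dense(layers[i], layers[i+1]) )
# after: pass the unit count only
fc_model.add( Dense(layers[i]) )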
Furthermore, you have to tell the model its input shape; one way is to add an InputLayer as the first layer and pass the input shape to it.
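If you prefer not to add a separate InputLayer, a standard Keras alternative is to pass input_shape to the first Dense layer (sketched here using the first two entries of layers):

from keras.layers import Dense

# input_shape on the first layer replaces an explicit InputLayer
fc_model.add(Dense(layers[1], input_shape=(layers[0],)))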
With these changes, the modified code looks like this:
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout, InputLayer  # import the input layer

def build_fc_model(layers):
    fc_model = Sequential()
    # the first entry of `layers` is the input dimension
    fc_model.add(InputLayer(input_shape=(layers[0],)))
    for i in range(1, len(layers)):
        fc_model.add(Dense(layers[i]))  # unit count only; no second positional argument
        if i < len(layers) - 1:  # no activation/dropout after the output layer
            fc_model.add(Activation('relu'))
            fc_model.add(Dropout(0.5))
    fc_model.summary()
    return fc_model

fc_model_1 = build_fc_model([2, 256, 512, 1024, 1])
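To sanity-check the fixed model, you can compile it and push a dummy batch through it (a quick verification sketch, not from the original answer; assumes NumPy is installed):

import numpy as np

model = build_fc_model([2, 256, 512, 1024, 1])
model.compile(optimizer='adam', loss='mse')
dummy = np.random.rand(8, 2)        # batch of 8 samples, 2 features each
print(model.predict(dummy).shape)   # expected: (8, 1)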