In a convolutional neural network, how do I use Maxout instead of ReLU as an activation function?

model = Sequential()
model.add(Conv2D(256, (3, 3), input_shape=X.shape[1:]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

How do I use Maxout instead of 'relu' for activation?


Answer

You can use tensorflow_addons.layers.Maxout to add a Maxout activation function:

import tensorflow_addons as tfa
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D

model = Sequential()
model.add(Conv2D(256, (3, 3), input_shape=X.shape[1:]))
# Maxout(num_units) splits the channel axis into num_units groups and keeps
# the max of each group: 256 conv channels -> 128 outputs here. num_units
# must evenly divide the channel count; with num_units == channels every
# group has a single element and the layer does nothing.
model.add(tfa.layers.Maxout(128))
model.add(MaxPooling2D(pool_size=(2, 2)))
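
As a quick sanity check (a minimal sketch, assuming TensorFlow 2.x with tensorflow-addons installed), you can confirm how Maxout changes the channel dimension on a dummy feature map:

import tensorflow as tf
import tensorflow_addons as tfa

x = tf.random.normal((1, 8, 8, 256))  # dummy feature map: (batch, H, W, channels)
y = tfa.layers.Maxout(128)(x)         # max over pairs of channels
print(y.shape)                        # -> (1, 8, 8, 128)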

You can install tensorflow-addons with:

pip install tensorflow-addons
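
If you prefer to avoid the tensorflow_addons dependency, the same operation can be sketched as a small custom Keras layer (SimpleMaxout is a hypothetical name, not part of Keras): it reshapes the channel axis into num_units groups and takes the maximum within each group.

import tensorflow as tf
from tensorflow.keras import layers

class SimpleMaxout(layers.Layer):
    """Minimal maxout sketch: max over groups of channels."""
    def __init__(self, num_units, **kwargs):
        super().__init__(**kwargs)
        self.num_units = num_units

    def call(self, x):
        channels = x.shape[-1]                   # static channel count
        group_size = channels // self.num_units  # channels must be divisible by num_units
        new_shape = tf.concat([tf.shape(x)[:-1], [self.num_units, group_size]], axis=0)
        return tf.reduce_max(tf.reshape(x, new_shape), axis=-1)

You would then write model.add(SimpleMaxout(128)) in place of the tfa.layers.Maxout(128) call above.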