
How to make a custom activation function with trainable parameters in Tensorflow [closed]

I would like to understand how to define a user-defined activation function with two learnable parameters for a neural network, using TensorFlow in Python.

Any reference or case study would be helpful.

Thank you


Answer

If you create a tf.Variable within your model, TensorFlow will track its state and adjust it like any other trainable parameter. Such a tf.Variable can be a parameter of your activation function.

Let’s start with a toy dataset.

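Something along these lines works as a stand-in (a minimal sketch; the sample count, feature count, and the ReLU-shaped regression target are assumptions):

```python
import tensorflow as tf

# 1,000 samples with 5 features, and a target built from a ReLU-style
# nonlinearity so the activation parameters have something meaningful to learn.
x = tf.random.normal((1000, 5))
y = tf.nn.relu(tf.reduce_sum(x, axis=-1, keepdims=True))
```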

Now, let’s create a tf.keras.Model with a parametric ReLU activation in which both the slope and the minimum value (usually 0 for classical ReLU) are learnable. Let’s initialize both the PReLU slope and the minimum value to 0.1 for now.

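A sketch of such a model, with the slope and the minimum value held in trainable tf.Variable objects (the layer sizes and variable names here are assumptions):

```python
class MyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        # Both PReLU parameters start at 0.1. Because they are trainable
        # tf.Variables attached to the model, TensorFlow will update them
        # together with the rest of the weights.
        self.slope = tf.Variable(0.1, trainable=True, name="prelu_slope")
        self.min_value = tf.Variable(0.1, trainable=True, name="prelu_min")
        self.dense1 = tf.keras.layers.Dense(16)
        self.dense2 = tf.keras.layers.Dense(1)

    def prelu(self, x):
        # Parametric ReLU: slope * x, clipped from below at min_value.
        # With slope = 1 and min_value = 0 this reduces to the classical ReLU.
        return tf.maximum(self.min_value, self.slope * x)

    def call(self, inputs):
        h = self.dense1(inputs)
        h = self.prelu(h)  # custom activation with trainable parameters
        return self.dense2(h)

model = MyModel()
```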

Now, let’s train the model (in eager mode, so we can record the slope value at every step).

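A custom eager training loop makes it easy to log the slope after every update (the optimizer, loss, batch size, and epoch count below are assumptions):

```python
loss_fn = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)

dataset = tf.data.Dataset.from_tensor_slices((x, y)).shuffle(1000).batch(32)

slope_history = []

for epoch in range(10):
    for batch_x, batch_y in dataset:
        with tf.GradientTape() as tape:
            preds = model(batch_x)
            loss = loss_fn(batch_y, preds)
        # The PReLU slope and min value are in model.trainable_variables,
        # so they receive gradients like every other weight.
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        slope_history.append(model.slope.numpy())
```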

Let’s look at the slope. TensorFlow adjusts it toward the best value for this task. As you will see, it approaches the non-parametric ReLU slope of 1.

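For example (assuming matplotlib is available for the plot):

```python
import matplotlib.pyplot as plt

print("final slope:", model.slope.numpy())
print("final min value:", model.min_value.numpy())

plt.plot(slope_history)
plt.xlabel("training step")
plt.ylabel("learned slope")
plt.title("PReLU slope during training")
plt.show()
```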

[Figure: the learned slope plotted over training steps, converging toward 1.]
