[Learn about machine learning from the Keras] — 15. Custom activation and built-in activation

Czxdas
Sep 22, 2023

A layer can specify which activation to use in the following ways (here, ReLU is used as the example):

(1) Specify the function name 'relu' (built-in)

Sample code:

from tensorflow.keras.models import Sequential
from tensorflow.keras import layers

model = Sequential([
    layers.Dense(10, activation="relu")
])

When the function name 'relu' is specified as in the sample program, the call passes through the following functions:
keras.activations.get
-> keras.activations.deserialize
-> keras.saving.serialization_lib.deserialize_keras_object
-> keras.saving.serialization_lib.serialize_with_public_fn
-> keras.saving.serialization_lib._retrieve_class_or_fn

Note that the information is packaged into a dict by keras.saving.serialization_lib.serialize_with_public_fn.
The dict content is
{'class_name': 'function', 'config': 'relu', 'module': 'keras.activations', 'registered_name': 'relu'},
which is then passed to keras.saving.serialization_lib._retrieve_class_or_fn with obj_type = 'function', and the activation function is returned.
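
As a quick check of this resolution path, a minimal sketch (the exact printed output may vary by TensorFlow version):

from tensorflow.keras import activations

# activations.get resolves the string name to the built-in relu function.
fn = activations.get("relu")
print(fn.__name__)             # relu
print(fn is activations.relu)  # expected: True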

(2) Specify the class name 'ReLU' (built-in)

Sample code:

from tensorflow.keras.models import Sequential
from tensorflow.keras import layers

model = Sequential([
    layers.Dense(10, activation="ReLU")
])

Note the case of the name.
When the class name 'ReLU' is specified as in the sample program, the call passes through the following functions:
keras.activations.get
-> keras.activations.deserialize
-> keras.saving.serialization_lib.deserialize_keras_object
-> keras.saving.serialization_lib.serialize_with_public_class
-> keras.saving.serialization_lib._retrieve_class_or_fn

Here, the information is packaged into a dict by keras.saving.serialization_lib.serialize_with_public_class.
The dict content is
{'class_name': 'ReLU', 'config': None, 'module': 'keras.layers', 'registered_name': None},
which is then passed to keras.saving.serialization_lib._retrieve_class_or_fn with obj_type = 'class', and the activation object is returned.
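
To see such a dict produced for a real layer, a small sketch (an assumption: this uses tf.keras.saving.serialize_keras_object, which is public in roughly TF 2.13+; the exact keys vary by version):

import tensorflow as tf

# Serialize a built-in activation layer into its dict description.
# On older versions, tf.keras.utils.serialize_keras_object plays the same role.
layer = tf.keras.layers.ReLU()
print(tf.keras.saving.serialize_keras_object(layer))
# e.g. {'module': 'keras.layers', 'class_name': 'ReLU', 'config': {...}, 'registered_name': None}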

In fact, specifying 'relu' and 'ReLU' yields the same activation. The only difference is that one applies the function directly, while the other goes through the call method of the implementing class (see the comparison sketch after the list below).
Both (1) and (2) find the corresponding activation object through a mapping. The mapping content is as follows:

class:
'ELU': keras.layers.activation.elu.ELU
'LeakyReLU': keras.layers.activation.leaky_relu.LeakyReLU
'PReLU': keras.layers.activation.prelu.PReLU
'ReLU': keras.layers.activation.relu.ReLU
'Softmax': keras.layers.activation.softmax.Softmax
'ThresholdedReLU': keras.layers.activation.thresholded_relu.ThresholdedReLU

function:
'elu': keras.activations.elu
'linear': keras.activations.linear
'exponential': keras.activations.exponential
'gelu': keras.activations.gelu
'hard_sigmoid': keras.activations.hard_sigmoid
'leaky_relu': keras.activations.leaky_relu
'mish': keras.activations.mish
'relu': keras.activations.relu
'selu': keras.activations.selu
'sigmoid': keras.activations.sigmoid
'silu': keras.activations.silu
'softmax': keras.activations.softmax
'softplus': keras.activations.softplus
'softsign': keras.activations.softsign
'swish': keras.activations.swish
'tanh': keras.activations.tanh
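
To illustrate the equivalence noted above, a small comparison (illustrative; whether the string "ReLU" resolves through the class mapping depends on the Keras version):

import tensorflow as tf
from tensorflow.keras import activations

x = tf.constant([-1.0, 0.0, 2.0])
fn = activations.get("relu")    # resolved from the function mapping
obj = activations.get("ReLU")   # resolved from the class mapping (version-dependent)
print(fn(x).numpy())   # [0. 0. 2.]
print(obj(x).numpy())  # same values, produced through the class's call method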

(3) Custom classes

Sample code:

from tensorflow.keras.models import Sequential
from tensorflow.keras import layers

class MyReLU(layers.Layer):
    def __init__(self, max_value=None, negative_slope=0.0, threshold=0.0, **kwargs):
        super().__init__(**kwargs)
        # Reuse the built-in ReLU layer as the underlying function.
        self.DefinedFunction = layers.ReLU(max_value=max_value,
                                           negative_slope=negative_slope,
                                           threshold=threshold)

    def call(self, inputs):
        # Apply the wrapped activation; Layer.__call__ routes here.
        return self.DefinedFunction(inputs)

model = Sequential([
    layers.Dense(10, activation=MyReLU(max_value=None, negative_slope=0.0, threshold=0.0))
])

This example shows a custom activation class: it must inherit from keras.layers.Layer and implement __init__ and call. The call method is invoked through the parent class's __call__. The example simply reuses the built-in ReLU; its purpose is to show the minimum that must be implemented.
So if you want to customize an activation class, implement the call method: put the transformation formula there, and remember to return the transformed result.
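
For instance, here is a minimal sketch of a fully custom activation (the class name LeakyClip and its formula are made up for illustration, not a Keras built-in):

import tensorflow as tf
from tensorflow.keras import layers

class LeakyClip(layers.Layer):
    """Hypothetical activation: leaky on the negative side, clipped on the positive side."""
    def __init__(self, alpha=0.1, max_value=6.0, **kwargs):
        super().__init__(**kwargs)
        self.alpha = alpha
        self.max_value = max_value

    def call(self, inputs):
        # The activation formula goes here; return the transformed tensor.
        x = tf.where(inputs > 0, inputs, self.alpha * inputs)
        return tf.minimum(x, self.max_value)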

(4) Use built-in classes directly

Sample code:

from tensorflow.keras.models import Sequential
from tensorflow.keras import layers

model = Sequential([
    layers.Dense(10, activation=layers.ReLU(max_value=None, negative_slope=0.0, threshold=0.0))
])

The fourth example uses the built-in class directly; you can try it as-is without going into the internals.
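
A quick smoke test (illustrative; the input shape of 4 features is chosen arbitrarily):

import numpy as np

# Passing a batch through the model builds it and applies Dense + ReLU.
out = model(np.random.rand(2, 4).astype("float32"))
print(out.shape)  # (2, 10)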

The above are several ways to obtain and set an activation, recorded here for reference.
