Activations#

Collection of Ivy neural network activations as stateful classes.

class ivy.stateful.activations.GEGLU[source]#

Bases: Module

__init__()[source]#

Apply the GEGLU activation function.
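
GEGLU splits its input in two along the last axis and gates one half with a GELU of the other, so the last dimension should be even and is halved in the output. A minimal usage sketch (illustrative, not part of the reference itself; it assumes a NumPy backend is installed):

>>> import ivy
>>> from ivy.stateful.activations import GEGLU
>>> ivy.set_backend("numpy")
>>> geglu = GEGLU()
>>> x = ivy.random_uniform(shape=(2, 8))  # last dim must be even
>>> y = geglu(x)  # gated output has shape (2, 4)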

class ivy.stateful.activations.GELU(*, approximate=False)[source]#

Bases: Module

__init__(*, approximate=False)[source]#

Apply the GELU activation function.

Parameters:

approximate (bool) – Whether to use the approximate (tanh-based) formulation of GELU. (default: False)
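
The approximate flag typically switches from the exact, erf-based formulation to the faster tanh approximation. A short sketch (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import GELU
>>> ivy.set_backend("numpy")
>>> gelu = GELU(approximate=True)  # tanh-based approximation
>>> y = gelu(ivy.array([-1.0, 0.0, 1.0]))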

class ivy.stateful.activations.LeakyReLU(alpha=0.2)[source]#

Bases: Module

__init__(alpha=0.2)[source]#

Apply the leaky ReLU activation function.

Parameters:

alpha (float) – Slope of the activation for negative inputs. (default: 0.2)
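
A short sketch showing the effect of alpha on negative inputs (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import LeakyReLU
>>> ivy.set_backend("numpy")
>>> leaky = LeakyReLU(alpha=0.1)
>>> y = leaky(ivy.array([-2.0, 0.0, 3.0]))  # negative entry scaled by 0.1 -> -0.2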

class ivy.stateful.activations.LogSoftmax[source]#

Bases: Module

__init__()[source]#

Apply the log-softmax activation function.
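
Log-softmax is the logarithm of the softmax; exponentiating the output recovers probabilities that sum to 1. A short sketch (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import LogSoftmax
>>> ivy.set_backend("numpy")
>>> log_softmax = LogSoftmax()
>>> y = log_softmax(ivy.array([1.0, 2.0, 3.0]))
>>> probs = ivy.exp(y)  # softmax probabilities, summing to 1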

class ivy.stateful.activations.Mish[source]#

Bases: Module

__init__()[source]#

Apply the Mish activation function.
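
Mish is the smooth, self-gated function x * tanh(softplus(x)). A short sketch (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import Mish
>>> ivy.set_backend("numpy")
>>> mish = Mish()
>>> y = mish(ivy.array([-1.0, 0.0, 1.0]))  # x * tanh(softplus(x)); Mish(0) == 0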

class ivy.stateful.activations.ReLU[source]#

Bases: Module

__init__()[source]#

Apply the ReLU activation function.
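
A short sketch showing ReLU clamping negative inputs to zero (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import ReLU
>>> ivy.set_backend("numpy")
>>> relu = ReLU()
>>> y = relu(ivy.array([-1.0, 0.0, 2.0]))  # negatives clamp to 0 -> [0., 0., 2.]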

class ivy.stateful.activations.SiLU[source]#

Bases: Module

__init__()[source]#

Apply the SiLU activation function.
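
SiLU (also known as swish) computes x * sigmoid(x). A short sketch (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import SiLU
>>> ivy.set_backend("numpy")
>>> silu = SiLU()
>>> y = silu(ivy.array([-1.0, 0.0, 1.0]))  # x * sigmoid(x); SiLU(0) == 0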

class ivy.stateful.activations.Softmax[source]#

Bases: Module

__init__()[source]#

Apply the softmax activation function.
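
Softmax maps a vector of logits to a probability distribution. A short sketch (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import Softmax
>>> ivy.set_backend("numpy")
>>> softmax = Softmax()
>>> y = softmax(ivy.array([1.0, 2.0, 3.0]))
>>> total = ivy.sum(y)  # probabilities sum to 1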

class ivy.stateful.activations.Softplus[source]#

Bases: Module

__init__()[source]#

Apply the softplus activation function.
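
Softplus computes log(1 + exp(x)), a smooth approximation to ReLU. A short sketch (illustrative, assuming a NumPy backend):

>>> import ivy
>>> from ivy.stateful.activations import Softplus
>>> ivy.set_backend("numpy")
>>> softplus = Softplus()
>>> y = softplus(ivy.array([-1.0, 0.0, 1.0]))  # strictly positive, smooth ReLU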

This should have given you an overview of the activations submodule! If you have any questions, please feel free to reach out on our Discord in the activations channel or in the activations forum!