Activations#

Collection of Ivy activation functions.

ivy.deserialize(name, /, *, custom_objects=None)[source]#

Return activation function given a string identifier.

Parameters:
  • name (Optional[str]) – The name of the activation function.

  • custom_objects (Optional[Dict]) – Optional dictionary listing user-provided activation functions. (default: None)

Return type:

Optional[Callable]

Returns:

ret – Corresponding activation function.

Examples

With str input:

>>> name = "sigmoid"
>>> sigmoid = ivy.deserialize(name)
>>> print(sigmoid)
<function sigmoid at XXXXXXXXXXXXXX>

With str and dict input:

>>> name = "custom_fn"
>>> objects = {"custom_fn": lambda x: x}
>>> custom_fn = ivy.deserialize(name, custom_objects=objects)
>>> print(custom_fn)
<function custom_fn at XXXXXXXXXXXXXX>
ivy.gelu(x, /, *, approximate=False, out=None)[source]#

Apply the Gaussian error linear unit (GELU) activation function.

Parameters:
  • x (Union[Array, NativeArray]) – Input array.

  • approximate (bool) – Whether to use the tanh approximation of GELU instead of the exact formulation. (default: False)

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – The input array with gelu applied element-wise.

Examples

With ivy.Array input:

>>> x = ivy.array([-1.2, -0.6, 1.5])
>>> y = ivy.gelu(x)
>>> y
ivy.array([-0.138, -0.165, 1.4])

With ivy.NativeArray input:

>>> x = ivy.native_array([-1.3, 3.8, 2.1])
>>> y = ivy.gelu(x)
>>> y
ivy.array([-0.126, 3.8, 2.06])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1., 2.]), b=ivy.array([-0.9, -1.]))
>>> y = ivy.gelu(x)
>>> y
{
    a: ivy.array([0.841, 1.95]),
    b: ivy.array([-0.166, -0.159])
}
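For intuition, the exact GELU and its tanh approximation (which the `approximate` flag selects between) can be sketched in plain Python. This is an illustrative sketch of the standard formulations, not Ivy's actual implementation:

```python
import math

def gelu_exact(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh_approx(x):
    # Widely used tanh approximation of GELU.
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
```

For example, `gelu_exact(1.0)` is about 0.841, matching the Container example above; the approximation agrees to roughly three decimal places.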
ivy.get(name, /, *, custom_objects=None)[source]#

Return activation function given a string identifier.

Parameters:
  • name (Optional[str]) – The name of the activation function.

  • custom_objects (Optional[Dict]) – Optional dictionary listing user-provided activation functions. (default: None)

Return type:

Optional[Callable]

Returns:

ret – Corresponding activation function.

Examples

With str input:

>>> name = "sigmoid"
>>> sigmoid = ivy.get(name)
>>> print(sigmoid)
<function sigmoid at XXXXXXXXXXXXXX>
>>> name = None
>>> linear = ivy.get(name)
>>> print(linear)
<function linear at XXXXXXXXXXXXXX>

With str and dict input:

>>> name = "custom_fn"
>>> objects = {"custom_fn": lambda x: x}
>>> custom_fn = ivy.get(name, custom_objects=objects)
>>> print(custom_fn)
<function custom_fn at XXXXXXXXXXXXXX>
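The lookup behaviour of ivy.get (and ivy.deserialize) can be pictured as a dictionary lookup where custom_objects takes priority over the built-in activations, and a name of None falls back to the identity ("linear") function. The function and registry names below are illustrative, not Ivy internals:

```python
def get_activation(name, custom_objects=None, registry=None):
    # Illustrative lookup sketch: user-provided custom_objects are
    # consulted before the built-in registry; None means "linear".
    if name is None:
        return lambda x: x  # identity / linear activation
    if custom_objects and name in custom_objects:
        return custom_objects[name]
    registry = registry or {}
    if name in registry:
        return registry[name]
    raise ValueError(f"Unknown activation: {name}")
```

Passing a custom_objects dictionary is what lets user-defined names such as "custom_fn" resolve, exactly as in the example above.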
ivy.leaky_relu(x, /, *, alpha=0.2, out=None)[source]#

Apply the leaky rectified linear unit function element-wise.

Parameters:
  • x (Union[Array, NativeArray]) – Input array.

  • alpha (float) – Negative slope for ReLU. (default: 0.2)

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – The input array with leaky relu applied element-wise.

Examples

With ivy.Array input:

>>> x = ivy.array([0.39, -0.85])
>>> y = ivy.leaky_relu(x)
>>> print(y)
ivy.array([ 0.39, -0.17])
>>> x = ivy.array([1.5, 0.7, -2.4])
>>> y = ivy.zeros(3)
>>> ivy.leaky_relu(x, out=y)
>>> print(y)
ivy.array([ 1.5 ,  0.7 , -0.48])
>>> x = ivy.array([[1.1, 2.2, 3.3],
...                [-4.4, -5.5, -6.6]])
>>> ivy.leaky_relu(x, out=x)
>>> print(x)
ivy.array([[ 1.1 ,  2.2 ,  3.3 ],
           [-0.88, -1.1 , -1.32]])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([0.0, -1.2]), b=ivy.array([0.4, -0.2]))
>>> x = ivy.leaky_relu(x, out=x)
>>> print(x)
{
    a: ivy.array([0., -0.24000001]),
    b: ivy.array([0.40000001, -0.04])
}
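Element-wise, leaky ReLU passes positive inputs through unchanged and scales negative inputs by the slope alpha, as this minimal scalar sketch shows:

```python
def leaky_relu(x, alpha=0.2):
    # Positives pass through; negatives are scaled by alpha.
    return x if x >= 0 else alpha * x
```

With the default alpha of 0.2, an input of -0.85 maps to -0.17, matching the first example above.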
ivy.log_softmax(x, /, *, axis=None, out=None)[source]#

Apply the log_softmax function element-wise.

Parameters:
  • x (Union[Array, NativeArray]) – Input array.

  • axis (Optional[int]) – The dimension log_softmax would be performed on. (default: None)

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – The output array with log_softmax applied element-wise to input.

Examples

With ivy.Array input:

>>> x = ivy.array([-1.0, -0.98])
>>> y = ivy.log_softmax(x)
>>> print(y)
ivy.array([-0.703, -0.683])
>>> x = ivy.array([1.0, 2.0, 3.0])
>>> y = ivy.log_softmax(x)
>>> print(y)
ivy.array([-2.41, -1.41, -0.408])

With ivy.NativeArray input:

>>> x = ivy.native_array([1.5, 0.5, 1.0])
>>> y = ivy.log_softmax(x)
>>> print(y)
ivy.array([-0.68, -1.68, -1.18])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1.5, 0.5, 1.0]))
>>> y = ivy.log_softmax(x)
>>> print(y)
{
    a: ivy.array([-0.68, -1.68, -1.18])
}
>>> x = ivy.Container(a=ivy.array([1.0, 2.0]), b=ivy.array([0.4, -0.2]))
>>> y = ivy.log_softmax(x)
>>> print(y)
{
    a: ivy.array([-1.31, -0.313]),
    b: ivy.array([-0.437, -1.04])
}
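Mathematically, log_softmax(x) = x - logsumexp(x); subtracting the maximum before exponentiating keeps the computation numerically stable. A minimal pure-Python sketch of this identity (not Ivy's implementation):

```python
import math

def log_softmax(xs):
    # log_softmax(x) = x - logsumexp(x), computed stably by
    # subtracting the max before exponentiating.
    m = max(xs)
    lse = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - lse for x in xs]
```

On [1.5, 0.5, 1.0] this gives approximately [-0.68, -1.68, -1.18], matching the NativeArray example above, and exponentiating the result always sums to 1.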
ivy.mish(x, /, *, out=None)[source]#

Apply the mish activation function element-wise.

Parameters:
  • x (Union[Array, NativeArray]) – input array

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – an array containing the mish activation of each element in x.

Examples

With ivy.Array input:

>>> x = ivy.array([-1., 0., 1.])
>>> y = ivy.mish(x)
>>> print(y)
ivy.array([-0.30340147,  0.        ,  0.86509842])
>>> x = ivy.array([1.5, 0.7, -2.4])
>>> y = ivy.zeros(3)
>>> ivy.mish(x, out=y)
>>> print(y)
ivy.array([ 1.40337825,  0.56114835, -0.20788449])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1.0, -1.2]), b=ivy.array([0.4, -0.2]))
>>> x = ivy.mish(x)
>>> print(x)
{
    a: ivy.array([0.86509842, -0.30883577]),
    b: ivy.array([0.28903052, -0.10714479])
}
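Mish is defined as x * tanh(softplus(x)), i.e. x * tanh(ln(1 + e^x)). A scalar sketch of the formula (illustrative, not Ivy's implementation):

```python
import math

def mish(x):
    # mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
    return x * math.tanh(math.log1p(math.exp(x)))
```

For example, `mish(1.0)` is about 0.86509842, agreeing with the Container example above.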
ivy.relu(x, /, *, out=None)[source]#

Apply the rectified linear unit function element-wise.

Parameters:
  • x (Union[Array, NativeArray]) – input array

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – an array containing the rectified linear unit activation of each element in x.

Examples

With ivy.Array input:

>>> x = ivy.array([-1., 0., 1.])
>>> y = ivy.relu(x)
>>> print(y)
ivy.array([0., 0., 1.])
>>> x = ivy.array([1.5, 0.7, -2.4])
>>> y = ivy.zeros(3)
>>> ivy.relu(x, out=y)
>>> print(y)
ivy.array([1.5, 0.7, 0.])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1.0, -1.2]), b=ivy.array([0.4, -0.2]))
>>> x = ivy.relu(x, out=x)
>>> print(x)
{
    a: ivy.array([1., 0.]),
    b: ivy.array([0.40000001, 0.])
}
ivy.sigmoid(x, /, *, out=None)[source]#

Apply the sigmoid function element-wise.

Parameters:
  • x (Union[Array, NativeArray]) – input array.

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – an array containing the sigmoid activation of each element in x.

Examples

With ivy.Array input:

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = ivy.sigmoid(x)
>>> print(y)
ivy.array([0.269, 0.731, 0.881])
>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = x.sigmoid()
>>> print(y)
ivy.array([0.269, 0.731, 0.881])
>>> x = ivy.array([[-1.3, 3.8, 2.1], [1.7, 4.2, -6.6]])
>>> y = ivy.sigmoid(x)
>>> print(y)
ivy.array([[0.214, 0.978, 0.891], [0.846, 0.985, 0.001]])
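The sigmoid is the logistic function 1 / (1 + e^-x). A scalar sketch, using the standard trick of branching on the sign so exp() never overflows (an implementation detail assumed here, not taken from Ivy):

```python
import math

def sigmoid(x):
    # Logistic function 1 / (1 + e^-x), computed in a form that
    # avoids overflow for large-magnitude negative inputs.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)
```

For example, `sigmoid(-1.0)` is about 0.269 and `sigmoid(2.0)` about 0.881, matching the array examples above.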
ivy.softmax(x, /, *, axis=None, out=None)[source]#

Apply the softmax function element-wise.

Parameters:
  • x (Union[Array, NativeArray]) – Input array.

  • axis (Optional[int]) – The dimension softmax would be performed on. (default: None)

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – The input array with softmax applied element-wise.

Examples

With ivy.Array input:

>>> x = ivy.array([1.0, 0, 1.0])
>>> y = ivy.softmax(x)
>>> print(y)
ivy.array([0.422, 0.155, 0.422])
>>> x = ivy.array([[1.1, 2.2, 3.3],
...                [4.4, 5.5, 6.6]])
>>> y = ivy.softmax(x, axis = 1)
>>> print(y)
ivy.array([[0.0768, 0.231 , 0.693 ],
           [0.0768, 0.231 , 0.693 ]])
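Softmax exponentiates each element and normalizes by the total, so each row sums to 1. A minimal one-dimensional sketch using the standard max-subtraction trick for numerical stability (illustrative, not Ivy's implementation):

```python
import math

def softmax(xs):
    # Subtract the max first so large inputs do not overflow exp().
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

On [1.0, 0.0, 1.0] this gives approximately [0.422, 0.155, 0.422], matching the first example above.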
ivy.softplus(x, /, *, beta=None, threshold=None, out=None)[source]#

Apply the softplus function element-wise.

Parameters:
  • x (Union[Array, NativeArray]) – input array.

  • beta (Optional[Union[int, float]]) – The beta value for the softplus formulation. (default: None)

  • threshold (Optional[Union[int, float]]) – Values above this revert to a linear function. (default: None)

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. (default: None)

Return type:

Array

Returns:

ret – an array containing the softplus activation of each element in x.

Examples

With ivy.Array input:

>>> x = ivy.array([-0.3461, -0.6491])
>>> y = ivy.softplus(x)
>>> print(y)
ivy.array([0.535, 0.42])
>>> x = ivy.array([-0.3461, -0.6491])
>>> y = ivy.softplus(x, beta=0.5)
>>> print(y)
ivy.array([1.22, 1.09])
>>> x = ivy.array([1., 2., 3.])
>>> y = ivy.softplus(x, threshold=2)
>>> print(y)
ivy.array([1.31, 2.13, 3.  ])
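Softplus computes (1/beta) * ln(1 + e^(beta*x)); beyond the threshold the curve is numerically indistinguishable from x, so the function reverts to the identity. A scalar sketch of this behaviour (the defaults beta=1 and threshold=20 here are illustrative assumptions, not Ivy's documented defaults):

```python
import math

def softplus(x, beta=1.0, threshold=20.0):
    # (1/beta) * ln(1 + e^(beta*x)); when beta*x exceeds the
    # threshold, the result is effectively x, so return it directly.
    if beta * x > threshold:
        return x
    return math.log1p(math.exp(beta * x)) / beta
```

This reproduces the examples above: softplus(-0.3461) is about 0.535, with beta=0.5 it is about 1.22, and with threshold=2 an input of 3.0 reverts to 3.0 exactly.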

This should hopefully have given you an overview of the activations submodule. If you have any questions, please feel free to reach out on our Discord in the activations channel or in the activations forum!