binary_cross_entropy

ivy.binary_cross_entropy(true, pred, /, *, from_logits=False, epsilon=0.0, reduction='none', pos_weight=None, axis=None, out=None)

Compute the binary cross entropy loss.

Parameters:
  • true (Union[Array, NativeArray]) – input array containing true labels.

  • pred (Union[Array, NativeArray]) – input array containing predicted labels.

  • from_logits (bool) – Whether pred is expected to be a logits tensor. By default, we assume that pred encodes a probability distribution. Default: False.

  • epsilon (float) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 0.0.

  • reduction (str) – 'none': No reduction will be applied to the output. 'mean': The output will be averaged. 'sum': The output will be summed. Default: 'none'.

  • pos_weight (Optional[Union[Array, NativeArray]]) – a weight for positive examples. Must be an array with length equal to the number of classes. Default: None.

  • axis (Optional[int]) – axis along which to compute cross entropy. Default: None.

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. Default: None.

Return type:

Array

Returns:

ret – The binary cross entropy between the given distributions.
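
With from_logits=False, pos_weight=None and epsilon=0, the unreduced loss follows the standard elementwise definition -(true * log(pred) + (1 - true) * log(1 - pred)). A minimal NumPy cross-check of the first example below (NumPy is used here purely for illustration):

>>> import numpy as np
>>> t = np.array([0., 1., 0., 0.])
>>> p = np.array([0.2, 0.8, 0.3, 0.8])
>>> print(np.round(-(t * np.log(p) + (1 - t) * np.log(1 - p)), 3))
[0.223 0.223 0.357 1.609]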

Functional Examples

With ivy.Array input:

>>> x = ivy.array([0, 1, 0, 0])
>>> y = ivy.array([0.2, 0.8, 0.3, 0.8])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array([0.223, 0.223, 0.357, 1.61])
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> z = ivy.binary_cross_entropy(x, y, reduction='mean')
>>> print(z)
ivy.array(7.6666193)
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> pos_weight = ivy.array([1, 2, 3, 4])
>>> z = ivy.binary_cross_entropy(x, y, pos_weight=pos_weight, from_logits=True)
>>> print(z)
ivy.array([[2.67164493e+00, 4.05471958e-03, 7.32684899e-02, 5.30496836e+00]])
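
These values correspond to the weighted form -(pos_weight * true * log(sigmoid(pred)) + (1 - true) * log(1 - sigmoid(pred))). A minimal NumPy cross-check (float64 here, so trailing digits differ slightly from the float32 ivy output above):

>>> import numpy as np
>>> t = np.array([0., 1., 1., 0.])
>>> logits = np.array([2.6, 6.2, 3.7, 5.3])
>>> pw = np.array([1., 2., 3., 4.])
>>> p = 1.0 / (1.0 + np.exp(-logits))
>>> print(np.round(-(pw * t * np.log(p) + (1 - t) * np.log(1 - p)), 3))
[2.672 0.004 0.073 5.305]

Summing these per-element values (≈ 8.0539) matches the reduction='sum', axis=1 example that follows.
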
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> pos_weight = ivy.array([1, 2, 3, 4])
>>> z = ivy.binary_cross_entropy(x, y, pos_weight=pos_weight, from_logits=True, reduction='sum', axis=1)
>>> print(z)
ivy.array([8.05393649])
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> z = ivy.binary_cross_entropy(x, y, reduction='none', epsilon=0.5)
>>> print(z)
ivy.array([[11.49992943,  3.83330965,  3.83330965, 11.49992943]])
>>> x = ivy.array([[0, 1, 0, 0]])
>>> y = ivy.array([[0.6, 0.2, 0.7, 0.3]])
>>> z = ivy.binary_cross_entropy(x, y, epsilon=1e-3)
>>> print(z)
ivy.array([[0.916, 1.61, 1.2, 0.357]])
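
The out argument writes the result into an existing array; a minimal sketch, assuming an output array with a shape that the inputs broadcast to:

>>> x = ivy.array([0, 1, 0, 0])
>>> y = ivy.array([0.2, 0.8, 0.3, 0.8])
>>> z = ivy.zeros(4)
>>> z = ivy.binary_cross_entropy(x, y, out=z)
>>> print(z)
ivy.array([0.223, 0.223, 0.357, 1.61])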

With ivy.NativeArray input:

>>> x = ivy.native_array([0, 1, 0, 1])
>>> y = ivy.native_array([0.2, 0.7, 0.2, 0.6])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array([0.223, 0.357, 0.223, 0.511])

With a mix of ivy.Array and ivy.NativeArray inputs:

>>> x = ivy.array([0, 0, 1, 1])
>>> y = ivy.native_array([0.1, 0.2, 0.8, 0.6])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array([0.105, 0.223, 0.223, 0.511])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1, 0, 0]), b=ivy.array([0, 0, 1]))
>>> y = ivy.Container(a=ivy.array([0.6, 0.2, 0.3]), b=ivy.array([0.8, 0.2, 0.2]))
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
{
    a: ivy.array([0.511, 0.223, 0.357]),
    b: ivy.array([1.61, 0.223, 1.61])
}

With a mix of ivy.Array and ivy.Container inputs:

>>> x = ivy.array([1, 1, 0])
>>> y = ivy.Container(a=ivy.array([0.7, 0.8, 0.2]))
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
{
   a: ivy.array([0.357, 0.223, 0.223])
}
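
Since from_logits=True maps pred through a sigmoid before the loss is taken, a logits call should agree (up to floating-point error) with first applying ivy.sigmoid and passing probabilities directly; a minimal sketch:

>>> x = ivy.array([0., 1., 1., 0.])
>>> y = ivy.array([2.6, 6.2, 3.7, 5.3])
>>> z1 = ivy.binary_cross_entropy(x, y, from_logits=True)
>>> z2 = ivy.binary_cross_entropy(x, ivy.sigmoid(y))
>>> # z1 and z2 agree up to numerical precision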

Instance Method Examples

Using ivy.Array instance method:

>>> x = ivy.array([1, 0, 0, 0])
>>> y = ivy.array([0.8, 0.2, 0.2, 0.2])
>>> z = x.binary_cross_entropy(y)
>>> print(z)
ivy.array([0.223, 0.223, 0.223, 0.223])

Array.binary_cross_entropy(self, pred, /, *, from_logits=False, epsilon=0.0, reduction='none', pos_weight=None, axis=None, out=None)

ivy.Array instance method variant of ivy.binary_cross_entropy. This method simply wraps the function, and so the docstring for ivy.binary_cross_entropy also applies to this method with minimal changes.

Parameters:
  • self (Array) – input array containing true labels.

  • pred (Union[Array, NativeArray]) – input array containing predicted labels.

  • from_logits (bool) – Whether pred is expected to be a logits tensor. By default, we assume that pred encodes a probability distribution. Default: False.

  • epsilon (float) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 0.0.

  • reduction (str) – 'none': No reduction will be applied to the output. 'mean': The output will be averaged. 'sum': The output will be summed. Default: 'none'.

  • pos_weight (Optional[Union[Array, NativeArray]]) – a weight for positive examples. Must be an array with length equal to the number of classes. Default: None.

  • axis (Optional[int]) – axis along which to compute cross entropy. Default: None.

  • out (Optional[Array]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. Default: None.

Return type:

Array

Returns:

ret – The binary cross entropy between the given distributions.

Examples

>>> x = ivy.array([1, 1, 0])
>>> y = ivy.array([0.7, 0.8, 0.2])
>>> z = x.binary_cross_entropy(y)
>>> print(z)
ivy.array([0.357, 0.223, 0.223])

Container.binary_cross_entropy(self, pred, /, *, from_logits=False, epsilon=0.0, reduction='none', pos_weight=None, axis=None, key_chains=None, to_apply=True, prune_unapplied=False, map_sequences=False, out=None)

ivy.Container instance method variant of ivy.binary_cross_entropy. This method simply wraps the function, and so the docstring for ivy.binary_cross_entropy also applies to this method with minimal changes.

Parameters:
  • self (Container) – input container containing true labels.

  • pred (Union[Container, Array, NativeArray]) – input array or container containing predicted labels.

  • from_logits (bool) – Whether pred is expected to be a logits tensor. By default, we assume that pred encodes a probability distribution. Default: False.

  • epsilon (Union[float, Container]) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 0.0.

  • reduction (str) – 'none': No reduction will be applied to the output. 'mean': The output will be averaged. 'sum': The output will be summed. Default: 'none'.

  • pos_weight (Optional[Union[Array, NativeArray, Container]]) – a weight for positive examples. Must be an array with length equal to the number of classes. Default: None.

  • axis (Optional[int]) – axis along which to compute cross entropy. Default: None.

  • key_chains (Optional[Union[List[str], Dict[str, str]]]) – The key-chains to apply or not apply the method to. Default: None.

  • to_apply (bool) – If True, the method will be applied to key_chains, otherwise key_chains will be skipped. Default: True.

  • prune_unapplied (bool) – Whether to prune key_chains for which the function was not applied. Default: False.

  • map_sequences (bool) – Whether to also map the method to sequences (lists, tuples). Default: False.

  • out (Optional[Container]) – optional output container, for writing the result to. It must have a shape that the inputs broadcast to. Default: None.

Return type:

Container

Returns:

ret – The binary cross entropy between the given distributions.

Examples

>>> x = ivy.Container(a=ivy.array([1, 0, 0]), b=ivy.array([0, 0, 1]))
>>> y = ivy.Container(a=ivy.array([0.6, 0.2, 0.3]), b=ivy.array([0.8, 0.2, 0.2]))
>>> z = x.binary_cross_entropy(y)
>>> print(z)
{
    a: ivy.array([0.511, 0.223, 0.357]),
    b: ivy.array([1.61, 0.223, 1.61])
}
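
The Container-specific arguments follow ivy's usual Container mapping semantics; a minimal sketch restricting the computation to one key-chain and pruning the rest (assuming the standard key_chains/prune_unapplied behavior):

>>> x = ivy.Container(a=ivy.array([1, 0, 0]), b=ivy.array([0, 0, 1]))
>>> y = ivy.Container(a=ivy.array([0.6, 0.2, 0.3]), b=ivy.array([0.8, 0.2, 0.2]))
>>> z = x.binary_cross_entropy(y, key_chains=['a'], prune_unapplied=True)
>>> print(z)
{
    a: ivy.array([0.511, 0.223, 0.357])
}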