oneflow.nn.BCELoss

class oneflow.nn.BCELoss(weight: Optional[oneflow.Tensor] = None, reduction: str = 'mean')

This operator computes the binary cross entropy loss.

The equation is:

if reduction = "none":

\[out_i = -(Target_i \cdot \log(Input_i) + (1-Target_i) \cdot \log(1-Input_i))\]

if reduction = "mean":

\[out = -\frac{1}{n}\sum_{i=1}^n(Target_i \cdot \log(Input_i) + (1-Target_i) \cdot \log(1-Input_i))\]

if reduction = "sum":

\[out = -\sum_{i=1}^n(Target_i \cdot \log(Input_i) + (1-Target_i) \cdot \log(1-Input_i))\]
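For intuition, the "none" formula above can be reproduced directly with NumPy. The minimal sketch below uses made-up probability and target values for illustration and compares the hand-computed per-element loss with the module output:

>>> import oneflow as flow
>>> import numpy as np
>>> probs = np.array([[0.8, 0.3], [0.6, 0.1]]).astype(np.float32)
>>> labels = np.array([[1, 0], [0, 1]]).astype(np.float32)
>>> manual = -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))
>>> module_out = flow.nn.BCELoss(reduction="none")(flow.Tensor(probs), flow.Tensor(labels))
>>> # manual and module_out.numpy() should agree elementwise (up to float rounding)
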
Parameters
  • weight (oneflow.Tensor, optional) – The manual rescaling weight applied to the loss. Defaults to None, in which case every element is weighted by 1.

  • reduction (str, optional) – The reduction to apply to the output: "none", "mean", or "sum". Defaults to "mean".

Attention

The input values must be in the range (0, 1); otherwise the loss function may return nan.

Returns

The result Tensor.

Return type

oneflow.Tensor

For example:

>>> import oneflow as flow
>>> import numpy as np
>>> input = flow.Tensor(np.array([[1.2, 0.2, -0.3], [0.7, 0.6, -2]]).astype(np.float32))
>>> target = flow.Tensor(np.array([[0, 1, 0], [1, 0, 1]]).astype(np.float32))
>>> weight = flow.Tensor(np.array([[2, 2, 2], [2, 2, 2]]).astype(np.float32))
>>> activation = flow.nn.Sigmoid()
>>> sigmoid_input = activation(input)
>>> m = flow.nn.BCELoss(weight, reduction="none")
>>> out = m(sigmoid_input, target)
>>> out
tensor([[2.9266, 1.1963, 1.1087],
        [0.8064, 2.0750, 4.2539]], dtype=oneflow.float32)
>>> m_sum = flow.nn.BCELoss(weight, reduction="sum")
>>> out = m_sum(sigmoid_input, target)
>>> out
tensor(12.3668, dtype=oneflow.float32)
>>> m_mean = flow.nn.BCELoss(weight, reduction="mean")
>>> out = m_mean(sigmoid_input, target)
>>> out
tensor(2.0611, dtype=oneflow.float32)
>>> m_default = flow.nn.BCELoss()  # no weight, reduction="mean"
>>> out = m_default(sigmoid_input, target)
>>> out
tensor(1.0306, dtype=oneflow.float32)
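
As a consistency check on the reductions above: the "mean" result 2.0611 is the "sum" result 12.3668 divided by the 6 elements, and the default call (no weight, reduction="mean") gives 1.0306, half of the weighted mean, since every entry of weight here is 2; the weight rescales each element's loss before the reduction is applied.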