oneflow.nn.LeakyReLU

class oneflow.nn.LeakyReLU(negative_slope: float = 0.01, inplace: bool = False)

Applies the element-wise function:

\[\begin{split}\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \text{negative\_slope} \times x, & \text{ otherwise } \end{cases}\end{split}\]
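For instance, with negative_slope = 0.1, a negative input is scaled while a non-negative input passes through unchanged:

\[\text{LeakyReLU}(-2.0) = 0.1 \times (-2.0) = -0.2, \qquad \text{LeakyReLU}(3.0) = 3.0\]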
Parameters
  • negative_slope – Controls the angle of the negative slope. Default: 1e-2

  • inplace – can optionally do the operation in-place. Default: False

Shape:
  • Input: \((N, *)\) where * means any number of additional dimensions

  • Output: \((N, *)\), same shape as the input

For example:

>>> import numpy as np
>>> import oneflow as flow

>>> m = flow.nn.LeakyReLU(0.1)
>>> arr = np.array([0.2, 0.3, 3.0, 4.0])
>>> x = flow.Tensor(arr)
>>> out = m(x)
>>> out
tensor([0.2000, 0.3000, 3.0000, 4.0000], dtype=oneflow.float32)
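
Since every element above is non-negative, the output equals the input. The following is a minimal sketch with a negative element, assuming the same flow.nn.LeakyReLU API and default float32 dtype shown above (the exact tensor repr may differ slightly):

>>> m = flow.nn.LeakyReLU(0.1)
>>> x = flow.Tensor(np.array([-2.0, 0.0, 3.0]))
>>> m(x)  # the negative value is scaled by 0.1: [-0.2, 0.0, 3.0]
tensor([-0.2000, 0.0000, 3.0000], dtype=oneflow.float32)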