oneflow.nn.Softplus
class oneflow.nn.Softplus(beta: int = 1, threshold: int = 20)

Applies the element-wise function:
\[\text{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))\]

Softplus is a smooth approximation to the ReLU function and can be used to constrain the output of a machine to always be positive.
For numerical stability, the implementation reverts to the linear function when \(input \times \beta > threshold\).
- Parameters
beta – the \(\beta\) value for the Softplus formulation. Default: 1
threshold – values above this revert to a linear function. Default: 20
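To make the formula and the threshold cut-over concrete, the following is a minimal NumPy sketch of the same computation (an illustration under stated assumptions, not OneFlow's actual kernel; the name softplus_ref is hypothetical):

import numpy as np

def softplus_ref(x, beta=1.0, threshold=20.0):
    # Smooth branch: (1/beta) * log(1 + exp(beta * x)).
    z = beta * x
    smooth = np.log1p(np.exp(np.minimum(z, threshold))) / beta
    # Where beta * x > threshold, exp(beta * x) would overflow float32,
    # so the output reverts to the linear function y = x, which the
    # smooth curve already matches to within rounding at that point.
    return np.where(z > threshold, x, smooth)

x = np.array([-0.5, 0.0, 0.5, 30.0], dtype=np.float32)
print(softplus_ref(x))  # approx. [0.4741, 0.6931, 0.9741, 30.0]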
- Shape:
Input: \((N, *)\) where * means any number of additional dimensions
Output: \((N, *)\), same shape as the input
For example:
>>> import numpy as np
>>> import oneflow as flow
>>> x = np.array([-0.5, 0, 0.5]).astype(np.float32)
>>> input = flow.Tensor(x)
>>> softplus = flow.nn.Softplus()
>>> out = softplus(input)
>>> out
tensor([0.4741, 0.6931, 0.9741], dtype=oneflow.float32)
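The beta and threshold arguments from the signature above can also be passed to the constructor. As a hedged sketch (checked against the closed-form formula rather than a captured session), with beta=2 the curve hugs ReLU more tightly:

>>> softplus2 = flow.nn.Softplus(beta=2, threshold=15)
>>> out2 = softplus2(input)
>>> np.allclose(out2.numpy(), np.log1p(np.exp(2 * x)) / 2)
True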