oneflow.nn.Softshrink

class oneflow.nn.Softshrink(lambd: float = 0.5, inplace: bool = False)

The Softshrink activation.
The formula is:

\[\begin{split}\text{Softshrink}(x) = \begin{cases} x - \lambda, & \text{ if } x > \lambda \\ x + \lambda, & \text{ if } x < -\lambda \\ 0, & \text{ otherwise } \end{cases}\end{split}\]

The interface is consistent with PyTorch. The documentation is referenced from: https://pytorch.org/docs/1.10/generated/torch.nn.Softshrink.html.
- Parameters
lambd – the \(\lambda\) value for the Softshrink formulation. Default: 0.5
inplace – can optionally do the operation in-place. Default: False
- Shape:
Input: \((N, *)\), where * means any number of additional dimensions
Output: \((N, *)\), same shape as the input
For example:
>>> import numpy as np
>>> import oneflow as flow
>>> x = np.array([-1, 0, 0.2, 0.5]).astype(np.float32)
>>> input = flow.Tensor(x)
>>> softshrink = flow.nn.Softshrink(lambd=0.5)
>>> out = softshrink(input)
>>> out
tensor([-0.5000, 0.0000, 0.0000, 0.0000], dtype=oneflow.float32)
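As a cross-check of the formula above, Softshrink can be sketched elementwise in plain NumPy. The helper below is illustrative only (it is not part of oneflow), but it reproduces the piecewise definition and the example's output:

```python
import numpy as np

def softshrink(x: np.ndarray, lambd: float = 0.5) -> np.ndarray:
    """Elementwise Softshrink: shift values outside [-lambd, lambd]
    toward zero by lambd, and zero everything inside that band."""
    return np.where(x > lambd, x - lambd,
                    np.where(x < -lambd, x + lambd, 0.0))

x = np.array([-1.0, 0.0, 0.2, 0.5], dtype=np.float32)
print(softshrink(x))  # [-0.5  0.   0.   0. ]
```

Note that the boundary values \(x = \pm\lambda\) fall into the "otherwise" branch and map to 0, which is why 0.5 produces 0.0000 in the example above.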