oneflow.nn.SiLU
class oneflow.nn.SiLU(inplace: bool = False)

Applies the SiLU (Swish) activation function, element-wise:
\[\text{SiLU}(x) = x * \text{sigmoid}(x)\]

Note
See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was experimented with later.
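As a quick sanity check of the formula above, the same values can be computed with plain NumPy. This is an illustrative sketch only; silu_ref is a hypothetical helper, not part of the OneFlow API:

import numpy as np

def silu_ref(x):
    # SiLU(x) = x * sigmoid(x) = x / (1 + exp(-x)), element-wise.
    return x / (1.0 + np.exp(-x))

# Prints approximately [0.7311, 1.7616, 2.8577],
# matching the OneFlow doctest output below.
print(silu_ref(np.array([1.0, 2.0, 3.0], dtype=np.float32)))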
- Shape:
Input: \((N, *)\), where \(*\) means any number of additional dimensions
Output: \((N, *)\), same shape as the input
For example:
>>> import numpy as np
>>> import oneflow as flow

>>> x = np.array([1, 2, 3]).astype(np.float32)
>>> input = flow.Tensor(x)
>>> silu = flow.nn.SiLU()

>>> out = silu(input)
>>> out
tensor([0.7311, 1.7616, 2.8577], dtype=oneflow.float32)
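In practice, SiLU is used like any other activation module. A minimal sketch of a small MLP follows, assuming that oneflow.nn.Sequential, oneflow.nn.Linear, and oneflow.randn mirror their PyTorch counterparts (the model and shapes here are hypothetical, not from the original docs):

import oneflow as flow

# Hypothetical two-layer MLP with SiLU between the linear layers.
model = flow.nn.Sequential(
    flow.nn.Linear(4, 8),
    flow.nn.SiLU(),
    flow.nn.Linear(8, 2),
)

x = flow.randn(3, 4)  # batch of 3 samples with 4 features each
y = model(x)
print(y.shape)        # expected: oneflow.Size([3, 2])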