oneflow.nn.GELU

class oneflow.nn.GELU(approximate='none') → Tensor

The documentation is referenced from: https://pytorch.org/docs/1.10/generated/torch.nn.GELU.html.
Applies the Gaussian Error Linear Units function:
\[\text{GELU}(x) = x * \Phi(x)\]

where \(\Phi(x)\) is the cumulative distribution function of the Gaussian distribution.
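Equivalently, \(\Phi(x) = 0.5\,(1 + \mathrm{erf}(x / \sqrt{2}))\). A minimal sketch of this exact form in plain Python (the helper gelu_exact below is illustrative only, not part of the OneFlow API):

import math

def gelu_exact(x):
    # Exact GELU: x * Phi(x), with Phi written via the error function.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# e.g. gelu_exact(0.5) is roughly 0.3457, matching the sample output below.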
When the approximate argument is 'tanh', GELU is estimated with:
\[\text{GELU}(x) = 0.5 * x * (1 + \text{Tanh}(\sqrt{2 / \pi} * (x + 0.044715 * x^3)))\]

- Parameters
input (oneflow.Tensor) – Input Tensor
approximate (string, optional) – the gelu approximation algorithm to use: 'none' | 'tanh'. Default: 'none'
- Returns
A Tensor with the same shape as the input.
- Return type
oneflow.Tensor
For example:
>>> import numpy as np
>>> import oneflow as flow

>>> x = np.array([-0.5, 0, 0.5]).astype(np.float32)
>>> input = flow.Tensor(x)
>>> gelu = flow.nn.GELU()
>>> out = gelu(input)
>>> out
tensor([-0.1543,  0.0000,  0.3457], dtype=oneflow.float32)
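As a further sketch, assuming the approximate='tanh' mode documented above and Tensor.numpy(), the module output can be checked against the tanh formula computed directly with NumPy:

import numpy as np
import oneflow as flow

x = np.array([-0.5, 0.0, 0.5], dtype=np.float32)

# GELU module with the tanh-based estimate enabled.
gelu_tanh = flow.nn.GELU(approximate='tanh')
out = gelu_tanh(flow.Tensor(x))

# Reference: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
ref = 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))

print(np.allclose(out.numpy(), ref, atol=1e-5))  # expected to print True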