oneflow.nn.functional.gelu

oneflow.nn.functional.gelu(input: Tensor, approximate='none') → Tensor

Applies the Gaussian Error Linear Units function:

\[\text{GELU}(x) = x * \Phi(x)\]

where \(\Phi(x)\) is the cumulative distribution function of the standard Gaussian distribution.

When the approximate argument is 'tanh', GELU is estimated with:

\[\text{GELU}(x) = 0.5 * x * (1 + \text{Tanh}(\sqrt{2 / \pi} * (x + 0.044715 * x^3)))\]
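The two formulas above can be checked directly with the standard library. The sketch below (plain Python, independent of oneflow) computes the exact GELU via the Gaussian CDF \(\Phi(x) = 0.5(1 + \text{erf}(x/\sqrt{2}))\) and the tanh approximation, and shows they agree closely on small inputs:

```python
import math

def gelu_exact(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation, matching the formula above
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

for v in (-0.5, 0.0, 0.5):
    print(round(gelu_exact(v), 4), round(gelu_tanh(v), 4))
```

For these inputs both variants round to the same four decimals (e.g. ≈ -0.1543 at x = -0.5), matching the doctest output further below.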
Parameters
  • input (oneflow.Tensor) – Input Tensor

  • approximate (string, optional) – the gelu approximation algorithm to use: 'none' | 'tanh'. Default: 'none'

Returns

A Tensor with the same shape as the input.

Return type

oneflow.Tensor

For example:

>>> import numpy as np
>>> import oneflow as flow

>>> x = np.array([-0.5, 0, 0.5]).astype(np.float32)
>>> input = flow.tensor(x)

>>> out = flow.gelu(input)
>>> out
tensor([-0.1543,  0.0000,  0.3457], dtype=oneflow.float32)

See GELU for more details.