oneflow.nn.ReLU
class oneflow.nn.ReLU(inplace: bool = False)

Applies the rectified linear unit function element-wise:
\(\text{ReLU}(x) = (x)^+ = \max(0, x)\)
- Parameters
  inplace (bool) – can optionally do the operation in-place. Default: False
- Shape:
Input: \((N, *)\) where * means any number of additional dimensions
Output: \((N, *)\), same shape as the input
For example:
>>> import oneflow as flow
>>> import numpy as np
>>> relu = flow.nn.ReLU()
>>> ndarr = np.asarray([1, -2, 3])
>>> x = flow.Tensor(ndarr)
>>> relu(x)
tensor([1., 0., 3.], dtype=oneflow.float32)
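The element-wise definition \(\max(0, x)\) can be checked independently of oneflow with plain NumPy; this is only a sketch of the math the module computes, not the oneflow implementation itself:

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied element-wise
    return np.maximum(0, x)

arr = np.asarray([1.0, -2.0, 3.0])
out = relu(arr)
# out contains [1., 0., 3.], matching the oneflow.nn.ReLU example above
```

Note that `np.maximum(x, 0, out=x)` would overwrite `x` in place, which mirrors what `inplace=True` requests of the module.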