oneflow.nn.functional¶
Convolution functions¶
- Applies a 1D convolution over an input signal composed of several input planes.
- Applies a 2D convolution over an input image composed of several input planes.
- Applies a 3D convolution over an input image composed of several input planes.
- Applies a 1D transposed convolution operator over an input signal composed of several input planes, sometimes also called “deconvolution”.
- Applies a 2D transposed convolution operator over an input image composed of several input planes, sometimes also called “deconvolution”.
- Applies a 3D transposed convolution operator over an input image composed of several input planes, sometimes also called “deconvolution”.
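The 1D case above can be sketched in plain Python. This is a minimal reference implementation of the cross-correlation that deep-learning “convolution” computes (stride 1, no padding, single channel), not OneFlow's optimized kernel; the helper name `conv1d_ref` is our own.

```python
def conv1d_ref(signal, kernel):
    """Valid cross-correlation of a 1D signal with a kernel (stride 1, no padding)."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

# A length-4 signal with a length-3 kernel yields 4 - 3 + 1 = 2 outputs.
print(conv1d_ref([1, 2, 3, 4], [1, 0, -1]))  # [-2, -2]
```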
Pooling functions¶
- Applies a 1D average pooling over an input signal composed of several input planes.
- Applies a 2D average-pooling operation in \(kH \times kW\) regions with step size \(sH \times sW\).
- Applies a 3D average-pooling operation in \(kT \times kH \times kW\) regions with step size \(sT \times sH \times sW\).
- Applies a 1D max pooling over an input signal composed of several input planes.
- Applies a 2D max pooling over an input signal composed of several input planes.
- Applies a 3D max pooling over an input signal composed of several input planes.
- Applies a 1D adaptive average pooling over an input signal composed of several input planes.
- Applies a 2D adaptive average pooling over an input signal composed of several input planes.
- Applies a 3D adaptive average pooling over an input signal composed of several input planes.
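The 2D average-pooling described above (average over \(kH \times kW\) regions, stepping by \(sH \times sW\)) can be sketched in plain Python; `avg_pool2d_ref` is our own illustrative helper, not the OneFlow kernel.

```python
def avg_pool2d_ref(x, kh, kw, sh, sw):
    """Average-pool a 2D list over kh x kw regions, stepping by sh x sw."""
    h, w = len(x), len(x[0])
    out = []
    for i in range(0, h - kh + 1, sh):
        row = []
        for j in range(0, w - kw + 1, sw):
            region = [x[i + di][j + dj] for di in range(kh) for dj in range(kw)]
            row.append(sum(region) / (kh * kw))
        out.append(row)
    return out

x = [[1, 2, 3, 4],
     [5, 6, 7, 8],
     [9, 10, 11, 12],
     [13, 14, 15, 16]]
# 2x2 regions with step 2x2 halve each spatial dimension.
print(avg_pool2d_ref(x, 2, 2, 2, 2))  # [[3.5, 5.5], [11.5, 13.5]]
```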
Non-linear activation functions¶
- Applies the rectified linear unit function element-wise.
- Applies the HardTanh function element-wise.
- Applies the hardswish function element-wise, as described in the paper Searching for MobileNetV3.
- Applies the element-wise function \(\text{ReLU6}(x) = \min(\max(0,x), 6)\).
- Applies an element-wise function.
- Applies an element-wise function.
- Applies an element-wise function.
- Applies the element-wise function \(\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} * \min(0, x)\).
- Applies an element-wise function.
- Applies an element-wise function.
- Applies an element-wise function.
- Applies an element-wise function.
- Applies an element-wise function.
- Applies an element-wise function.
- Applies the Softmax function, defined as \(\text{Softmax}(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\).
- Applies the LogSoftmax function, defined as \(\text{LogSoftmax}(x_i) = \log \frac{\exp(x_i)}{\sum_j \exp(x_j)}\).
- Applies an element-wise function.
- Applies the element-wise function \(\text{Sigmoid}(x) = \frac{1}{1 + \exp(-x)}\).
- Applies an element-wise function.
- Applies an element-wise function.
- Applies an element-wise function.
- Applies Layer Normalization over the last certain number of dimensions.
- Performs \(L_p\) normalization of inputs over the specified dimension.
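The formulas spelled out in this section (ReLU6, LeakyReLU, Sigmoid, Softmax) can be written directly in plain Python. These are reference sketches of the math only, with our own helper names, not OneFlow's vectorized implementations.

```python
import math

def relu6(x):
    # ReLU6(x) = min(max(0, x), 6)
    return min(max(0.0, x), 6.0)

def leaky_relu(x, negative_slope=0.01):
    # LeakyReLU(x) = max(0, x) + negative_slope * min(0, x)
    return max(0.0, x) + negative_slope * min(0.0, x)

def sigmoid(x):
    # Sigmoid(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # Softmax(x_i) = exp(x_i) / sum_j exp(x_j); subtract max(xs) for stability
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu6(7.0))     # 6.0
print(sigmoid(0.0))   # 0.5
```
Note that Softmax outputs are positive and sum to 1, which is why it is used to turn scores into a probability distribution.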
Linear functions¶
- Applies a linear transformation to the incoming data: \(y = xA^T + b\).
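The transform \(y = xA^T + b\) can be sketched for a batch of row vectors; `linear_ref` is our own illustrative helper, with \(A\) stored as (out_features, in_features) rows as the formula's transpose implies.

```python
def linear_ref(x, A, b=None):
    """y = x A^T + b for a batch of row vectors x; A has shape (out, in)."""
    out = []
    for row in x:
        # Dot each input row with each row of A (i.e. each column of A^T).
        y = [sum(xi * ai for xi, ai in zip(row, a)) for a in A]
        if b is not None:
            y = [yi + bi for yi, bi in zip(y, b)]
        out.append(y)
    return out

# One input row of 2 features mapped to 3 output features.
print(linear_ref([[1, 2]], [[1, 0], [0, 1], [1, 1]], [1, 1, 1]))  # [[2, 3, 4]]
```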
Dropout functions¶
- During training, randomly zeroes some of the elements of the input tensor with probability \(p\).
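A minimal sketch of that behavior, assuming the usual “inverted dropout” convention (survivors are scaled by \(1/(1-p)\) so the expected value is unchanged); `dropout_ref` is our own helper, not the OneFlow op.

```python
import random

def dropout_ref(xs, p=0.5, rng=random):
    """Zero each element with probability p; scale survivors by 1/(1-p)."""
    if not 0.0 <= p < 1.0:
        raise ValueError("p must be in [0, 1)")
    keep = 1.0 - p
    return [0.0 if rng.random() < p else x / keep for x in xs]

# With p=0 the input passes through unchanged.
print(dropout_ref([1.0, 2.0], p=0.0))  # [1.0, 2.0]
```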
Loss functions¶
- The interface is consistent with TensorFlow.
- Creates a criterion that measures the triplet loss given input tensors \(x1\), \(x2\), \(x3\) and a margin with a value greater than \(0\).
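The triplet loss described above can be sketched for single vectors, assuming the common Euclidean (\(p = 2\)) distance and no reduction; `triplet_margin_loss_ref` is our own illustrative helper.

```python
import math

def triplet_margin_loss_ref(anchor, positive, negative, margin=1.0):
    """max(d(a, p) - d(a, n) + margin, 0) with Euclidean distance d."""
    def d(u, v):
        return math.sqrt(sum((ui - vi) ** 2 for ui, vi in zip(u, v)))
    return max(d(anchor, positive) - d(anchor, negative) + margin, 0.0)

# The negative is already margin-farther than the positive, so the loss is zero.
print(triplet_margin_loss_ref([0, 0], [1, 0], [3, 0]))  # 0.0
```
The loss is zero once the negative example is at least `margin` farther from the anchor than the positive example; otherwise it grows linearly with the violation.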
Vision functions¶
- Pads a tensor.
- The interface is consistent with PyTorch.
- The interface is consistent with PyTorch.
- The interface is consistent with PyTorch.
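Constant padding of a 2D input can be sketched in plain Python; `pad2d_ref` is our own helper, and we assume the PyTorch-style convention of giving pad widths per side starting from the last dimension (left, right, top, bottom).

```python
def pad2d_ref(x, left, right, top, bottom, value=0):
    """Constant-pad a 2D list with the given per-side widths."""
    w = len(x[0]) + left + right
    out = [[value] * w for _ in range(top)]
    for row in x:
        out.append([value] * left + list(row) + [value] * right)
    out.extend([value] * w for _ in range(bottom))
    return out

# Pad one column of zeros on the left and right only.
print(pad2d_ref([[1, 2], [3, 4]], 1, 1, 0, 0))  # [[0, 1, 2, 0], [0, 3, 4, 0]]
```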