oneflow.nn.functional

Convolution functions

conv1d

Applies a 1D convolution over an input signal composed of several input planes.

conv2d

Applies a 2D convolution over an input image composed of several input planes.

conv3d

Applies a 3D convolution over an input image composed of several input planes.
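
The convolution functions share one calling pattern. A minimal conv2d sketch, assuming the PyTorch-aligned signature conv2d(input, weight, bias=None, stride=1, padding=0); conv1d and conv3d work analogously with fewer or more spatial dimensions:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.randn(1, 4, 5, 5)   # input: (N, C_in, H, W)
    w = flow.randn(8, 4, 3, 3)   # weight: (C_out, C_in, kH, kW)
    y = F.conv2d(x, w, stride=1, padding=1)
    print(y.shape)               # (1, 8, 5, 5); padding=1 preserves H and W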

conv_transpose1d

Applies a 1D transposed convolution operator over an input signal composed of several input planes, sometimes also called “deconvolution”.

conv_transpose2d

Applies a 2D transposed convolution operator over an input image composed of several input planes, sometimes also called “deconvolution”.

conv_transpose3d

Applies a 3D transposed convolution operator over an input image composed of several input planes, sometimes also called “deconvolution”.
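
Transposed convolution runs the shape mapping in the other direction, which makes it useful for upsampling. A hedged sketch, assuming the PyTorch-aligned signature with the weight laid out as (C_in, C_out, kH, kW):

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.randn(1, 8, 5, 5)    # (N, C_in, H, W)
    w = flow.randn(8, 4, 2, 2)    # (C_in, C_out, kH, kW)
    y = F.conv_transpose2d(x, w, stride=2)
    print(y.shape)                # (1, 4, 10, 10): spatial dims doubled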

Pooling functions

avg_pool1d

Applies a 1D average pooling over an input signal composed of several input planes.

avg_pool2d

Applies a 2D average-pooling operation over \(kH \times kW\) regions with stride \(sH \times sW\).

avg_pool3d

Applies a 3D average-pooling operation over \(kT \times kH \times kW\) regions with stride \(sT \times sH \times sW\).
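
For example, with kernel_size=2 and stride=2 each output value is the mean of a non-overlapping 2x2 window. A minimal sketch, assuming the PyTorch-aligned avg_pool2d signature:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.arange(16, dtype=flow.float32).reshape(1, 1, 4, 4)
    y = F.avg_pool2d(x, kernel_size=2, stride=2)
    print(y.shape)  # (1, 1, 2, 2)
    print(y)        # each entry averages one 2x2 block, e.g. (0+1+4+5)/4 = 2.5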

max_pool1d

Applies a 1D max pooling over an input signal composed of several input planes.

max_pool2d

Applies a 2D max pooling over an input signal composed of several input planes.

max_pool3d

Applies a 3D max pooling over an input signal composed of several input planes.
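
Max pooling has the same windowing semantics but keeps the maximum instead of the mean. A hedged sketch, assuming the PyTorch-aligned max_pool2d signature:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.arange(16, dtype=flow.float32).reshape(1, 1, 4, 4)
    y = F.max_pool2d(x, kernel_size=2)   # stride defaults to kernel_size
    print(y)  # [[[[ 5.,  7.], [13., 15.]]]] -- max of each 2x2 block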

adaptive_avg_pool1d

Applies a 1D adaptive average pooling over an input signal composed of several input planes.

adaptive_avg_pool2d

Applies a 2D adaptive average pooling over an input signal composed of several input planes.

adaptive_avg_pool3d

Applies a 3D adaptive average pooling over an input signal composed of several input planes.
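
Adaptive pooling fixes the output size rather than the window size, which is convenient when the input resolution varies. A minimal sketch, assuming the PyTorch-aligned signature:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.randn(1, 64, 13, 13)
    y = F.adaptive_avg_pool2d(x, (1, 1))  # global average pool
    print(y.shape)                        # (1, 64, 1, 1), whatever H and W were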

Non-linear activation functions

threshold

Thresholds each element of the input Tensor.

relu

Applies the rectified linear unit function element-wise.

hardtanh

Applies the HardTanh function element-wise.

hardswish

Applies the hardswish function, element-wise, as described in the paper Searching for MobileNetV3: \(\text{Hardswish}(x) = x * \frac{\text{ReLU6}(x + 3)}{6}\).

relu6

Applies the element-wise function \(\text{ReLU6}(x) = \min(\max(0,x), 6)\).

elu

Applies element-wise \(\text{ELU}(x) = \max(0, x) + \min(0, \alpha * (\exp(x) - 1))\).

selu

Applies the element-wise function \(\text{SELU}(x) = \text{scale} * (\max(0, x) + \min(0, \alpha * (\exp(x) - 1)))\), with \(\alpha \approx 1.6733\) and \(\text{scale} \approx 1.0507\).

celu

Applies the element-wise function \(\text{CELU}(x) = \max(0, x) + \min(0, \alpha * (\exp(x / \alpha) - 1))\).

leaky_relu

Applies element-wise \(\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} * \min(0, x)\).
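
The ReLU family differs only in how the negative half-line is treated. A small comparison sketch, assuming the PyTorch-aligned signatures:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.tensor([-2.0, -0.5, 0.0, 1.5])
    print(F.relu(x))                            # [0.0, 0.0, 0.0, 1.5]
    print(F.leaky_relu(x, negative_slope=0.1))  # [-0.2, -0.05, 0.0, 1.5]
    print(F.relu6(x * 10))                      # [0.0, 0.0, 0.0, 6.0]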

prelu

Applies the element-wise function \(\text{PReLU}(x) = \max(0, x) + \text{weight} * \min(0, x)\), where weight is a learnable parameter.

glu

The equation is: \(\text{GLU}(a, b) = a \otimes \sigma(b)\), where the input is split in half along the given dimension to form \(a\) and \(b\) and \(\sigma\) is the sigmoid function.

gelu

The equation is: \(\text{GELU}(x) = x * \Phi(x)\), where \(\Phi(x)\) is the cumulative distribution function of the standard Gaussian distribution.

logsigmoid

Applies the element-wise function \(\text{LogSigmoid}(x) = \log\left(\frac{1}{1 + \exp(-x)}\right)\).

hardshrink

Applies the hard shrinkage function element-wise: \(\text{Hardshrink}(x) = x\) if \(|x| > \lambda\), else \(0\).

softsign

The formula is: \(\text{SoftSign}(x) = \frac{x}{1 + |x|}\).

softplus

Applies the element-wise function \(\text{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))\).

softmax

Softmax is defined as: \(\text{Softmax}(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\).

softshrink

Applies the soft shrinkage function element-wise: \(\text{Softshrink}(x) = x - \lambda\) if \(x > \lambda\), \(x + \lambda\) if \(x < -\lambda\), else \(0\).

log_softmax

LogSoftmax is defined as: \(\text{LogSoftmax}(x_i) = \log\left(\frac{\exp(x_i)}{\sum_j \exp(x_j)}\right)\).
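
softmax and log_softmax normalize along one dimension; log_softmax is the numerically stabler choice when a log-probability is what you actually need (e.g. for an NLL loss). A hedged sketch, assuming the PyTorch-aligned dim argument:

    import oneflow as flow
    import oneflow.nn.functional as F

    logits = flow.tensor([[1.0, 2.0, 3.0]])
    p = F.softmax(logits, dim=1)
    print(p.sum(dim=1))                  # 1.0 -- rows sum to one
    print(F.log_softmax(logits, dim=1))  # equals log(p), computed stably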

tanh

The equation is: \(\text{Tanh}(x) = \frac{\exp(x) - \exp(-x)}{\exp(x) + \exp(-x)}\).

sigmoid

Applies the element-wise function \(\text{Sigmoid}(x) = \frac{1}{1 + \exp(-x)}\).

hardsigmoid

Applies the element-wise function \(\text{Hardsigmoid}(x) = \max(0, \min(1, x / 6 + 1 / 2))\).

silu

The formula is: \(\text{SiLU}(x) = x * \text{sigmoid}(x)\).

mish

Applies the element-wise function \(\text{Mish}(x) = x * \tanh(\text{Softplus}(x))\).

layer_norm

Applies Layer Normalization over the trailing dimensions given by normalized_shape.

normalize

Performs \(L_p\) normalization of inputs over the specified dimension.
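
layer_norm standardizes activations (zero mean, unit variance over the trailing dimensions), while normalize rescales vectors to unit \(L_p\) norm; the two are easy to confuse. A minimal sketch, assuming PyTorch-aligned signatures:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.randn(2, 8)
    y = F.layer_norm(x, normalized_shape=(8,))  # per-row mean ~0, std ~1
    z = F.normalize(x, p=2, dim=1)              # per-row L2 norm == 1
    print((z * z).sum(dim=1))                   # all ones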

Linear functions

linear

Applies a linear transformation to the incoming data: \(y = xA^T + b\).
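
Note that the weight is stored as (out_features, in_features) and applied transposed, matching \(y = xA^T + b\). A minimal sketch, assuming the PyTorch-aligned signature:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.randn(3, 5)   # batch of 3, in_features=5
    A = flow.randn(2, 5)   # (out_features, in_features)
    b = flow.randn(2)
    y = F.linear(x, A, b)
    print(y.shape)         # (3, 2)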

Dropout functions

dropout

During training, randomly zeroes some of the elements of the input tensor with probability p using samples from a Bernoulli distribution.
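
The functional form takes an explicit training flag, so the caller is responsible for disabling it at eval time (the module form handles this automatically). A hedged sketch, assuming the PyTorch-aligned signature:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.ones(8)
    y = F.dropout(x, p=0.5, training=True)   # ~half the entries zeroed,
                                             # survivors scaled by 1/(1-p) = 2
    z = F.dropout(x, p=0.5, training=False)  # identity: z == x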

Sparse functions

embedding

A simple lookup table that looks up embeddings in a dictionary of fixed size.

one_hot

Generates a one-hot Tensor from the input Tensor.
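
Both map integer indices to dense tensors: embedding gathers learned rows from a weight matrix, while one_hot produces fixed indicator vectors. A minimal sketch, assuming the PyTorch-aligned signatures embedding(input, weight) and one_hot(input, num_classes):

    import oneflow as flow
    import oneflow.nn.functional as F

    idx = flow.tensor([0, 2, 1])
    weight = flow.randn(3, 4)              # 3-word dictionary, 4-dim embeddings
    print(F.embedding(idx, weight).shape)  # (3, 4): rows 0, 2, 1 of weight
    print(F.one_hot(idx, num_classes=3))
    # [[1, 0, 0],
    #  [0, 0, 1],
    #  [0, 1, 0]]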

Loss functions

sparse_softmax_cross_entropy

The interface is consistent with TensorFlow.

cross_entropy

See CrossEntropyLoss for details.
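
cross_entropy takes raw logits and integer class targets; it fuses log_softmax with the negative log-likelihood, so no softmax should be applied beforehand. A hedged sketch, assuming the PyTorch-aligned signature:

    import oneflow as flow
    import oneflow.nn.functional as F

    logits = flow.randn(4, 10)          # 4 samples, 10 classes
    target = flow.tensor([1, 0, 9, 3])  # class indices
    loss = F.cross_entropy(logits, target)
    print(loss)                         # scalar, mean over the batch by default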

smooth_l1_loss

Computes the smooth L1 loss, which behaves like the squared error for small element-wise differences and like the absolute error for large ones.

triplet_margin_loss

Creates a criterion that measures the triplet loss given input tensors \(x_1\), \(x_2\), \(x_3\) and a margin with a value greater than \(0\).

Vision functions

pad

Pads a tensor.

interpolate

The interface is consistent with PyTorch.

grid_sample

The interface is consistent with PyTorch.

affine_grid

The interface is consistent with PyTorch.
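
Since these follow the PyTorch conventions, pad sizes are given as pairs working from the last dimension inward, and interpolate is driven by scale_factor or size. A minimal sketch:

    import oneflow as flow
    import oneflow.nn.functional as F

    x = flow.randn(1, 3, 8, 8)
    y = F.pad(x, (1, 1, 2, 2))   # W: +1+1, H: +2+2
    z = F.interpolate(x, scale_factor=2.0, mode="nearest")
    print(y.shape, z.shape)      # (1, 3, 12, 10) (1, 3, 16, 16)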