Registers a backward hook.

The hook will be called every time a gradient with respect to the Tensor is computed. The hook should have the following signature:

hook(grad) -> Tensor or None

The hook should not modify its argument in place, but it can optionally return a new gradient, which will be used in place of grad.

For example:

>>> import oneflow as flow
>>> x = flow.ones(5, requires_grad=True)
>>> def hook(grad):
...     return grad * 2
>>> x.register_hook(hook)
>>> y = x * 2
>>> y.sum().backward()
>>> x.grad
tensor([4., 4., 4., 4., 4.], dtype=oneflow.float32)
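
The two behaviors described above — return None to merely observe the gradient, or return a new value to replace it — can be sketched in plain Python. This is an illustrative model of the dispatch semantics, not OneFlow's actual hook machinery; `apply_hooks` is a hypothetical helper and plain lists stand in for tensors:

```python
# Sketch of backward-hook dispatch semantics (illustration only, not
# OneFlow internals). `hooks` is a hypothetical ordered list of callables.
def apply_hooks(hooks, grad):
    for hook in hooks:
        result = hook(grad)
        if result is not None:  # a returned value replaces the gradient
            grad = result
    return grad                 # a None return leaves grad unchanged

# Observe-only hook: returns None, so the gradient passes through untouched.
seen = []
def log_hook(grad):
    seen.append(list(grad))

# Rescaling hook: returns a new gradient used in place of the old one,
# mirroring the `grad * 2` hook in the doctest above.
def double_hook(grad):
    return [g * 2 for g in grad]

grad = apply_hooks([log_hook, double_hook], [2.0, 2.0])
print(grad)  # [4.0, 4.0]
print(seen)  # [[2.0, 2.0]] -- log_hook saw the pre-rescaling gradient
```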