class oneflow.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda, last_step=-1, verbose=False)

Sets the learning rate of each parameter group to the initial lr multiplied by a given function of the current step. When last_step=-1, the initial lr is used as the starting lr.

\[learning\_rate = base\_learning\_rate * lambda(last\_step)\]

Parameters:
  • optimizer (Optimizer) – Wrapped optimizer.

  • lr_lambda (function or list) – A function which computes a multiplicative factor given an integer parameter epoch, or a list of such functions, one for each group in optimizer.param_groups.

  • last_step (int, optional) – The index of last step. (default: -1)

  • verbose (bool, optional) – If True, prints a message to stdout for each update. (default: False)
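As a hedged sketch of the formula above, the per-group update rule can be written in plain Python (the helper `lambda_lr` below is hypothetical and not part of the OneFlow API):

```python
# Hypothetical sketch of LambdaLR's update rule: each parameter group's
# lr is its base lr times the value of that group's lambda at last_step.
def lambda_lr(base_lrs, lr_lambdas, last_step):
    return [base_lr * fn(last_step) for base_lr, fn in zip(base_lrs, lr_lambdas)]

# Two groups, each with its own multiplicative-factor function.
lrs = lambda_lr([0.1, 0.01], [lambda s: s // 30, lambda s: 0.95 * s], 60)
print(lrs)
```

Note that the lambdas return a multiplicative factor, not a learning rate: the base lr of each group is always the starting point.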

For example:

import oneflow as flow

# assumes `optimizer` (a flow.optim optimizer with two param groups)
# and `num_epoch` are defined elsewhere
lambda1 = lambda step: step // 30
lambda2 = lambda step: 0.95 * step
lambda_lr = flow.optim.lr_scheduler.LambdaLR(optimizer, [lambda1, lambda2])
for epoch in range(num_epoch):
    train(...)
    lambda_lr.step()
__init__(optimizer, lr_lambda, last_step=-1, verbose=False)

Initialize self. See help(type(self)) for accurate signature.


get_last_lr()

Return last computed learning rate by current scheduler.

get_lr(base_lr, step)

Compute the learning rate using the chainable form of the scheduler.


load_state_dict(state_dict)

Loads the scheduler's state.

print_lr(group, lr)

Display the current learning rate.


state_dict()

Returns the state of the scheduler as a dict.


step()

Performs a single learning rate schedule step.
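To make the bookkeeping behind step() and state_dict() concrete, here is a minimal stand-alone sketch. TinyLambdaLR is illustrative only; the real class also updates the optimizer's param groups and handles verbose printing:

```python
# Illustrative mini-scheduler: step() advances last_step and recomputes
# lr = base_lr * lr_lambda(last_step); state_dict() captures what is
# needed to resume the schedule (the optimizer is saved separately).
class TinyLambdaLR:
    def __init__(self, base_lr, lr_lambda, last_step=-1):
        self.base_lr = base_lr
        self.lr_lambda = lr_lambda
        self.last_step = last_step

    def step(self):
        self.last_step += 1
        self.lr = self.base_lr * self.lr_lambda(self.last_step)
        return self.lr

    def state_dict(self):
        return {"last_step": self.last_step}

    def load_state_dict(self, state):
        self.last_step = state["last_step"]

sched = TinyLambdaLR(0.1, lambda s: s // 30)
for _ in range(31):
    sched.step()
print(sched.last_step, sched.lr)  # last_step is 30, so lr = 0.1 * (30 // 30)
```

This also shows why last_step defaults to -1: the first call to step() brings the counter to 0, so the lambda is first evaluated at step 0.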