oneflow.optim.lr_scheduler.LambdaLR

class oneflow.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda, last_step=-1, verbose=False)

Sets the learning rate of each parameter group to the initial lr times a given function. When last_step=-1, sets the initial lr as lr.
\[learning\_rate = base\_learning\_rate * lambda(last\_step)\]

Parameters
optimizer (Optimizer) – Wrapped optimizer.
lr_lambda (function or list) – A function which computes a multiplicative factor given an integer parameter epoch, or a list of such functions, one for each group in optimizer.param_groups.
last_step (int, optional) – The index of last step. (default: -1)
verbose (bool, optional) – If True, prints a message to stdout for each update. (default: False)
For example:

import oneflow as flow
...
lambda1 = lambda step: step // 30
lambda2 = lambda step: 0.95 * step
lambda_lr = flow.optim.lr_scheduler.LambdaLR(optimizer, [lambda1, lambda2])
for epoch in range(num_epoch):
    train(...)
    lambda_lr.step()
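The formula above can be sketched without the library. This minimal pure-Python snippet (the helper `lrs_at` is illustrative, not part of the oneflow API) shows how each parameter group's rate is its base learning rate times its own lambda evaluated at the current step:

```python
# Sketch of the LambdaLR rule: lr = base_lr * lr_lambda(step).
# This mimics only the documented formula; it is not the oneflow implementation.

base_lrs = [0.1, 0.01]                   # one base lr per parameter group
lambdas = [lambda step: step // 30,      # same multiplicative factors as
           lambda step: 0.95 * step]     # in the example above

def lrs_at(step):
    """Learning rate of every group at a given step."""
    return [base * fn(step) for base, fn in zip(base_lrs, lambdas)]

print(lrs_at(60))  # first group: 0.1 * (60 // 30); second: 0.01 * 0.95 * 60
```

Note that a list of lambdas must have one entry per group in optimizer.param_groups; a single function is applied to every group.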
__init__(optimizer, lr_lambda, last_step=-1, verbose=False)

Initialize self. See help(type(self)) for accurate signature.
Methods

__delattr__(name, /)    Implement delattr(self, name).
__dir__()    Default dir() implementation.
__eq__(value, /)    Return self==value.
__format__(format_spec, /)    Default object formatter.
__ge__(value, /)    Return self>=value.
__getattribute__(name, /)    Return getattr(self, name).
__gt__(value, /)    Return self>value.
__hash__()    Return hash(self).
__init__(optimizer, lr_lambda[, last_step, …])    Initialize self.
__init_subclass__    This method is called when a class is subclassed.
__le__(value, /)    Return self<=value.
__lt__(value, /)    Return self<value.
__ne__(value, /)    Return self!=value.
__new__(**kwargs)    Create and return a new object.
__reduce__()    Helper for pickle.
__reduce_ex__(protocol, /)    Helper for pickle.
__repr__()    Return repr(self).
__setattr__(name, value, /)    Implement setattr(self, name, value).
__sizeof__()    Size of object in memory, in bytes.
__str__()    Return str(self).
__subclasshook__    Abstract classes can override this to customize issubclass().
_init_base_lrs()
get_last_lr()    Return the last learning rate computed by the current scheduler.
get_lr(base_lr, step)    Compute the learning rate using the chainable form of the scheduler.
load_state_dict(state_dict)    Loads the scheduler's state.
print_lr(group, lr)    Display the current learning rate.
state_dict()    Returns the state of the scheduler as a dict.
step()    Performs a single learning rate schedule step.
update_lrs(lrs)
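How step(), get_last_lr(), state_dict() and load_state_dict() interact can be sketched with a single-group mimic. The class below is a hedged illustration of the documented behavior, not the actual oneflow implementation, and its name is hypothetical:

```python
# Hypothetical single-group mimic of the scheduler bookkeeping documented
# above. Illustrative only; not the oneflow source.

class TinyLambdaLR:
    def __init__(self, base_lr, lr_lambda, last_step=-1):
        self.base_lr = base_lr
        self.lr_lambda = lr_lambda
        self.last_step = last_step
        self._last_lr = []

    def get_lr(self, base_lr, step):
        # Chainable form: base rate times the multiplicative factor.
        return base_lr * self.lr_lambda(step)

    def step(self):
        # One schedule step: advance the counter, recompute the rate.
        self.last_step += 1
        self._last_lr = [self.get_lr(self.base_lr, self.last_step)]

    def get_last_lr(self):
        return self._last_lr

    def state_dict(self):
        # Only the counter is serialized here; lambdas are code, not state.
        return {"last_step": self.last_step}

    def load_state_dict(self, state_dict):
        self.last_step = state_dict["last_step"]

sched = TinyLambdaLR(base_lr=0.1, lr_lambda=lambda s: s // 30)
for _ in range(31):
    sched.step()
saved = sched.state_dict()  # counter survives a save/load round trip
```

Saving state_dict() and restoring it with load_state_dict() lets a new scheduler resume from the same last_step, which is why last_step exists as a constructor argument.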