oneflow.optim.lr_scheduler.SequentialLR

class oneflow.optim.lr_scheduler.SequentialLR(optimizer: oneflow.optim.optimizer.Optimizer, schedulers: Sequence[oneflow.nn.optimizer.lr_scheduler.LRScheduler], milestones: Sequence[int], interval_rescaling: Union[Sequence[bool], bool] = False, last_step: int = -1, verbose: bool = False)

Receives a list of schedulers that are expected to be called sequentially during the optimization process, and a list of milestone points that defines the exact intervals determining which scheduler is called at a given step.

Parameters
  • optimizer (Optimizer) – Wrapped optimizer.

  • schedulers (list) – List of chained schedulers.

  • milestones (list) – List of integers reflecting the milestone points.

  • interval_rescaling (bool or list) – Each scheduler has a corresponding interval_rescaling flag. If the flag is True, that scheduler starts and ends at the same values as it would if it were the only scheduler; otherwise all schedulers share the same step count. Default: False for all schedulers.

  • last_step (int) – The index of the last step. Default: -1.

  • verbose (bool) – If True, prints the learning rate for each update. Default: False.

Example

>>> # Assuming optimizer uses lr = 1. for all groups
>>> # lr = 0.1     if step == 0
>>> # lr = 0.1     if step == 1
>>> # lr = 0.9     if step == 2
>>> # lr = 0.81    if step == 3
>>> # lr = 0.729   if step == 4
>>> scheduler1 = ConstantLR(optimizer, factor=0.1, total_iters=2)
>>> scheduler2 = ExponentialLR(optimizer, gamma=0.9)
>>> scheduler = SequentialLR(optimizer, schedulers=[scheduler1, scheduler2], milestones=[2])
>>> for step in range(100):
...     train(...)
...     validate(...)
...     scheduler.step()
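
For a fuller picture, the following is a minimal, self-contained sketch (not part of the original example) that chains a constant warmup phase into a cosine decay phase. It assumes flow.nn.Linear, flow.optim.SGD, and flow.optim.lr_scheduler.CosineAnnealingLR are available with their usual signatures; the training computation itself is elided.

>>> import oneflow as flow
>>> from oneflow.optim.lr_scheduler import ConstantLR, CosineAnnealingLR, SequentialLR
>>> model = flow.nn.Linear(4, 4)
>>> optimizer = flow.optim.SGD(model.parameters(), lr=1.0)
>>> # Hold lr at 0.1 for the first 5 steps, then follow a cosine decay
>>> warmup = ConstantLR(optimizer, factor=0.1, total_iters=5)
>>> decay = CosineAnnealingLR(optimizer, T_max=95)
>>> scheduler = SequentialLR(optimizer, schedulers=[warmup, decay], milestones=[5])
>>> for step in range(100):
...     # forward / backward / optimizer.step() elided
...     scheduler.step()
>>> last_lr = scheduler.get_last_lr()  # one value per parameter group

The milestone [5] matches total_iters of the warmup scheduler, so the switch to the cosine phase happens exactly when the warmup finishes.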
__init__(optimizer: oneflow.optim.optimizer.Optimizer, schedulers: Sequence[oneflow.nn.optimizer.lr_scheduler.LRScheduler], milestones: Sequence[int], interval_rescaling: Union[Sequence[bool], bool] = False, last_step: int = -1, verbose: bool = False)

Initialize self. See help(type(self)) for accurate signature.

Methods

__delattr__(name, /)

Implement delattr(self, name).

__dir__()

Default dir() implementation.

__eq__(value, /)

Return self==value.

__format__(format_spec, /)

Default object formatter.

__ge__(value, /)

Return self>=value.

__getattribute__(name, /)

Return getattr(self, name).

__gt__(value, /)

Return self>value.

__hash__()

Return hash(self).

__init__(optimizer, schedulers, milestones)

Initialize self.

__init_subclass__

This method is called when a class is subclassed.

__le__(value, /)

Return self<=value.

__lt__(value, /)

Return self<value.

__ne__(value, /)

Return self!=value.

__new__(**kwargs)

Create and return a new object.

__reduce__()

Helper for pickle.

__reduce_ex__(protocol, /)

Helper for pickle.

__repr__()

Return repr(self).

__setattr__(name, value, /)

Implement setattr(self, name, value).

__sizeof__()

Size of object in memory, in bytes.

__str__()

Return str(self).

__subclasshook__

Abstract classes can override this to customize issubclass().

_generate_conf_for_graph(lr_conf)

_init_base_lrs()

get_last_lr()

Return last computed learning rate by current scheduler.

get_lr(base_lr, step)

Compute the learning rate using the chainable form of the scheduler.

load_state_dict(state_dict)

Load the scheduler's state (a usage sketch follows the method list).

print_lr(group, lr)

Display the current learning rate.

state_dict()

Return the state of the scheduler as a dict.

step()

update_lrs(lrs)
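
As a hedged illustration of the state_dict() / load_state_dict() pair listed above (a sketch, not part of the original page), the scheduler's progress can be checkpointed together with the optimizer and restored into a freshly built SequentialLR. It assumes optimizer and scheduler were constructed as in the examples above.

>>> # Capture the current progress of the composed schedule
>>> ckpt = {
...     "optimizer": optimizer.state_dict(),
...     "scheduler": scheduler.state_dict(),
... }
>>> # ... later, after rebuilding the optimizer and a SequentialLR with the
>>> # same schedulers and milestones, restore where the schedule left off
>>> optimizer.load_state_dict(ckpt["optimizer"])
>>> scheduler.load_state_dict(ckpt["scheduler"])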