oneflow.optim.lr_scheduler.MultiStepLR

class oneflow.optim.lr_scheduler.MultiStepLR(optimizer: oneflow.optim.optimizer.Optimizer, milestones: list, gamma: float = 0.1, last_step: int = -1, verbose: bool = False)

Decays the learning rate of each parameter group by gamma once the number of steps reaches one of the milestones. Note that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler. When last_step=-1, the initial learning rate is set to lr.

Parameters
  • optimizer (Optimizer) – Wrapped optimizer.

  • milestones (list) – List of step indices at which to decay the learning rate. Must be increasing.

  • gamma (float, optional) – Multiplicative factor of learning rate decay. (default: 0.1)

  • last_step (int, optional) – The index of the last step. (default: -1)

  • verbose (bool, optional) – If True, prints a message to stdout for each update. (default: False)

For example:

import oneflow as flow

# assumes `model` is an existing flow.nn.Module and `train` is a user-defined training step
optimizer = flow.optim.SGD(model.parameters(), lr=0.1)
multistep_lr = flow.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
for epoch in range(num_epoch):
    train(...)
    multistep_lr.step()
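
To make the decay rule above concrete, here is a minimal sketch (the single dummy parameter and the small milestone values are placeholders chosen for illustration) that prints the rate reported by get_last_lr() as the scheduler crosses each milestone:

import oneflow as flow

param = flow.nn.Parameter(flow.zeros(1))  # dummy parameter so the optimizer has something to track
optimizer = flow.optim.SGD([param], lr=0.05)
multistep_lr = flow.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[2, 4], gamma=0.1)

for step in range(6):
    multistep_lr.step()
    # expect the reported lr to shrink by 10x once the step count reaches each milestone
    print(step, multistep_lr.get_last_lr())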

Methods

_generate_conf_for_graph(lr_conf)

_init_base_lrs()

get_last_lr()

Return the last learning rate computed by the current scheduler.

get_lr(base_lr, step)

Compute the learning rate using the chainable form of the scheduler.
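
For MultiStepLR, the chainable rule amounts to multiplying the base rate by gamma once for every milestone the step count has already reached. A plain-Python sketch of that arithmetic (the helper name is hypothetical; this illustrates the rule, not the library's internal code):

from bisect import bisect_right

def multistep_lr_value(base_lr, step, milestones, gamma=0.1):
    # count how many milestones the step count has already reached
    n = bisect_right(milestones, step)
    return base_lr * gamma ** n

print(multistep_lr_value(1.0, 29, [30, 80], gamma=0.5))  # 1.0 (before the first milestone)
print(multistep_lr_value(1.0, 30, [30, 80], gamma=0.5))  # 0.5 (first milestone reached)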

load_state_dict(state_dict)

Load the scheduler's state.

print_lr(group, lr)

Display the current learning rate.

state_dict()

Return the state of the scheduler as a dict.
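
As a sketch of how state_dict() and load_state_dict() round-trip the scheduler's progress (the dummy parameter and the second scheduler instance are placeholders for a real checkpoint/resume flow):

import oneflow as flow

param = flow.nn.Parameter(flow.zeros(1))
optimizer = flow.optim.SGD([param], lr=0.1)
scheduler = flow.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80])

for _ in range(10):
    scheduler.step()

state = scheduler.state_dict()  # a plain dict, suitable for serialization

# later: rebuild the scheduler and restore its progress
restored = flow.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80])
restored.load_state_dict(state)  # resumes as if the 10 steps had already run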

step()

Advance the scheduler by one step and update the learning rate of each parameter group.

update_lrs(lrs)