oneflow.optim.lr_scheduler.ChainedScheduler

class oneflow.optim.lr_scheduler.ChainedScheduler(schedulers)

Chains a list of learning rate schedulers. It takes a list of chainable learning rate schedulers and performs each of their step() functions consecutively with just one call.

Parameters

schedulers (list) – List of chained schedulers.

Example

>>> # Assuming optimizer uses lr = 1. for all groups
>>> # lr = 0.09     if step == 0
>>> # lr = 0.081    if step == 1
>>> # lr = 0.729    if step == 2
>>> # lr = 0.6561   if step == 3
>>> # lr = 0.59049  if step == 4 (the 0.9 decay continues on later steps)
>>> scheduler1 = ConstantLR(optimizer, factor=0.1, total_iters=2)
>>> scheduler2 = ExponentialLR(optimizer, gamma=0.9)
>>> scheduler = ChainedScheduler([scheduler1, scheduler2])
>>> for _ in range(100):
...     train(...)
...     validate(...)
...     scheduler.step()
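
A self-contained variant of the example above, shown as a sketch: the parameter and SGD setup are illustrative, and get_last_lr() is assumed to return one learning rate per parameter group.

>>> import oneflow as flow
>>> from oneflow.optim.lr_scheduler import ChainedScheduler, ConstantLR, ExponentialLR
>>> param = flow.nn.Parameter(flow.ones(2, 2))
>>> optimizer = flow.optim.SGD([param], lr=1.0)
>>> scheduler = ChainedScheduler([
...     ConstantLR(optimizer, factor=0.1, total_iters=2),
...     ExponentialLR(optimizer, gamma=0.9),
... ])
>>> for step in range(5):
...     optimizer.step()     # one (dummy) optimization step
...     scheduler.step()     # steps both chained schedulers at once
...     print(step, scheduler.get_last_lr())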
__init__(schedulers)

Initialize self. See help(type(self)) for accurate signature.

Methods

_generate_conf_for_graph(lr_conf)

_init_base_lrs()

get_last_lr()

Return the last learning rate computed by the current scheduler.

get_lr(base_lr, step)

Compute the learning rate using the chainable form of the scheduler.
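
In the chainable form, each scheduler transforms an incoming learning rate rather than setting it outright, which is what lets ChainedScheduler compose its schedulers. A minimal sketch of that composition, assuming each chained scheduler exposes get_lr(base_lr, step) with the signature listed here (the helper chained_lr is hypothetical):

def chained_lr(schedulers, base_lr, step):
    # Hypothetical illustration: feed the output of each
    # scheduler's get_lr into the next one in the chain.
    lr = base_lr
    for scheduler in schedulers:
        lr = scheduler.get_lr(lr, step)
    return lr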

load_state_dict(state_dict)

Loads the scheduler's state (see the checkpointing sketch under state_dict() below).

print_lr(group, lr)

Display the current learning rate.

state_dict()

Returns the state of the scheduler as a dict.
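
Together, state_dict() and load_state_dict(state_dict) support checkpointing a chained schedule. A hedged sketch, assuming oneflow.save and oneflow.load as the serialization helpers (the file name is illustrative):

>>> flow.save(scheduler.state_dict(), "chained_sched")      # capture the schedulers' state
>>> scheduler.load_state_dict(flow.load("chained_sched"))   # resume from the checkpoint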

step()

Performs the step() of every chained scheduler consecutively.

update_lrs(lrs)