# oneflow.optim.lr_scheduler.PolynomialLR¶

class oneflow.optim.lr_scheduler.PolynomialLR(optimizer, decay_batch: int, end_learning_rate: float = 0.0001, power: float = 1.0, cycle: bool = False, last_step: int = -1, verbose: bool = False)

This scheduler applies a polynomial decay to the learning rate. The learning rate is updated as follows:

If cycle is True, the equation is:

$$
\begin{aligned}
& decay\_batch = decay\_batch \cdot \left\lceil \frac{current\_batch}{decay\_batch} \right\rceil \\
& learning\_rate = (base\_lr - end\_lr) \left(1 - \frac{current\_batch}{decay\_batch}\right)^{power} + end\_lr
\end{aligned}
$$

If cycle is False, the equation is:

$$
\begin{aligned}
& current\_batch = \min(decay\_batch, current\_batch) \\
& learning\_rate = (base\_lr - end\_lr) \left(1 - \frac{current\_batch}{decay\_batch}\right)^{power} + end\_lr
\end{aligned}
$$
Parameters

• optimizer (Optimizer) – Wrapped optimizer.

• decay_batch (int) – The number of steps over which the learning rate is decayed.

• end_learning_rate (float, optional) – The final learning rate. Defaults to 0.0001.

• power (float, optional) – The power of the polynomial. Defaults to 1.0.

• cycle (bool, optional) – If True, the decay restarts every decay_batch steps instead of stopping at end_learning_rate. Defaults to False.

• last_step (int, optional) – The index of the last step. Defaults to -1.

• verbose (bool, optional) – If True, prints a message to stdout for each update. Defaults to False.

For example:

```python
import oneflow as flow

...
polynomial_scheduler = flow.optim.lr_scheduler.PolynomialLR(
    optimizer, decay_batch=5, end_learning_rate=0.00001, power=2
)

for epoch in range(num_epoch):
    train(...)
    polynomial_scheduler.step()
```
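To make the two equations above concrete, here is a plain-Python sketch that evaluates them directly. `polynomial_lr` is a hypothetical helper for illustration, not part of the OneFlow API; the `max(1, ...)` guard at step 0 is an assumption to avoid division by zero, since the cycle equation is otherwise undefined there.

```python
import math

def polynomial_lr(base_lr, end_lr, power, decay_batch, current_batch, cycle):
    """Evaluate the polynomial decay equations (sketch, not OneFlow source)."""
    if cycle:
        # Restart the decay every `decay_batch` steps:
        # decay_batch = decay_batch * ceil(current_batch / decay_batch).
        # max(1, ...) is an assumed guard for current_batch == 0.
        db = decay_batch * max(1, math.ceil(current_batch / decay_batch))
    else:
        # Clamp so the rate stays at end_lr once decay_batch is reached.
        db = decay_batch
        current_batch = min(decay_batch, current_batch)
    return (base_lr - end_lr) * (1 - current_batch / db) ** power + end_lr

# With base_lr=0.1, power=1, decay_batch=5:
# cycle=False reaches end_lr at step 5 and stays there,
# while cycle=True restarts the decay over a 10-step window at step 7.
print(polynomial_lr(0.1, 0.0, 1.0, 5, 7, cycle=False))
print(polynomial_lr(0.1, 0.0, 1.0, 5, 7, cycle=True))
```

Note how `cycle=True` stretches the decay window to the next multiple of `decay_batch`, so the learning rate climbs back above `end_learning_rate` and decays again.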

__init__(optimizer, decay_batch: int, end_learning_rate: float = 0.0001, power: float = 1.0, cycle: bool = False, last_step: int = -1, verbose: bool = False)

Initialize self. See help(type(self)) for accurate signature.
