helios.scheduler.schedulers

Classes

CosineAnnealingRestartLR

A cosine annealing LR scheduler with restarts.

MultiStepRestartLR

A multi-step LR scheduler with restarts.

Module Contents

class helios.scheduler.schedulers.CosineAnnealingRestartLR(optimizer: torch.optim.Optimizer, periods: list[int], restart_weights: list[int] | None = None, eta_min: float = 0, last_epoch: int = -1)[source]

Bases: torch.optim.lr_scheduler.LRScheduler

A cosine annealing LR scheduler with restarts.

Example

Given

periods = [10, 10, 10, 10]
restart_weights = [1, 0.5, 0.5, 0.5]
eta_min = 1e-7

Then the scheduler will have 4 cosine cycles of 10 iterations each. At the 10th, 20th, and 30th iterations, the scheduler restarts with the corresponding weight from restart_weights.
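A minimal usage sketch of the values above, assuming an illustrative torch.nn.Linear model and SGD optimizer (any torch optimizer should work the same way):

import torch
from helios.scheduler.schedulers import CosineAnnealingRestartLR

# Illustrative model and optimizer; only used to drive the scheduler.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Four cosine cycles of 10 iterations each; the cycles beginning at
# iterations 10, 20, and 30 restart with a weight of 0.5.
scheduler = CosineAnnealingRestartLR(
    optimizer,
    periods=[10, 10, 10, 10],
    restart_weights=[1, 0.5, 0.5, 0.5],
    eta_min=1e-7,
)

for _ in range(40):
    optimizer.step()   # training step omitted for brevity
    scheduler.step()   # advance the schedule by one iteration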

Parameters:
  • optimizer – the optimizer.

  • periods – period for each cosine annealing cycle.

  • restart_weights – (optional) restart weights at each restart iteration.

  • eta_min – the minimum learning rate. Defaults to 0.

  • last_epoch – used in _LRScheduler. Defaults to -1.

get_lr()[source]

Return the current learning rate.

class helios.scheduler.schedulers.MultiStepRestartLR(optimizer: torch.optim.Optimizer, milestones: list[int], gamma: float = 0.1, restarts: list[int] | None = None, restart_weights: list[int] | None = None, last_epoch: int = -1)[source]

Bases: torch.optim.lr_scheduler.LRScheduler

A multi-step LR scheduler with restarts.

Parameters:
  • optimizer – torch optimizer.

  • milestones – iterations at which the learning rate is decreased.

  • gamma – multiplicative decrease factor applied at each milestone. Defaults to 0.1.

  • restarts – (optional) restart iterations.

  • restart_weights – (optional) restart weights at each restart iteration.

  • last_epoch – used in _LRScheduler. Defaults to -1.

get_lr()[source]

Return the current learning rate.
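A minimal usage sketch, assuming an illustrative torch.nn.Linear model and SGD optimizer; the milestone and restart values below are hypothetical and chosen only to show the parameters together:

import torch
from helios.scheduler.schedulers import MultiStepRestartLR

# Illustrative model and optimizer; only used to drive the scheduler.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

# Decrease the learning rate by gamma at iterations 30 and 60, and
# restart at iteration 50 with a restart weight of 1 (hypothetical values).
scheduler = MultiStepRestartLR(
    optimizer,
    milestones=[30, 60],
    gamma=0.1,
    restarts=[50],
    restart_weights=[1],
)

for _ in range(80):
    optimizer.step()   # training step omitted for brevity
    scheduler.step()   # advance the schedule by one iteration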