MultiStepLR


class brainpy.optim.MultiStepLR(lr, milestones, gamma=0.1, last_epoch=-1)

Decays the learning rate of each parameter group by gamma once the number of epochs reaches one of the milestones. Notice that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler. When last_epoch=-1, the initial learning rate is set to lr.

Parameters:
  • lr (float) – Initial learning rate.

  • milestones (sequence of int) – List of epoch indices. Must be increasing.

  • gamma (float) – Multiplicative factor of learning rate decay. Default: 0.1.

  • last_epoch (int) – The index of last epoch. Default: -1.
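The decay rule above can be sketched in plain Python: the learning rate at a given epoch is the initial lr multiplied by gamma once for every milestone already passed. This is a minimal stand-alone illustration of the schedule, not the brainpy implementation itself; the helper name multistep_lr is hypothetical.

```python
from bisect import bisect_right

def multistep_lr(lr, milestones, gamma, epoch):
    """Learning rate at `epoch` under a multi-step schedule.

    Each milestone that `epoch` has reached multiplies the
    initial rate by `gamma` once. `milestones` must be increasing.
    """
    return lr * gamma ** bisect_right(milestones, epoch)

# lr=0.1, milestones=[2, 4], gamma=0.1:
# epochs 0-1 keep 0.1, epochs 2-3 use 0.01, epochs 4+ use 0.001
rates = [multistep_lr(0.1, [2, 4], 0.1, epoch) for epoch in range(6)]
```

With these (illustrative) settings the rate drops by a factor of ten at epoch 2 and again at epoch 4, matching the behavior described above.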