brainpy.optimizers module#

Optimizers#

Optimizer(lr[, train_vars, name])

Base optimizer class.

SGD(lr[, train_vars, name])

Stochastic gradient descent optimizer.

Momentum(lr[, train_vars, momentum, name])

Momentum optimizer.

MomentumNesterov(lr[, train_vars, momentum, ...])

Nesterov accelerated gradient optimizer [2].

Adagrad(lr[, train_vars, epsilon, name])

Optimizer that implements the Adagrad algorithm.

Adadelta([lr, train_vars, epsilon, rho, name])

Optimizer that implements the Adadelta algorithm.

RMSProp(lr[, train_vars, epsilon, rho, name])

Optimizer that implements the RMSprop algorithm.

Adam(lr[, train_vars, beta1, beta2, eps, name])

Optimizer that implements the Adam algorithm.

LARS(lr[, train_vars, momentum, ...])

Layer-wise adaptive rate scaling (LARS) optimizer [1].

Adan([lr, train_vars, betas, eps, ...])

Adan: Adaptive Nesterov Momentum algorithm for faster optimization of deep models [1].

AdamW(lr[, train_vars, beta1, beta2, eps, ...])

Adam with weight decay regularization [1].
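
All optimizers above follow the same usage pattern: build one with a learning rate and the trainable variables, compute gradients, then apply them. The sketch below illustrates that pattern; the dict-of-variables layout, the brainpy.math.grad call, and the optimizer's update() method reflect typical BrainPy usage and should be treated as assumptions that may differ between BrainPy versions.

    import brainpy as bp
    import brainpy.math as bm

    # A toy trainable parameter (bm.TrainVar marks a variable as trainable).
    w = bm.TrainVar(bm.ones(3))

    # Build an Adam optimizer over the trainable variables
    # (constructor signature from the table above).
    opt = bp.optimizers.Adam(lr=1e-3, train_vars={'w': w})

    # A scalar loss that depends on the trainable variable.
    def loss_fn():
        return bm.sum(w ** 2)

    # Differentiate the loss w.r.t. the registered variables and apply one update step.
    grad_fn = bm.grad(loss_fn, grad_vars={'w': w})
    grads = grad_fn()   # gradients keyed like grad_vars (assumed)
    opt.update(grads)   # assumed method name: applies the gradients in place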

Schedulers#

make_schedule(scalar_or_schedule)

Convert a scalar or an existing scheduler into a Scheduler instance.

Scheduler(lr)

The base class of learning rate schedulers.

Constant(lr)

Scheduler that keeps the learning rate constant.

ExponentialDecay(lr, decay_steps, decay_rate)

Scheduler that applies exponential decay to the learning rate.

InverseTimeDecay(lr, decay_steps, decay_rate)

Scheduler that applies inverse time decay to the learning rate.

PolynomialDecay(lr, decay_steps, final_lr[, ...])

Scheduler that applies polynomial decay from the initial learning rate to final_lr.

PiecewiseConstant(boundaries, values)

Scheduler that uses piecewise constant learning rates defined by boundaries and values.
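
A scheduler can be passed as the lr argument of any optimizer above, so the learning rate changes with the training step; make_schedule wraps a plain float into a constant schedule. A minimal sketch, assuming schedulers are accepted directly as lr and that ExponentialDecay follows the usual lr * decay_rate ** (step / decay_steps) rule:

    import brainpy as bp

    # Wrap a plain float into a constant schedule (assumed behaviour of make_schedule).
    const_lr = bp.optimizers.make_schedule(0.01)

    # Decaying schedule, assumed to compute lr * decay_rate ** (step / decay_steps).
    decayed_lr = bp.optimizers.ExponentialDecay(lr=0.1, decay_steps=100, decay_rate=0.9)

    # Use the schedule in place of a scalar learning rate (train_vars can be registered later).
    opt = bp.optimizers.SGD(lr=decayed_lr)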