brainpy.optimizers module#

Optimizers#

Optimizer(lr[, train_vars, name])

Base optimizer class.

SGD(lr[, train_vars, name])

Stochastic gradient descent optimizer.

Momentum(lr[, train_vars, momentum, name])

Momentum optimizer.

MomentumNesterov(lr[, train_vars, momentum, ...])

Nesterov accelerated gradient optimizer [2].

Adagrad(lr[, train_vars, epsilon, name])

Optimizer that implements the Adagrad algorithm.

Adadelta([train_vars, lr, epsilon, rho, name])

Optimizer that implements the Adadelta algorithm.

RMSProp(lr[, train_vars, epsilon, rho, name])

Optimizer that implements the RMSprop algorithm.

Adam(lr[, train_vars, beta1, beta2, eps, name])

Optimizer that implements the Adam algorithm.

LARS(lr[, train_vars, momentum, ...])

Layer-wise adaptive rate scaling (LARS) optimizer.
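
The entries above share a common pattern: construct the optimizer with a learning rate and the variables to train, then apply gradients step by step. The sketch below illustrates this with Adam. The constructor signature comes from this page; bm.TrainVar, bm.grad, and the update(grads) call are assumptions about the surrounding BrainPy API and may differ between versions.

import brainpy.math as bm
from brainpy.optimizers import Adam

# Trainable parameters of a toy linear model (bm.TrainVar is assumed).
w = bm.TrainVar(bm.zeros((3,)))
b = bm.TrainVar(bm.zeros(1))

def loss(x, y):
    pred = x @ w + b
    return bm.mean((pred - y) ** 2)

# Gradients with respect to the trainable variables (assumed bm.grad API).
grad_fn = bm.grad(loss, grad_vars={'w': w, 'b': b})

opt = Adam(lr=1e-3, train_vars={'w': w, 'b': b})

x = bm.random.rand(8, 3)
y = bm.random.rand(8)
grads = grad_fn(x, y)  # dict of gradients, keyed like train_vars
opt.update(grads)      # one optimization step (assumed method name)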

Schedulers#

make_schedule(scalar_or_schedule)

Convert a scalar learning rate or an existing scheduler into a Scheduler instance.

Scheduler(lr)

Base class for learning rate schedulers.

Constant(lr)

Scheduler that keeps the learning rate constant.

ExponentialDecay(lr, decay_steps, decay_rate)

Scheduler that applies exponential decay to the learning rate.

InverseTimeDecay(lr, decay_steps, decay_rate)

Scheduler that applies inverse-time decay to the learning rate.

PolynomialDecay(lr, decay_steps, final_lr[, ...])

Scheduler that applies polynomial decay from the initial to the final learning rate.

PiecewiseConstant(boundaries, values)

Scheduler that uses piecewise-constant learning rates between the given step boundaries.
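
Schedulers replace a fixed learning rate with one that changes over training steps. A minimal sketch follows, using the constructor signatures from this page; passing a scheduler instance wherever an optimizer expects lr, and make_schedule wrapping a plain scalar into a constant scheduler, are assumptions.

from brainpy.optimizers import SGD, ExponentialDecay, make_schedule

# Decay the learning rate by a factor of 0.9 every 1000 steps.
lr = ExponentialDecay(lr=0.1, decay_steps=1000, decay_rate=0.9)
opt = SGD(lr=lr)

# make_schedule accepts either a scalar or an existing scheduler
# (scalar wrapping is assumed), so optimizer code can treat constant
# and scheduled learning rates uniformly.
constant_lr = make_schedule(0.01)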