class brainpy.algorithms.offline.LogisticRegression(learning_rate=0.1, gradient_descent=True, max_iter=4000, name=None)[source]#

Logistic regression method for offline training.

  • learning_rate (float) – The step length that will be taken when following the negative gradient during training.

  • gradient_descent (boolean) – Whether to use gradient descent during training. If False, the parameters are fit in a single batch by least squares.

  • max_iter (int) – The maximum number of iterations used to optimize the parameters.

  • name (str) – The name of the algorithm.
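The two training modes selected by ``gradient_descent`` can be sketched in plain NumPy. This is an illustration of the underlying algorithm only, not the BrainPy implementation; the function name ``fit_logistic`` and its default values simply mirror the parameters listed above.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, learning_rate=0.1, gradient_descent=True, max_iter=4000):
    """Toy offline trainer mirroring the two modes described above.

    Illustration only -- not BrainPy's actual code.
    """
    n_samples, n_features = X.shape
    if gradient_descent:
        # Follow the negative gradient of the cross-entropy loss.
        w = np.zeros(n_features)
        for _ in range(max_iter):
            grad = X.T @ (sigmoid(X @ w) - y) / n_samples
            w -= learning_rate * grad
        return w
    # Batch optimization: linear least squares on the targets.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Linearly separable toy data: label is 1 exactly when the feature is positive.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = fit_logistic(X, y)
preds = (sigmoid(X @ w) > 0.5).astype(float)
```

With either mode the fitted weight is positive, so thresholding the sigmoid at 0.5 recovers the labels on this toy data.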

__init__(learning_rate=0.1, gradient_descent=True, max_iter=4000, name=None)[source]#


__init__([learning_rate, gradient_descent, ...])

call(identifier, targets, inputs[, outputs])

The training procedure.

gradient_descent_solve(targets, inputs[, ...])

Solve the parameters by gradient descent.

init_weights(n_features, n_out)

Initialize weights randomly in [-1/N, 1/N].
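The initialization scheme above can be sketched as a uniform draw from [-1/N, 1/N], where N is the number of input features. This is an assumption-laden illustration of the scheme, not BrainPy's exact code; the ``rng`` argument is added here only for reproducibility.

```python
import numpy as np

def init_weights(n_features, n_out, rng=None):
    # Draw each weight uniformly from [-1/N, 1/N], with N = n_features.
    # Sketch of the initialization scheme, not BrainPy's implementation.
    rng = rng or np.random.default_rng(0)
    limit = 1.0 / n_features
    return rng.uniform(-limit, limit, size=(n_features, n_out))

W = init_weights(10, 2)  # shape (10, 2), all entries within [-0.1, 0.1]
```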

initialize(identifier, *args, **kwargs)

load_states(filename[, verbose])

Load the model states.

nodes([method, level, include_self])

Collect all children nodes.

predict(W, X)

Predict outputs for inputs X using the weight matrix W.

register_implicit_nodes(*nodes, **named_nodes)

register_implicit_vars(*variables, ...)

save_states(filename[, variables])

Save the model states.

train_vars([method, level, include_self])

The shortcut for retrieving all trainable variables.

unique_name([name, type_])

Get the unique name for this object.

vars([method, level, include_self])

Collect all variables in this node and the children nodes.



name

Name of the model.