brainpy.math.autograd.jacrev
brainpy.math.autograd.jacrev(func, grad_vars=None, dyn_vars=None, argnums=None, holomorphic=False, allow_int=False, has_aux=None, return_value=False)
Extends automatic Jacobian computation (reverse-mode) of func to classes. This function extends the official JAX jacrev so that automatic Jacobian computation works on both plain functions and class functions. Moreover, it supports returning the loss value ("return_value") and returning auxiliary data ("has_aux").

As with brainpy.math.grad, the returns of brainpy.math.jacrev differ depending on the argument settings.
When "grad_vars" is None:

- "has_aux=False" + "return_value=False" => arg_grads.
- "has_aux=True" + "return_value=False" => (arg_grads, aux_data).
- "has_aux=False" + "return_value=True" => (arg_grads, loss_value).
- "has_aux=True" + "return_value=True" => (arg_grads, loss_value, aux_data).
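For this first case, the behavior matches the underlying JAX primitive: with no grad_vars, only the gradients with respect to the selected positional arguments are returned. A minimal sketch using jax.jacrev directly (the function and input values are illustrative):

```python
import jax
import jax.numpy as jnp

# f maps R^2 -> R^2, so its Jacobian at a point is a 2x2 matrix.
def f(x):
    return jnp.array([x[0] ** 2, x[0] * x[1]])

x = jnp.array([1.0, 2.0])

# "has_aux=False" + "return_value=False" => arg_grads only.
arg_grads = jax.jacrev(f)(x)
# Entry [i, j] is d f_i / d x_j:
#   [[2*x0, 0 ],     [[2., 0.],
#    [x1,   x0]]  =   [2., 1.]]
```

brainpy.math.jacrev called on a plain function with the default settings is documented to return the same arg_grads structure.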
When "grad_vars" is not None and "argnums" is None:

- "has_aux=False" + "return_value=False" => var_grads.
- "has_aux=True" + "return_value=False" => (var_grads, aux_data).
- "has_aux=False" + "return_value=True" => (var_grads, loss_value).
- "has_aux=True" + "return_value=True" => (var_grads, loss_value, aux_data).
When "grad_vars" is not None and "argnums" is not None:

- "has_aux=False" + "return_value=False" => (var_grads, arg_grads).
- "has_aux=True" + "return_value=False" => ((var_grads, arg_grads), aux_data).
- "has_aux=False" + "return_value=True" => ((var_grads, arg_grads), loss_value).
- "has_aux=True" + "return_value=True" => ((var_grads, arg_grads), loss_value, aux_data).
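The "has_aux" convention follows JAX: the wrapped function returns a pair (output, aux_data), only the first element is differentiated, and the auxiliary data is passed through unchanged. A hedged sketch of that convention with jax.jacrev (the function and the aux payload are illustrative):

```python
import jax
import jax.numpy as jnp

def f_with_aux(x):
    y = jnp.sin(x)
    # The second element of the pair is not differentiated; it is
    # returned alongside the Jacobian as auxiliary data.
    return y, {"inputs_seen": x.shape}

x = jnp.array([0.0, jnp.pi / 2])

# "has_aux=True" => (arg_grads, aux_data).
jac, aux = jax.jacrev(f_with_aux, has_aux=True)(x)
# sin is elementwise, so jac is diagonal with cos(x) on the diagonal,
# and aux is the dict returned by f_with_aux, untouched.
```

brainpy.math.jacrev adds "return_value" on top of this: when it is also True, the loss value (the first element of the function's output) is appended to the returned tuple as listed above.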
- Parameters
  - func (callable) – The function whose Jacobian is to be computed.
  - dyn_vars (optional, JaxArray, sequence of JaxArray, dict) – The dynamically changed variables used in func.
  - grad_vars (optional, JaxArray, sequence of JaxArray, dict) – The variables in func to take the gradients with respect to.
  - has_aux (optional, bool) – Indicates whether func returns a pair where the first element is considered the output of the mathematical function to be differentiated and the second element is auxiliary data. Default False.
  - return_value (bool) – Whether to return the loss value. Default False.
  - argnums (optional, int or sequence of int) – Specifies which positional argument(s) to differentiate with respect to. Default 0.
  - holomorphic (optional, bool) – Indicates whether func is promised to be holomorphic. Default False.
  - allow_int (optional, bool) – Whether to allow differentiating with respect to integer-valued inputs. The gradient of an integer input will have a trivial vector-space dtype (float0). Default False.