THIS PROJECT HAS NOW BEEN MERGED INTO AUTOGRAD, WHICH SUPPORTS FORWARD MODE DIFFERENTIATION AS OF VERSION 1.2.
This package adds forward mode differentiation to the already fantastic Autograd. It provides efficient Jacobian-vector product computation (analogous to Theano's Rop) and a more efficient Hessian-vector product, both fully compatible with Autograd's grad operator and its other convenience wrappers.
Autograd-forward enables you to do things like:
```python
In : from autograd_forward import jacobian_vector_product

In : import autograd.numpy as np

In : def f(x):
...:     return x**2 + 1
...:

In : jvp = jacobian_vector_product(f)

In : x = np.array([1., 2., 3.])

In : v = np.array([4., 5., 6.])

In : jvp(x, v)
Out: array([  8.,  20.,  36.])
```
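As a quick sanity check (plain NumPy, not using the package): for this f the Jacobian is diag(2x), so the JVP is just 2*x*v elementwise, and a finite-difference quotient along v should agree:

```python
import numpy as np

x = np.array([1., 2., 3.])
v = np.array([4., 5., 6.])

# Jacobian of f(x) = x**2 + 1 is diag(2*x), so J @ v is 2*x*v elementwise.
analytic_jvp = 2 * x * v  # gives 8., 20., 36., matching the Out above

# Finite-difference check: (f(x + eps*v) - f(x)) / eps approximates J @ v.
f = lambda x: x**2 + 1
eps = 1e-6
fd_jvp = (f(x + eps * v) - f(x)) / eps
```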
Mixing forward mode with Autograd's reverse mode operators 'just works':
```python
In : from autograd import grad

In : scalar_output_fun = lambda x, v: np.sum(jvp(x, v))

In : grad(scalar_output_fun)(x, v)
Out: array([  8.,  10.,  12.])
```
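This output is easy to verify by hand (a plain-NumPy check, not using the package): since jvp(x, v) is 2*x*v elementwise, the scalar function is sum(2*x*v), whose gradient with respect to x is simply 2*v:

```python
import numpy as np

x = np.array([1., 2., 3.])
v = np.array([4., 5., 6.])

# scalar_output_fun(x, v) = sum(2*x*v); its gradient w.r.t. x is 2*v.
expected_grad = 2 * v  # gives 8., 10., 12., matching the Out above

# Finite-difference check of each partial derivative w.r.t. x.
h = lambda x: np.sum(2 * x * v)
eps = 1e-6
fd_grad = np.array([(h(x + eps * e) - h(x)) / eps for e in np.eye(3)])
```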
For functions which output a scalar, you can calculate Hessian vector products by doing:
```python
In : def g(x):
...:     return np.sum(x**3)
...:

In : hvp = jacobian_vector_product(grad(g))

In : hvp(x, v)
Out: array([  24.,   60.,  108.])
```
Or you can use autograd_forward.hessian_vector_product, with the mode keyword argument set to 'forward'.
This package was written and is maintained by Jamie Townsend.
Right now, autograd-forward depends on the latest development version of Autograd on GitHub. You can install this version of Autograd with
pip install --upgrade git+https://github.com/HIPS/autograd.git
You can then install autograd-forward with
pip install --upgrade git+https://github.com/BB-UCL/autograd-forward.git
So far I've implemented forward derivatives for all of the NumPy primitives covered by Autograd, except those in numpy.linalg, as well as some of the SciPy primitives. Please file an issue if there's something you need to differentiate that isn't yet implemented.