Further to the recent discussion on lazy evaluation and numba, I have moved what I was doing into a new project:
PyAutoDiff: https://github.com/jaberg/pyautodiff

It currently works by executing CPython bytecode with a numpy-aware engine that builds a symbolic expression graph with Theano, so you can do, for example:

    >>> import autodiff, numpy as np
    >>> autodiff.fmin_l_bfgs_b(lambda x: (x + 1) ** 2, [np.zeros(())])

... and you'll see `[array(-1.0)]` printed out.

In the future, I think it should be able to export the gradient-computing function as bytecode, which could then be optimized by e.g. numba or a Theano bytecode front-end. For now it just compiles and runs the Theano graph that it built.

It's still pretty rough (you'll see if you look at the code!), but I'm excited about it.

- James

_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
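As a sanity check on the toy objective above (a minimal sketch in plain Python, not using pyautodiff or Theano): the gradient that symbolic differentiation would derive for f(x) = (x + 1) ** 2 is f'(x) = 2 * (x + 1), and plain gradient descent on that derivative reaches the same minimizer, x = -1, that the fmin_l_bfgs_b example reports.

```python
# Hand-derived gradient of the toy objective f(x) = (x + 1) ** 2;
# this is the derivative a symbolic engine like Theano would produce.
def grad(x):
    return 2.0 * (x + 1.0)

# Simple fixed-step gradient descent from the same starting point
# (zero) as the mailing-list example.
x = 0.0
for _ in range(200):
    x -= 0.1 * grad(x)

print(x)  # converges to approximately -1.0
```

This only illustrates the math behind the one-liner; the point of PyAutoDiff is that the gradient function is derived automatically from the bytecode of the lambda rather than written by hand.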