This is both a proposal and a call for interested undergraduate and 
graduate students:

Automatic differentiation is a technique for computing exact numerical 
derivatives of user-provided code, as opposed to finite-difference 
approximations, which introduce approximation error. These techniques have 
many applications in statistics, machine learning, optimization, and other 
fields. Julia as a language is particularly well suited to implementing 
automatic differentiation, and its existing capabilities already go beyond 
those of SciPy and MATLAB. We propose a project with the following 
components:

1. Experiment with the new fast tuple and SIMD features of Julia 0.4 to 
develop a blazing-fast stack-allocated implementation of DualNumbers with 
multiple epsilon components (a rough sketch of the idea appears after this 
list). Integrate with existing packages like Optim, JuMP, NLsolve, etc., 
and measure the performance gains over existing implementations.

2. Combine this work with the ForwardDiff package, which aims to provide a 
unified interface to different techniques for forward-mode automatic 
differentiation, including higher-order derivatives (a short usage sketch 
follows the list).

3. Time permitting, take a step towards reverse-mode automatic 
differentiation. Possible projects include developing a new implementation 
of reverse-mode AD based on the expression-graph format used by JuMP, or 
contributing to existing packages such as ReverseDiffSource and 
ReverseDiffOverload (a toy tape-based sketch is included below).
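
To make component 1 concrete, here is a rough sketch (in Julia 0.4 syntax, 
with an illustrative type name rather than the actual implementation) of a 
stack-allocated dual number carrying multiple epsilon components, where a 
tuple holds one partial derivative per input direction:

    immutable MultiDual{N,T<:Real}
        value::T              # function value
        partials::NTuple{N,T} # one epsilon component per input direction
    end

    import Base: +, *

    # d(x+y) = dx + dy
    +(a::MultiDual, b::MultiDual) =
        MultiDual(a.value + b.value, map(+, a.partials, b.partials))

    # d(x*y) = x*dy + y*dx
    *(a::MultiDual, b::MultiDual) =
        MultiDual(a.value * b.value,
                  map((pa, pb) -> a.value*pb + b.value*pa,
                      a.partials, b.partials))

    # Seed each input with a unit epsilon to get the gradient of
    # f(x, y) = x*y + x at (2.0, 3.0) in a single forward pass.
    x = MultiDual(2.0, (1.0, 0.0))
    y = MultiDual(3.0, (0.0, 1.0))
    f = x*y + x
    println(f.value)     # 8.0
    println(f.partials)  # (4.0, 2.0), i.e. (df/dx, df/dy)

Because the epsilon components live in a fixed-size tuple inside an 
immutable type, the whole value can stay on the stack, which is exactly 
where the fast tuple and SIMD work in Julia 0.4 should pay off.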
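
For component 2, the goal is that usage stays as simple as the sketch 
below; note that the gradient/hessian calls shown here follow ForwardDiff's 
currently documented interface, and the exact API available at the time of 
this proposal may differ:

    using ForwardDiff

    # A smooth test function of a vector argument.
    rosenbrock(x) = (1.0 - x[1])^2 + 100.0*(x[2] - x[1]^2)^2

    x0 = [0.5, 0.5]
    g = ForwardDiff.gradient(rosenbrock, x0)  # exact gradient, forward mode
    H = ForwardDiff.hessian(rosenbrock, x0)   # exact Hessian (higher order)
    println(g)
    println(H)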
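
For component 3, the flavor of reverse mode is captured by the toy 
tape-based sketch below (illustrative only; not the JuMP expression-graph 
format nor the API of the existing packages). Every operation records a 
node with its local partial derivatives, and a single reverse sweep over 
the tape accumulates the gradient:

    # Each operation appends a Node to a global tape in evaluation order.
    type Node
        value::Float64
        parents::Vector{Tuple{Int,Float64}}  # (tape index, local partial)
        adjoint::Float64
    end

    const TAPE = Node[]

    immutable Var       # a variable is just an index into the tape
        idx::Int
    end

    function record!(value::Float64, parents)
        push!(TAPE, Node(value, parents, 0.0))
        return Var(length(TAPE))
    end

    variable(v::Float64) = record!(v, Tuple{Int,Float64}[])
    val(x::Var) = TAPE[x.idx].value

    import Base: +, *
    +(a::Var, b::Var) = record!(val(a) + val(b),
                                [(a.idx, 1.0), (b.idx, 1.0)])
    *(a::Var, b::Var) = record!(val(a) * val(b),
                                [(a.idx, val(b)), (b.idx, val(a))])

    # Reverse sweep: seed the output adjoint, then walk the tape backwards,
    # pushing each node's adjoint onto its parents.
    function backprop!(out::Var)
        TAPE[out.idx].adjoint = 1.0
        for i in length(TAPE):-1:1
            n = TAPE[i]
            for (p, d) in n.parents
                TAPE[p].adjoint += n.adjoint * d
            end
        end
    end

    x = variable(2.0); y = variable(3.0)
    z = x*y + x                  # records the expression graph on the tape
    backprop!(z)
    println((TAPE[x.idx].adjoint, TAPE[y.idx].adjoint))  # (4.0, 2.0)

The appeal of reverse mode is that this single backward sweep yields all 
partial derivatives at once, regardless of the number of inputs.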

There are quite a number of interesting projects in this area (some with 
avenues for publication), so we can adjust the work according to the 
student's interests. An ideal student should be interested in experimenting 
with state-of-the-art techniques to make code fast. No mathematical 
background beyond calculus is needed. See juliadiff.org for more info.

Co-mentors: Miles Lubin and Theodore Papamarkou

If this sounds cool and interesting to you, do get in touch!
