13 votes

The closest implementation in Haskell I have seen is the forward-mode fad package at http://hackage.haskell.org/packages/archive/fad/1.0/doc/html/Numeric-FAD.html.

The closest related research appears to be reverse mode for another functional language related to Scheme, the Stalingrad project at http://www.bcl.hamilton.ie/~qobi/stalingrad/.

I see reverse mode in Haskell as something of a holy grail for many tasks, in the hope that it could exploit Haskell's nested data parallelism to gain a nice speedup in heavy numerical optimization.

A possible alternative: I've had quite a bit of success optimising large systems (e.g. 10,000-dimensional) with forward AD. (The code was C++, but largely written in a pure functional style.) The trick was to exploit the sparsity of my problem so that I could use a sparse type to represent the derivatives. It was faster than the reverse-AD version for my problem (again written in C++, but not at all pure). – sigfpe (4 votes)

Really? I'm wondering how to accomplish such a thing. Any leads? – Ian Fiske
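
To illustrate the kind of thing sigfpe describes, here is a minimal sketch (my own, hypothetical, not his actual code) of a forward-mode value whose derivative part is a sparse map from variable index to partial derivative, so variables an expression never touches cost nothing:

    import qualified Data.Map.Strict as M

    -- Forward-mode value with a sparse derivative part: only the
    -- variables an expression actually depends on appear in the map.
    data Sparse = Sparse { primal :: Double, partials :: M.Map Int Double }

    -- Introduce input variable number i with value x and tangent 1.
    var :: Int -> Double -> Sparse
    var i x = Sparse x (M.singleton i 1)

    instance Num Sparse where
      Sparse x dx + Sparse y dy = Sparse (x + y) (M.unionWith (+) dx dy)
      Sparse x dx * Sparse y dy =             -- product rule: y*dx + x*dy
        Sparse (x * y) (M.unionWith (+) (fmap (* y) dx) (fmap (* x) dy))
      negate (Sparse x dx) = Sparse (negate x) (fmap negate dx)
      abs    (Sparse x dx) = Sparse (abs x) (fmap (* signum x) dx)
      signum (Sparse x _)  = Sparse (signum x) M.empty
      fromInteger n        = Sparse (fromInteger n) M.empty

When the problem is sparse, a single evaluation over Sparse yields all the non-zero partials at once, with each operation paying only for the partials that are actually live, which is presumably how a forward pass can beat reverse AD on a 10,000-dimensional problem.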

4 Answers

56 votes

In response to this question, I've uploaded a package named ad to Hackage for handling reverse-mode automatic differentiation in Haskell.
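
For instance, computing a gradient with the package looks roughly like this (checked against later versions of ad; the exact module layout has shifted a little over time):

    import Numeric.AD (grad)

    -- gradient of f(x, y) = x^2 * y at (3, 2); prints [12.0, 9.0]
    main :: IO ()
    main = print (grad (\[x, y] -> x * x * y) [3, 2 :: Double])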

Internally, it leverages a trick from Andy Gill's Kansas Lava to observe sharing in the tape it records for backpropagation purposes, and uses type-level branding to avoid confusing sensitivities.

I've tried to keep the API relatively close to that of Barak Pearlmutter and Jeffrey Mark Siskind's fad package, but I couldn't resist making a couple of minor tweaks here and there for generality.

I still need to go through and finish up the remaining unimplemented fad combinators, figure out a nice way to build a reverse-mode AD tower, validate that I didn't screw up my recollection of basic calculus, and provide a nice API for using this approach to get local reverse-mode checkpoints in an otherwise forward-mode AD program, but I am quite happy with how things have progressed thus far.

5 votes

We have a bunch of forward-mode AD implementations (I even have one in my monoids library!), but reverse-mode AD for all of Haskell seems to be intractable.

Sadly, while Pearlmutter and Siskind give a translation for a lambda calculus, it doesn't map onto something you can do for arbitrary Haskell lambdas: you don't get the right introspection properties, and given the way the shapes of the types change in the translation, you don't get something that can be packed into a monad, arrow, or other control structure.

I had a go at it via a series of email exchanges with Pearlmutter, but ultimately the best I was able to obtain was a reverse-mode AD solution for a small EDSL in Haskell, and not a solution for Haskell itself.
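
For concreteness, here is a toy sketch of the general idea (my own illustration, not the actual code from those exchanges): once the program is reified as a data type, propagating sensitivities backwards is straightforward.

    import qualified Data.Map.Strict as M

    -- A tiny expression EDSL; Var i refers to the i-th input.
    data Expr = Var Int | Lit Double | Add Expr Expr | Mul Expr Expr

    eval :: [Double] -> Expr -> Double
    eval env (Var i)   = env !! i
    eval _   (Lit c)   = c
    eval env (Add a b) = eval env a + eval env b
    eval env (Mul a b) = eval env a * eval env b

    -- Push a sensitivity s backwards through the expression,
    -- collecting a partial derivative for each variable occurrence.
    backprop :: [Double] -> Double -> Expr -> [(Int, Double)]
    backprop _   s (Var i)   = [(i, s)]
    backprop _   _ (Lit _)   = []
    backprop env s (Add a b) = backprop env s a ++ backprop env s b
    backprop env s (Mul a b) =
      backprop env (s * eval env b) a ++ backprop env (s * eval env a) b

    -- Sum the contributions per variable to get the gradient.
    gradE :: [Double] -> Expr -> M.Map Int Double
    gradE env = M.fromListWith (+) . backprop env 1

This naive version re-evaluates subexpressions and ignores sharing, which is exactly what the observable-sharing machinery in a serious implementation has to address.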

2 votes

Not that I'm aware of. I do know that some Haskell folks are interested in automatic differentiation, but some quick digging turned up little more than short asides mentioning reverse mode; I expect you already found the same material I did.

I also note that the fad package and the Stalingrad project you found are in fact the work of the same two people, and that at least Prof. Pearlmutter has posted to the haskell-cafe mailing list. You may want to consider contacting him directly about his work; it's possible he has something in progress, or hit serious obstacles while attempting to implement reverse-mode AD.

Sorry I couldn't turn up anything more useful; if someone else wants to dig further, at least the links above are a starting point.

2 votes

I think forward mode is the way to go in Haskell. You shouldn't be able to do reverse mode on arbitrary functions, as Edward pointed out. But you responded that you should be able to do it on certain constrained functions, and those constraints lend themselves readily to forward mode. For example, if you have a function:

foo :: Num a => a -> a -> a

Then you can instantiate a with a differentiable type, and thus differentiate foo in forward mode.
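
A minimal sketch of that idea, with an illustrative Dual type and a made-up body for foo (the question gives only its signature):

    -- Dual number: primal value paired with its tangent (derivative).
    data Dual = Dual Double Double

    instance Num Dual where
      Dual x x' + Dual y y' = Dual (x + y) (x' + y')
      Dual x x' * Dual y y' = Dual (x * y) (x' * y + x * y')
      negate (Dual x x')    = Dual (negate x) (negate x')
      abs    (Dual x x')    = Dual (abs x) (signum x * x')
      signum (Dual x _)     = Dual (signum x) 0
      fromInteger n         = Dual (fromInteger n) 0

    foo :: Num a => a -> a -> a
    foo x y = x * x + y   -- hypothetical body

    -- Partial derivative of foo in its first argument at (x, y):
    -- seed x's tangent with 1 and y's with 0.
    dFooDx :: Double -> Double -> Double
    dFooDx x y = case foo (Dual x 1) (Dual y 0) of Dual _ d -> d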

See the vector-space library on Hackage for very elegant forward-mode automatic differentiation. It might not be entirely clear how to use it at first, so read the paper about it: Beautiful Differentiation by Conal Elliott.