5
votes

I'm learning functional programming with Clojure, and want to deepen my theoretical understanding of the functional paradigm (not just the syntax of Clojure).

I'm looking for axioms or formulas for how functional techniques like recursion, map, reduce, cons, first and rest relate to each other, which is derivable/composable from which, and what the ultimate axiom behind everything is.

For example, I realized map could be implemented using only recur, first, rest and cons, plus of course the mapping function itself passed to map.
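Roughly something along these lines (just a sketch; my-map is a name to avoid shadowing clojure.core/map, and loop, empty? and reverse sneak in beyond the primitives listed above):

(defn my-map [f coll]
  (loop [xs coll, acc '()]
    (if (empty? xs)
      (reverse acc)                                ;; acc was built back-to-front
      (recur (rest xs) (cons (f (first xs)) acc)))))

;; (my-map inc [1 2 3]) => (2 3 4)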

After that, I also realized map could also be implemented using reduce, and in turn reduce could be implemented using recur, first and rest. filter could also be implemented with reduce.
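Sketched out, the derivations look roughly like this (the my-* names are only there to avoid shadowing the built-ins, and these versions return vectors rather than lazy seqs):

;; reduce built from loop/recur, first and rest
(defn my-reduce [f init coll]
  (loop [acc init, xs coll]
    (if (empty? xs)
      acc
      (recur (f acc (first xs)) (rest xs)))))

;; map expressed in terms of reduce
(defn map-via-reduce [f coll]
  (my-reduce (fn [acc x] (conj acc (f x))) [] coll))

;; filter expressed in terms of reduce
(defn filter-via-reduce [pred coll]
  (my-reduce (fn [acc x] (if (pred x) (conj acc x) acc)) [] coll))

;; (map-via-reduce inc [1 2 3])       => [2 3 4]
;; (filter-via-reduce odd? [1 2 3 4]) => [1 3]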

I feel like I'm starting to wrap my head around functional programming, but it is still hard to see what the ultimate building blocks are, i.e. what the minimum set of abstractions or keywords is from which arbitrary functions can be composed. With the map example, the second method uses one less abstraction to reach the same goal. So, what are the ultimate axioms of the functional paradigm that would help me see the big picture?

In this case I would recommend you check out this book: mitpress.mit.edu/books/little-schemer It is an amazing intro to FP, showing you how to build things using just a small number of basic elements. The classic SICP can also help a lot. Both of them use Lisp (Scheme, to be exact), so it is easy to transfer your understanding to Clojure - leetwinski
Functional programming is in large part rooted in the Lambda Calculus, and the core axiom is as amalloy states below. - Frank C.
Don't forget that map can also be replaced with for - Alan Thompson

3 Answers

5
votes

From lambda (called fn in clojure), you can derive anything else. For example, let's do the classic exercise of deriving cons, first, and rest with fn:

(defn cons [x y]
  (fn [f]
    (f x y)))

(defn first [coll]
  (coll (fn [x y] x)))

(defn rest [coll]
  (coll (fn [x y] y)))

So if you want a set of axioms for functional programming, there's just one: lambda the ultimate axiom. For details on how derivations of other features can be done, see such articles as:

  • The classic Lambda the Ultimate papers.
  • Programming with Nothing, a more recent approach. This uses Ruby syntax, but it doesn't really matter because the only language feature it uses is lambda.
  • SICP also has a section on deriving car/cdr/cons from lambda, as part of explaining the value of abstraction barriers: the implementation doesn't matter, as long as it satisfies the contracts you've established. And of course SICP is a good read if you're interested in the foundations of programming in general.

It seems from the comments like there is a lot of confusion about this answer; my fault for not explaining it for people who haven't seen it before.

The idea is not to reimplement all of clojure's built-in first/rest functionality, which are advanced polymorphic things that work on all sorts of sequences. Rather, we implement our own cons, first and rest: three functions that work together to let you build collections by satisfying the contract that

(= x (first (cons x y)))
(= y (rest (cons x y)))

You can build more complicated things like clojure's actual first/rest with just lambda, but you have to invent a whole type system first so it's a lot more involved.

Here's an example REPL session showing what this exercise is intended to demonstrate:

(defn cons [x y]
  (fn [f]
    (f x y)))

(defn first [coll]
  (coll (fn [x y] x)))

(defn rest [coll]
  (coll (fn [x y] y)))
user> (def integers (cons 0 (cons 1 (cons 2 (cons 3 nil)))))
#'user/integers
user> integers
#object[user$cons$fn__2108 0x3fb178bd "user$cons$fn__2108@3fb178bd"]
user> (first integers)
0
user> (first (rest (rest integers)))
2
1
votes

Start by understanding how a list is constructed in most functional languages, that is, why it makes so much sense to look at a list as a first and a rest. This recursive definition is key to understanding the recursive mechanism by which you process them.
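For instance, in Clojure terms (a rough sketch of the idea), the list '(1 2 3) is equivalent to consing each first onto the rest, all the way down to the empty list:

(cons 1 (cons 2 (cons 3 '())))   ;; => (1 2 3)

;; (first '(1 2 3)) => 1
;; (rest '(1 2 3))  => (2 3)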

The way I first grokked map/filter/fold etc. was actually through Haskell, which has the benefit of making the types of things explicit. That made a lot of sense to me as a beginner.

For example, map has the signature (a -> b) -> [a] -> [b], read as: if you have a function that takes a value of type a and turns it into a value of type b, and you give map a list of a's, it simply returns you a list of b's.

The one you should really take your time to understand is the fold (both left and right), of which reduce is a special case in a typed world. Once you feel ready, I would suggest going over the good old A tutorial on the universality and expressiveness of fold.
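Sketched in Clojure (the names fold-left and fold-right are mine, not clojure.core's), the two directions look roughly like this:

;; Left fold: consumes the list front to back, carrying an accumulator.
;; This is essentially what clojure.core/reduce does.
(defn fold-left [f init coll]
  (if (empty? coll)
    init
    (recur f (f init (first coll)) (rest coll))))

;; Right fold: recurses to the end first, then combines on the way back out.
(defn fold-right [f init coll]
  (if (empty? coll)
    init
    (f (first coll) (fold-right f init (rest coll)))))

;; (fold-left - 0 [1 2 3])  => -6   i.e. ((0 - 1) - 2) - 3
;; (fold-right - 0 [1 2 3]) => 2    i.e. 1 - (2 - (3 - 0))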

I highly encourage you to try to implement everything you mentioned; your understanding of these basic building blocks and their dependencies will improve greatly. Implement reduce (and both folds) in terms of recur, and implement map, filter, take, etc. both in terms of recur and in terms of reduce.
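As one example of that exercise, here is a hedged sketch of take in terms of reduce, using Clojure's reduced to stop the reduction early (take-via-reduce is just an illustrative name):

(defn take-via-reduce [n coll]
  (reduce (fn [acc x]
            (if (< (count acc) n)
              (conj acc x)
              (reduced acc)))        ;; short-circuit once we have n items
          []
          coll))

;; (take-via-reduce 2 [1 2 3 4 5]) => [1 2]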

1
votes

I don't think you can find a simple set of axioms analogous to those of probability theory, for example. For probability, there are only 3 fundamental axioms:

  • P[A] >= 0 for every event A
  • P["any event"] = 1
  • P[ A or B ] = P[A] + P[B] if A & B are mutually exclusive

It is quite surprising that everything in probability & statistics can be derived from these three fundamental assumptions.

"Functional Programming" is not so well defined. In fact, nearly every book on the subject starts off with the observation that if you asked 100 "experts" to define functional programming you would receive 100 mutually incompatible answers. This statement is only partly in jest.

The only thing you can really say about functional programming is that it emphasizes functions more than traditional or "non-functional" languages. This is really more of a "goal" for a functional language or functional programming style rather than a yes/no observation.

The goal of functional programming is the same as always: cost savings through greater simplicity & reliability. Of course, every language & technique since the beginning of computing has had this same game plan. FP aims to achieve it primarily by:

  • Reduce use of mutable variables
  • Increase use of functions instead of manual loops

Note that it says "reduce" & "increase", not "only" & "never". Deciding what that means is a judgement-call, and the answer will change depending on the problem at hand and the person you ask.
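As one small illustration of those two points (an assumed Clojure example): summing the squares of the even numbers, first with an explicit loop and accumulator, then by composing functions over the data.

;; Manual loop with an explicit accumulator.
(defn sum-even-squares-loop [nums]
  (loop [xs nums, total 0]
    (if (empty? xs)
      total
      (let [x (first xs)]
        (recur (rest xs)
               (if (even? x) (+ total (* x x)) total))))))

;; The same computation expressed by composing functions.
(defn sum-even-squares [nums]
  (reduce + 0 (map #(* % %) (filter even? nums))))

;; (sum-even-squares [1 2 3 4]) => 20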

Keep in mind that both problems & people change over time. The answer that is the "best" trade-off today between cost, complexity, efficiency, maintainability, etc will probably not be the "best" answer in a month or year as the problem, people, tools, hardware, prices etc. change over time.

Remember the scientific method. It forces you to try things (experiment), not just think about them (theory). So you have to actually do something and observe the results.

In software, this means solving a problem in 2 (or more) ways and comparing the pluses & minuses of each approach.