Combinatory Logic


Combinatory logic is a 'simplified' way of doing the lambda calculus. In the lambda calculus there are variables and constants, applications, and abstractions; combinatory logic makes do with variables and constants, and applications only: abstractions are done away with. There are more rules for reduction in combinatory logic, but since all variables are free there are none of the complications of capturing, sound substitutions, and name rewrites. Let us take a simple example. In the lambda calculus the identity combinator is λx.x, and its effect when applied to an arbitrary argument, say y, is to reduce to y

(λx.x y) -> y

In combinatory logic there is a combinator for identity, written I (*note the boldface*) and it is governed by the reduction rule

I y -> y

(*Notice here we are using juxtaposition for application, together with left association, so 'I y' amounts to '(I y)'. Both of these conventions are fairly standard. Left association fits in with currying, so 'X arg1 arg2 arg3' means '(((X arg1) arg2) arg3)', i.e. multiple arguments are swallowed one at a time in the order they appear. The conventions are also standard in the straight lambda calculus. We did not have to invoke them there because, for clarity in teaching, we were putting all applications in brackets anyway.*)
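To make the conventions concrete, here is a tiny Python sketch (the pair representation for application is just an illustration, not anything from the system itself):

```python
# Left association: 'X a1 a2 a3' stands for '(((X a1) a2) a3)'.
# Model application as a pair (function, argument); the convention is then
# just repeated pairing from the left, one argument at a time.
def apply_all(head, *args):
    expr = head
    for a in args:
        expr = (expr, a)  # one application, left-associated
    return expr

assert apply_all("X", "a1", "a2", "a3") == ((("X", "a1"), "a2"), "a3")
```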

Roughly combinatory logic takes the combinators of lambda calculus, leaves out their internal structure, and presents directly the rules for their reduction.

Syntax of combinator expressions

<combex> ::= <atom> | <application> | (<combex>)
<atom> ::= <variable>|<constant>|<combinator>
<application> ::= <combex> <combex>
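One way to make the grammar concrete is a small Python sketch (the choice of strings for atoms and pairs for applications is an assumption for illustration):

```python
# <atom> is a string (variable, constant, or combinator name);
# <application> is a pair (left, right). show() prints an expression
# using juxtaposition, bracketing only where left association would
# otherwise misread the term.
def show(e):
    if isinstance(e, str):
        return e
    f, a = e
    rhs = show(a)
    if isinstance(a, tuple):      # a right operand that is itself an
        rhs = "(" + rhs + ")"     # application needs brackets
    return show(f) + " " + rhs

assert show(((("S", "x"), "y"), "z")) == "S x y z"
assert show(("x", ("y", "z"))) == "x (y z)"
```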

for example, x, y, L x, and M x y x (where L and M are constants) are all combinator expressions. (*As mentioned, it is usual to use juxtaposition for application (so, again, application is a silent operator; for example, I y means I@y), together with left association, and round brackets to improve clarity or to get right association.*)

A distinguished combinator expression, which has a reduction rule, is usually called a combinator and it is written in boldface (not the easiest to do with pen and paper).

You can see what the problem is going to be for combinators. In the lambda calculus the occurrences of the variable of abstraction within the body of an abstraction are just place markers to indicate where the argument is going to go when there is a beta-reduction. The problem for combinators is whether they can pick out all possible positions of the place markers. It turns out that theoretically everything can be done with just two combinators, the S and K combinators. Let us start by looking at them.

The reduction rule for K is:-

K x y -> x

The combinator K consumes two arguments and reduces to the first argument. So K is just a projection function, sometimes called a cancellator.

The reduction rule for S is:-

S x y z -> x z (y z)

The combinator S consumes three arguments and reduces to (the first argument applied to the third argument) applied to (the second argument applied to the third argument).

Let us write the silent application in for one moment... S@x@y@z -> x@z@(y@z)

What S does is somewhat lacking in an intuitive counterpart. Part of it is composition: the result, (the first argument applied to the third argument) applied to (the second argument applied to the third argument), is the composition of two function applications. But the first of these, the first argument applied to the third argument, is a sort of substitution. So what S does does not have an easy interpretation, but it is a substitution-and-composition function (sometimes called a distributor).
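The rules for K and S can be simulated with curried Python lambdas (a sketch: ordinary function application stands in for combinator reduction, and add and times10 are made-up arguments for the demonstration):

```python
K = lambda x: lambda y: x                     # K x y -> x
S = lambda x: lambda y: lambda z: x(z)(y(z))  # S x y z -> x z (y z)

add = lambda a: lambda b: a + b     # a curried two-argument function
times10 = lambda z: z * 10

assert K("first")("second") == "first"   # K projects its first argument
# S add times10 3 -> add 3 (times10 3) -> 3 + 30
assert S(add)(times10)(3) == 33
```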

S and K are called the standard combinators. A combinator expression in which the only combinators are the standard ones is called a standard combinator expression.

It is handy to have the identity combinator

I =df. S K K

(*note that I x =df. S K K x, and S K K x -> K x (K x) -> x, which is what we want.*)
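The same check can be run with curried Python lambdas (again a sketch, with Python application standing in for reduction):

```python
K = lambda x: lambda y: x                     # K x y -> x
S = lambda x: lambda y: lambda z: x(z)(y(z))  # S x y z -> x z (y z)

I = S(K)(K)        # I =df. S K K
# S K K x -> K x (K x) -> x
assert I(42) == 42
assert I("y") == "y"
```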

(*For the exercises below... Select a line. Choose Rewrite Rules. Select a specific Rewrite Rule. Select a (sub)expression that you wish to rewrite with that rule. Press Go.*)

Like in the lambda calculus, there are the notions of redex, reduction path, normal form...

Combinatory logic is going to do away with the abstractions of lambda calculus. Doing this is known as bracket abstraction and the bracket abstraction algorithm removes all abstractions from lambda expressions and replaces them with a standard combinator expression.

Sketch of a bracket abstraction algorithm. What we are going to do here is to translate lambda calculus expressions into SKI combinator expressions. We'll allow S, K, and I into the lambda calculus and translate variables, constants, and applications as themselves, unchanged. All we have to consider are abstractions.

Input: a variable x and an abstraction λx.(M) of a standard combinator expression M, i.e. M itself has already been translated into an SKI combinator expression.
Output: A standard combinator expression <prefix>M such that <prefix>M is equal to λx.(M)


i) M is a constant c then let <prefix>M be K M (let us just check that these reduce in the same way. λx.(M) is λx.(<constant>) and beta reduce the application of this to any argument λx.(<constant>) arg and <constant> is returned; similarly, apply K <constant> to any argument K <constant> arg and <constant> is returned.)

ii) M is a standard combinator C then let <prefix>M be K M (let us just check that these reduce in the same way. λx.(M) is λx.(C) and beta reduce the application of this to any argument λx.(C) arg and C is returned; similarly, apply K C to any argument K C arg and C is returned.)

iii) M is a variable

a) it is x, then let <prefix>M be I (*note, no M here *) (let us just check that these reduce in the same way. λx.(M) is λx.(x) and beta reduce the application of this to any argument λx.(x) arg and arg is returned; similarly, apply I to any argument I arg and arg is returned.)

b) it is a variable y other than x, then let <prefix>M be K M (let us just check that these reduce in the same way. λx.(M) is λx.(y) and beta reduce the application of this to any argument λx.(y) arg and y is returned; similarly, apply K y to any argument K y arg and y is returned.)

iv) M is an application P Q. Let <prefix1>P and <prefix2>Q be the standard combinator expression bracket abstractions of λx.P and λx.Q. Then what we need is S <prefix1>P <prefix2>Q (let us just check that these reduce in the same way. λx.(M) is λx.(P Q); beta reduce the application of this to any argument λx.(P Q) arg and (λx.(P) arg) (λx.(Q) arg) is returned (you may have to do some rewrites of variables); similarly, apply S <prefix1>P <prefix2>Q to any argument S <prefix1>P <prefix2>Q arg and <prefix1>P arg (<prefix2>Q arg) is returned.)

We did use the identity combinator, but we have shown how to translate this into an SKI expression.

To sum up, we produce an abstraction function abs[x,E] such that abs[x,E][x] = E. It is defined

abs[x,x] = I
abs[x,y] = K y
abs[x,k] = K k
abs[x,E1[E2]] = S abs[x,E1] abs[x,E2]
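The four clauses translate directly into a Python sketch of abs (strings for atoms, pairs for applications; both representation choices are assumptions for illustration):

```python
# abs[x,E]: strings are atoms, pairs (E1, E2) are applications E1 E2.
def abstract(x, e):
    if e == x:
        return "I"                       # abs[x,x] = I
    if isinstance(e, str):
        return ("K", e)                  # abs[x,y] = K y, abs[x,k] = K k
    e1, e2 = e                           # abs[x,E1 E2] =
    return (("S", abstract(x, e1)), abstract(x, e2))  # S abs[x,E1] abs[x,E2]

# λx.(x x) -> S I I
assert abstract("x", ("x", "x")) == (("S", "I"), "I")
# λx.(y) -> K y
assert abstract("x", "y") == ("K", "y")
```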

What we have done here is to show that bracket abstraction is possible. In a running system, though, it might be important to do the translation efficiently—you may have lambda calculus as your functional programming language and combinatory logic as the framework of your graph reducer-- and the algorithm sketched here is not efficient.

As an example of bracket abstraction, consider

λx.(x x) λy.(y) arg

Everything other than the abstractions remains the same; then λx.(x x) translates to S I I, and λy.(y) translates to I, so the whole lot becomes S I I I arg, and we can check the reductions

S I I I arg -> I I (I I) arg -> I (I I) arg -> I I arg -> I arg -> arg
λx.(x x) λy.(y) arg -> (λy.(y) λy.(y)) arg -> λy.(y) arg -> arg
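The SKI reduction above can be checked mechanically with a small head reducer (a sketch; it only fires redexes at the head of the expression, which is enough for this example):

```python
def spine(e):
    """Unwind applications: return (head, [arg1, arg2, ...])."""
    args = []
    while isinstance(e, tuple):
        e, a = e
        args.insert(0, a)
    return e, args

def build(head, args):
    for a in args:
        head = (head, a)
    return head

def step(e):
    """One head reduction, or None if the head is not a redex."""
    head, args = spine(e)
    if head == "I" and len(args) >= 1:
        return build(args[0], args[1:])
    if head == "K" and len(args) >= 2:
        return build(args[0], args[2:])
    if head == "S" and len(args) >= 3:
        x, y, z = args[0], args[1], args[2]
        return build(((x, z), (y, z)), args[3:])
    return None

def reduce_head(e, limit=100):
    while limit > 0:
        nxt = step(e)
        if nxt is None:
            return e
        e, limit = nxt, limit - 1
    return e

# S I I I arg ->* arg, as in the hand reduction above
s_i_i_i = ((("S", "I"), "I"), "I")
assert reduce_head((s_i_i_i, "arg")) == "arg"
```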

It is usual to move beyond pure SK expression combinators and allow in all sorts of other combinators (which, of course, could be defined in terms of SK). This is primarily to reduce the length and complexity of the expressions.

Straight translations of lambda expressions into SK combinators are usually astonishingly long, e.g. the translation of a simple addition function λx.λy.(+ x y) has 18 combinators in it (and many nested brackets). However, at a conceptual and theoretical level, combinators and combinator graph reduction are extraordinarily simple. And combinator reduction is naturally lazy (being from left to right).

B x y z -> x (y z) (* compositor *)
C x y z -> x z y (* permutator*)
I x -> x (* identificator *)
K x y -> x (* cancellator *)
S x y z -> x z (y z) (* distributor *)
W x y -> x y y (* duplicator*)
Y f -> f (Y f) (* fixpoint. Just as a matter of interest define A as B (S I ) (S I I ) then Y is A A *)
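These rules, Y aside, can also be simulated with curried Python lambdas (a sketch; Y is omitted because under Python's strict evaluation Y f -> f (Y f) would unfold forever; inc, dbl, sub, and mul are made-up test functions):

```python
B = lambda x: lambda y: lambda z: x(y(z))     # compositor
C = lambda x: lambda y: lambda z: x(z)(y)     # permutator
I = lambda x: x                               # identificator
K = lambda x: lambda y: x                     # cancellator
S = lambda x: lambda y: lambda z: x(z)(y(z))  # distributor
W = lambda x: lambda y: x(y)(y)               # duplicator

inc = lambda n: n + 1
dbl = lambda n: 2 * n
sub = lambda a: lambda b: a - b
mul = lambda a: lambda b: a * b

assert B(inc)(dbl)(5) == 11   # inc (dbl 5)
assert C(sub)(1)(10) == 9     # permutes: sub 10 1
assert W(mul)(6) == 36        # duplicates: mul 6 6
```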

Although combinators strictly speaking have only one argument (and use currying), we can talk of some of them expecting several arguments, in as much as several arguments may be needed before the reduction rule can be invoked; e.g. B is a combinator of three arguments, I is a combinator of one.

Normal form

A CL formula is in normal form if no further reductions of it or any of its subformulas are possible.

If a CL formula has a normal form, that normal form is unique (the Church-Rosser theorem).

Not every CL formula has a normal form (because some contain reductions that can go on forever) e.g.

W W W -> W W W and so on forever

Notion of reduction path

Sometimes it is necessary to indicate which expression is being reduced

I (I x) -> I x (*there are two ways of doing this with a single I reduction *)

And, again like the lamdba calculus, some terms can produce a normal form under one reduction path, and non-termination under another

K x (W W W ) -> x (* using normal order reduction, *)

K x (W W W ) -> K x (W W W ) (* using applicative order reduction *)

What normal order and applicative order are is explained shortly.

Normal order reduction

Normal order reduction will produce the normal form if there is one, and so that is an argument for using normal order reduction.

In CL, normal order reduction is particularly simple

always do the leftmost redex first

head reduction or contraction, head reduction path (always work from left to right). Normal order reduction can be inefficient due to evaluating the same expressions more than once. For example, consider a lambda or CL expression which in effect adds x to x; informally let us write this as (x + x), and then apply this to the argument (3 x 2), i.e. informally

(x+x)[(3x2)]

Normal order reduction would first reduce the function by substituting the (3 x 2) for the x i.e.

(x+x)[(3x2)]  -> (3x2)+(3x2)

but then the (3x2) will get evaluated twice.
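The double evaluation can be made visible in Python with a counter (the counter is purely for illustration; three_times_two is a made-up stand-in for the unevaluated (3x2)):

```python
count = 0  # how many times (3x2) actually gets computed

def three_times_two():
    global count
    count += 1
    return 3 * 2

# normal-order style: the unevaluated expression is substituted for x,
# so (x+x) becomes (3x2)+(3x2) and the multiplication is done twice
count = 0
result = three_times_two() + three_times_two()
assert result == 12 and count == 2

# applicative-order style: evaluate the argument once, then share it
count = 0
v = three_times_two()
result = v + v
assert result == 12 and count == 1
```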

Weak head normal form

As mentioned, some combinators 'expect' several arguments before they can be reduced. For example, the compositor B x y z

B x y z
expects 3. This means that if an expression starts with B but has only two arguments it cannot be reduced by a compositor reduction; e.g.
B x y
Now, this formula B x y is in normal form. But consider
B x (I y)
It is not in normal form, because the (I y) can be reduced by the identificator to y. This formula is in weak head normal form (WHNF). The focus here is on the first well-formed CL expression in a formula, i.e. the head (whether or not that expression is a combinator with a reduction rule). If the head (with its 'arguments') cannot be reduced, then the whole expression is in weak head normal form. Any expression in normal form is in weak head normal form. But there are expressions which are in weak head normal form which are not in normal form, e.g. B x (I y).
A formula in WHNF either has a head which is not a combinator or it has a head which is a combinator but which does not have enough arguments for the combinator to reduce.
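That description of WHNF can be sketched as a Python predicate (the arity table and the term representation, strings for atoms and pairs for applications, are assumptions for illustration):

```python
ARITY = {"B": 3, "C": 3, "I": 1, "K": 2, "S": 3, "W": 2}

def head_and_args(e):
    args = []
    while isinstance(e, tuple):   # walk down the left spine
        e, a = e
        args.insert(0, a)
    return e, args

def is_whnf(e):
    head, args = head_and_args(e)
    if head not in ARITY:
        return True                   # head is a variable or constant
    return len(args) < ARITY[head]    # combinator short of arguments

assert is_whnf(("B", "x"))                    # too few arguments for B
assert is_whnf((("B", "x"), ("I", "y")))      # WHNF but not normal form
assert not is_whnf(((("B", "x"), "y"), "z"))  # B x y z is a redex
assert is_whnf((("x", "y"), "z"))             # head is a variable
```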

reduction strategies:- leftmost redex first is normal order.

The various Church-Rosser theorems on reduction order and uniqueness of normal form.

We can also do arithmetic, write recursive functions, build lists ... in pure combinatory logic (just as we could and did in pure lambda calculus).

However, like in lambda calculus, this is not a very sensible thing to do. Instead we extend combinatory logic to include what we want eg

ADD x y -> x' + y' (*where x' and y' are the evaluated versions of x and y *)
HEAD x -> i, if x is CONS i j

etc. etc.
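A hedged sketch of such delta rules in Python (the spellings ADD, CONS, and HEAD follow the rules above; the tuple representation and the evaluate function are assumptions for illustration):

```python
def evaluate(e):
    """Evaluate a term: an int is already a value; otherwise try a
    delta rule on (OP, arg1, ...) and leave the term alone if none fires."""
    if isinstance(e, int):
        return e
    op, *args = e
    if op == "ADD" and len(args) == 2:
        # ADD x y -> x' + y', where x' and y' are the evaluated arguments
        return evaluate(args[0]) + evaluate(args[1])
    if op == "HEAD" and len(args) == 1:
        h = args[0]
        if isinstance(h, tuple) and h[0] == "CONS":
            return h[1]               # HEAD x -> i, if x is CONS i j
    return e                          # no delta rule applies

assert evaluate(("ADD", 2, 3)) == 5
assert evaluate(("ADD", ("ADD", 1, 1), 3)) == 5
assert evaluate(("HEAD", ("CONS", 7, ("CONS", 8, "NIL")))) == 7
```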

Eventually, then, our functional program will just become an expression in combinatory logic, and we will have several reduction rules. And the problem will be to reduce the expression to normal form.

For lazy evaluation we want to do head contraction only except i) if the head is strict (this means that it needs the values of its arguments, for example, predefineds like + need the numbers they are going to add (not expressions that evaluate to numbers), or ii) a subexpression is shared.

It is usual to extend combinatory logic for functional programming by adding all the 'delta reductions'. Then it becomes the same as the extended lambda calculus.

An aside on Raymond Smullyan, Birds, and Combinatory Logic

Wikipedia: Raymond Smullyan (*love the photo*)

Forty years ago, yes it is that long, when we studied tableaux we used Smullyan's First-Order Logic as our text. It was excellent, and I was somewhat astonished to learn that apparently Smullyan had been a professional magician until he was forty and then he became a logician. Sometime later I became aware of Smullyan's books of logical puzzles and conundrums. I was keen to use some of the individual puzzles as splash screens in my logic software. So I wrote to him, sending him my software, and asking him if I could do that. He handwrote me a most gracious reply, roughly to the effect that computers and software were a complete mystery to him but that I was welcome to use whatever I wished. Sometime after that I became interested in combinatory logic, and I became aware of Smullyan's [1985] To Mock a Mockingbird. Wikipedia: To Mock a Mockingbird. In this he explains combinatory logic via birds and the calls they make one to another. And there are sources for this on the web. You can try Chris Rathman Combinator Birds or Mockingbird Fun [this takes ages to load, over a minute on my broadband]. [I must say I never really understood the Mockingbirds, maybe I need to revisit them.]