Church's λ-Calculus (2023) [pdf]

jstrieb | 155 points

I've got Harper's Practical Foundations for Programming Languages and it's a great book—he writes clearly and succinctly.

Knowing the lambda calculus helped me one time when I was working as a software engineer: I had just added functions to a little DSL interpreter that was going to make it easy to customize the behavior of our product for different customers. (It wasn't ever going to be used by customers directly; it was primarily a tool for the internal team.) At that point I realized we needed some kind of execution time-out: since we could encode functions, we could write down, e.g., the Y combinator or the omega combinator and get non-terminating programs in the DSL.
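
To make the failure mode concrete, here's a minimal Python sketch (toy code, not the actual DSL; the names are mine):

    # omega = (λx. x x)(λx. x x) β-reduces to itself forever.
    self_apply = lambda x: x(x)

    try:
        self_apply(self_apply)  # never produces a value
    except RecursionError:
        # CPython's call stack gives up first; an interpreter without such a
        # limit would simply spin, hence the execution time-out.
        print("diverged")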

Now I work as a programming languages researcher, so the lambda calculus has direct application to my day job.

Those curious might be interested in ISWIM [1,2], which is an extension of the lambda calculus with an arbitrary set of operators. Like the lambda calculus, it is an abstract language. However, you can add numbers and other operators, and ISWIM-like languages are often used to illustrate new ideas in programming languages.
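
To make that concrete, here is a rough Python sketch of an ISWIM-flavoured core (my own toy encoding of the idea, not Landin's actual formulation): the lambda calculus plus numeric constants and a pluggable table of primitive operators.

    from dataclasses import dataclass

    # Terms: variables, abstractions, applications, plus the ISWIM-style
    # extras (numeric constants and primitive operators).
    @dataclass(frozen=True)
    class Var:
        name: str

    @dataclass(frozen=True)
    class Lam:
        param: str
        body: object   # a term

    @dataclass(frozen=True)
    class App:
        fn: object
        arg: object

    @dataclass(frozen=True)
    class Num:
        value: int

    @dataclass(frozen=True)
    class Prim:
        op: str
        left: object
        right: object

    PRIMS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

    def subst(t, name, val):
        # Substitute val for free occurrences of name (assumes distinct binder names).
        match t:
            case Var(n):
                return val if n == name else t
            case Lam(p, b):
                return t if p == name else Lam(p, subst(b, name, val))
            case App(f, a):
                return App(subst(f, name, val), subst(a, name, val))
            case Num():
                return t
            case Prim(op, l, r):
                return Prim(op, subst(l, name, val), subst(r, name, val))

    def evaluate(t):
        # Call-by-value evaluation down to a value (a Lam or a Num).
        match t:
            case Var(n):
                raise NameError(f"unbound variable {n}")
            case Lam() | Num():
                return t
            case App(f, a):
                fv, av = evaluate(f), evaluate(a)   # fv must be a Lam
                return evaluate(subst(fv.body, fv.param, av))
            case Prim(op, l, r):
                return Num(PRIMS[op](evaluate(l).value, evaluate(r).value))

    # ((λx. x + 1) 41)  prints  Num(value=42)
    print(evaluate(App(Lam("x", Prim("+", Var("x"), Num(1))), Num(41))))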

Syntax is incidental. Boil the syntax away and reduce languages to their distilled semantics to bring out the essential differences; this is the kind of comparison that the lambda calculus makes easy.

I highly recommend reading Landin's The Next 700 Programming Languages [2] as it is a great, short, clear read.

[1]: https://en.wikipedia.org/wiki/ISWIM
[2]: https://www.cs.cmu.edu/~crary/819-f09/Landin66.pdf

ashton314 | 13 days ago

One invention that drew me to this topic was the Binary Lambda Calculus, invented by John Tromp (“Tromp” as in the Tromp-Taylor rules). It’s just a direct binary encoding of a lambda calculus term, but it gives you a concrete, concise representation that is useful for measuring the complexity of an expression. I encountered it while reading https://codegolf.stackexchange.com/questions/6430/shortest-t... and found it to be the language that could represent a very large number exactly in the fewest bits. I later learned that Graham’s number can be encoded in 120 bits (maybe 3 or 4 fewer), far more concise than the equivalent mathematical notation, and I’ve since been drawn into the field of googology. It’s been fascinating.
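
For anyone curious, the encoding itself is tiny. Here is a rough Python sketch (my own ad-hoc term representation) of the scheme as I understand it: abstraction is encoded as 00, application as 01, and the de Bruijn variable with index n as n ones followed by a zero.

    def blc(term):
        # Encode a de-Bruijn-indexed lambda term as a bit string.
        # term is ('var', n), ('lam', body), or ('app', fn, arg).
        kind = term[0]
        if kind == "var":    # index n (1-based)  ->  1^n 0
            return "1" * term[1] + "0"
        if kind == "lam":    # λ body             ->  00 <body>
            return "00" + blc(term[1])
        if kind == "app":    # fn arg             ->  01 <fn> <arg>
            return "01" + blc(term[1]) + blc(term[2])
        raise ValueError(f"not a term: {term!r}")

    identity = ("lam", ("var", 1))                        # λ 1
    self_app = ("lam", ("app", ("var", 1), ("var", 1)))   # λ (1 1)
    omega    = ("app", self_app, self_app)                # (λ 1 1)(λ 1 1)

    print(blc(identity))                # 0010
    print(blc(omega), len(blc(omega)))  # 010001101000011010 18

The identity function comes out as just four bits, and the omega combinator as eighteen, which gives a feel for why it makes such a compact complexity measure.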

renonce | 13 days ago

When I took Fundamentals of Programming Languages 20 years ago, I nearly failed the class. Lambda calculus was simply too esoteric for me to appreciate, much less understand intuitively.

Fast forward 20 years, and I see the fundamentals of Alonzo Church’s system in every computation problem I encounter. It’s one of those concepts that age like wine. The only other concepts I’d put on the same level are Shannon’s "information entropy" and maybe Wolfram’s Ruliad.

an-allen | 13 days ago

There has to be some programmer rite of passage involving learning the lambda calculus. I went through this a few years ago when something similar was posted: I learned the notation, marveled at its simplicity and completeness, and did a few exercises. But I came out the other side none the wiser. I think I was looking for some great epiphany about the nature of computation. Alas, it eluded me in the end. It was fun but ultimately useless for me.

dexwiz | 13 days ago