Matt Might's website has taught me so much over the years, going all the way back to ... I want to say 2008-2009? From programming languages to Unix-fu to a huge amount of topics in-between. I'm super glad to see his writing still being shared. One of my favorite corners of the web.
IMHO this and building an ALU with LEDs and logic gates should be part of ... well honestly any curriculum, even if you don't want to study CS. Doing that just once in your life is enough to understand that you could do it.
It was a "nerd" exploration a few decades ago, but nowadays so many of the things we do, from buying a croissant to voting, are based on hardware and software. People should have a sense that yes, it's complex, but it's also NOT magic.
EForth and Subleq can be thought of, too, as systems emerging from axioms (the Subleq operations). From twenty macro instructions, EForth can write the core Subleq code and then bootstrap itself.
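For anyone who hasn't seen Subleq: the entire "axiom set" is one instruction, small enough to sketch as an interpreter. This is a hypothetical minimal VM in Python; the memory layout and the halt-on-negative-address convention are my own illustrative choices, not anything from EForth.

```python
# A hypothetical minimal Subleq VM: the whole instruction set is one
# operation, "subtract and branch if the result is <= zero".
def run_subleq(mem, pc=0):
    while 0 <= pc and pc + 2 < len(mem):
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]                   # the only arithmetic there is
        pc = c if mem[b] <= 0 else pc + 3  # the only control flow there is
    return mem

# Tiny demo program (layout is my own): cells 6 and 7 hold data; the first
# instruction computes mem[7] -= mem[6], the second halts by jumping to
# the negative address -1.
mem = run_subleq([6, 7, 3,   8, 8, -1,   3, 10, 0])
print(mem[7])  # 7  (10 - 3)
```

Everything else (addition, copying, unconditional jumps) has to be built out of this one operation, which is exactly the "system emerging from axioms" flavor.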
I think it is a lovely experience just because it forces you to think about which abstractions are the correct ones. I think many people have had the feeling that they would love to change one (or many) aspects of a programming language.
I have been playing with an s-expr based language that compiles to F#, and it has made me realize how much I think Rich Hickey made some very lovely choices for Clojure. I have never written Clojure more than just for fun, but the more I think about my own toy language, the more highly I think of Rich Hickey. Many times because of the choices he made, but even more because of how he compromised to be able to interop with Java.
This is a statement vs expression thing. It works as an expression. Try:
f = function (v) { return e ; }
or
(function (v) { return e ; })
JavaScript doesn't allow (nameless) function expressions (using the function keyword) in a position where a function statement would be parsed.
(and that particular function can also be written v => e)
As for the e being undefined when you execute the function, you should see it as a free variable [1], supposed to be defined elsewhere or replaced with something else.
Indeed! And since your article is not particularly about JS and can be read by anybody, including people not familiar with (modern) JS, I'd argue that the form you used is way clearer than the arrow function form and would probably still be a better choice should you write this article today :-)
It's meant as an example. The function receives something, which we call v, and returns something else, which we call e. It's not meant to be taken literally as the variable names - otherwise, you are right, e is undefined in that example.
He is just showing how the syntax of a Scheme function corresponds to the structure of a JavaScript function.
> Alonzo Church developed the lambda calculus in 1929.
His first publication that showed the elements of the lambda calculus was the 1932 paper "A set of postulates for the foundation of logic", as I cited in my recent paper [1]. It's quite possible he worked on it prior to 1932, but I don't know of any credible evidence on that (would be very interested to learn about any).
> Wait! How the heck is this a "programming" language?
> At first glance, this simple language seems to lack both recursion and iteration, not to mention numbers, booleans, conditionals, data structures and all the rest. How can this language possibly be general-purpose?
What most stops lambda calculus from being a programming language is that it doesn't directly support I/O. However, one can adopt some very simple conventions for representing bits, lists of bits (bytes), and lists of bytes, and for letting a lambda term operate on these [2] which make the so-called Binary Lambda Calculus (BLC) a programming language.
And a very expressive language it is too: a BLC self interpreter [4] can be as small as the 170 bits
01000110100001000
00001100000010111
00110000111111100
00101110011111110
00000111100000010
11101110011011110
01111111100001111
11110000101111010
01110100101111101
00101101010011010
which encodes the term
(λ11)(λ(λλλ1(λλ2(1(λ6(λ2(6(λλ3(λλ23(14))))(7(λ7(λ31(21)))))))(41(111))))(11))
Isn't this just implementing Lisp in Lisp, somehow?
Also maybe I'm just not a very good programmer, but I've never seen the point of lambda calculus. Everything written in Lisp or Scheme just seems to be an obfuscated and contrarian way of writing something that would be much simpler in almost any other language.
Imagine you went back 100 years and someone said, "Come up with a mathematical system that can express any sequence of logical steps." Do you imagine that what you would deliver is a few primitives and a few simple rules, with a "here you go, this is fully complete"? It's actually quite remarkable that Church and Turing didn't start off from primitives like if statements, loops, etc.
Lambda calculus is from the 1930s and predates computers; its point is that it is a bare-bones model of computation. It doesn't make much sense to compare it to modern languages in efficacy, as that seems to imagine someone came up with the lambda calculus in 2010 alongside Java, C, Python, etc.
The super cool thing about it is that it is capable of expressing ALL the computation you know today, from the few primitives. An "If" statement for example is λb.λx.λy. b x y
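That encoding can be played with directly in any language with closures. A sketch in Python, where the names TRUE, FALSE and IF are just illustrative labels for the lambda terms:

```python
# Church booleans: a boolean IS the choice between two things.
TRUE  = lambda x: lambda y: x   # λx.λy.x
FALSE = lambda x: lambda y: y   # λx.λy.y

# "If" as λb.λx.λy. b x y: just apply the boolean to both branches.
IF = lambda b: lambda x: lambda y: b(x)(y)

print(IF(TRUE)("then")("else"))   # then
print(IF(FALSE)("then")("else"))  # else
```

Note there is no built-in conditional anywhere in sight: the boolean itself does the selecting.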
One of the interesting things about the lambda calculus is its universality: by itself, it's a complete foundation for computation.
Here's a different old post of mine showing how to build the rest of the programming language, all in a minuscule subset of Python that is the pure lambda calculus:
https://matt.might.net/articles/python-church-y-combinator/
You can even extract recursion out of the Y combinator or the more primitive U combinator -- out of nothing but lambdas!
So, it's lambdas all the way down.
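For the strict-evaluation languages most of us use, the textbook Y combinator loops forever; the eta-expanded variant (often called Z) is the one that actually runs. A Python sketch, using native integers purely for readability:

```python
# Z combinator (applicative-order Y). Nothing but lambdas:
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Recursion extracted "out of nothing": fact never mentions itself.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(fact(10))  # 3628800

# The even more primitive U combinator (plain self-application) works too:
U = lambda x: x(x)
fact_u = U(lambda f: lambda n: 1 if n == 0 else n * f(f)(n - 1))
print(fact_u(5))  # 120
```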
Another interesting thing about the lambda calculus is that it wasn't intended to be a programming language. When Alonzo Church created it, there were no computers to program.
Alonzo Church was trying to solve problems in the foundations of mathematics.
But, untyped lambda calculus has a "bug" that makes it problematic for mathematics -- the self application that enables recursion is a problem if you're a logician who cares about soundness, but it's fantastic if you're a programmer.
I don't think of functional languages as obfuscating. I think of them as terse and expressive. They let me most directly encode the model in my head as running code.
I think what I don't get is a mental model of what they do, and how that maps onto problems I write code to solve. It's possible it's just that s-expressions are good at solving a problem I don't have and don't really understand.
You should be familiar with the terms "essential" and "accidental complexity", right?
Lambda calculus, primitive recursion, Brainf*ck (Turing machine) are all models that discard all of the accidental complexity - all of it: syntax, higher-level constructs, abstractions, etc. What's left is the bare computation itself, amenable to analysis.
It's not meant to be practical - it's a research vehicle that shows what's possible and what's not possible in the realm of ideas, without looking at any real-world constraints (you would, in practice, want to see the result of your computation in less than 100 years - that's not the consideration when using one of the models).
So yeah, it's not something you use, it's something you learn to arm your brain with tools that will help you debug and write complex code that works. It's really worth spending a year or two deep-diving into these "pure theory CS" things; considering that you're going to be writing code for 20+ years (AI allowing), spending 10% of that time learning the foundations on which everything else rests is not that big of a task.
I'm not sure the concepts of accidental vs. essential complexity really applies here. Is it really "accidental complexity" that is being removed when the result is harder to understand and less practically usable?
"Essential" here means something that cannot, in any way, be further removed from consideration. In this view, things that simplify the understanding and make the program more practical are not essential - they are add-ons on top of the most primitive computational model(s). You can drastically simplify the reading of the lambda calculus by giving it direct support for integers, for example - however, you now polluted the model with an additional element that you need to describe and always keep in mind. Using Church encoding lets you get integers without introducing any new elements to the model, keeping the model simpler - but yes, the implementations of programs within that model will get much more complex to read.
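To make that trade-off concrete, here is a sketch of Church-encoded integers in Python. The names are illustrative, and to_int is a decoder I'm adding for display only; it is not part of the model:

```python
# Church numerals: the number n is "apply f, n times".
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
ADD  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
MUL  = lambda m: lambda n: lambda f: m(n(f))

# Illustrative decoder (not part of the model): count the applications.
def to_int(n):
    return n(lambda k: k + 1)(0)

TWO   = SUCC(SUCC(ZERO))
THREE = SUCC(TWO)
print(to_int(ADD(TWO)(THREE)))  # 5
print(to_int(MUL(TWO)(THREE)))  # 6
```

The model stays tiny (still nothing but functions), while the programs written in it get harder to read, which is exactly the point being made above.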
> Everything written in Lisp or Scheme just seems to be an obfusc and contrarian way of writing something that would be much simpler in almost any other language.
Probably just a "familiar with what you know" thing, since I and others have the straight-up opposite experience: anything not s-expressions seems needlessly obfuscated, with lots of special characters and syntax all over the place, instead of just one syntax for everything, which makes programs so much simpler.
But you're hardly alone with that sentiment, just like I'm also not alone, so either just "brain wired that way" or something similar. The discussion is as old as the internet, highly subjective and impossible to measure, so around we go.
Everything? Full projects are written in Lisp syntax (s-expressions), not the Algol-like (C-like) syntax that most people are familiar with. Personally I mostly use Clojure, but lots of Lisp-like languages offer the same (simpler) experience of programming.
With Zenlisp or any other you could basically do all the exercises as if they were a walk in the park, making the definitions of the functions self-evident, as if they were bricks.
A classical factorial example in Zenlisp, as numbers are not actual 'numbers' but Lisp lists:
(require 'nmath)

(define (fac n)
  (fi '#1 '#1 n))

(define (fi a b n)
  (or (and (> b n) a)
      (fi (* a b)
          (+ b '#1)
          n)))

(fac '#10)
'#3628800
Here '#3628800 is just (3 6 2 8 8 0 0), as Zenlisp is written almost from the ground up to teach you how everything can be built from the Peano axioms or something very close (set theory and the like).
After tracing the 'fi' function, this is how it works recursively. In every iteration, the input values for fi are:

a = a * b
b = b + 1
n = always 10

When b hits 11, 11 is bigger than 10, so it returns 'a'. So, the factorial: 10! = 10*9!, 10*9*8!... and so on.
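For readers who don't speak Lisp, the same accumulator recursion can be sketched in Python (using native integers instead of Zenlisp's digit lists):

```python
# The same shape as Zenlisp's fi: a is the running product,
# b counts up from 1, n is the fixed target.
def fi(a, b, n):
    # Once b exceeds n, a holds the finished product 1*2*...*n.
    if b > n:
        return a
    return fi(a * b, b + 1, n)

def fac(n):
    return fi(1, 1, n)

print(fac(10))  # 3628800
```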
Read nmath.l and you'll discover how you can do arithmetic (and far more) with the basics. Imath.l implements complex numbers with just atoms (core units in Lisp). Rmath.l operates over the whole R domain.
Lambda calculus is a mathematical formalism that captures a lot of what we care about in terms of computation with a few primitives and some rewrite rules. It is not a programming language, by itself.
How large would the implementation be in more usual languages?
http://t3x.org/lfn/
http://t3x.org/klisp/
http://t3x.org/klisp/22/
But the Lisp way is far more elegant.
function (v) { return e ; } -> Uncaught SyntaxError: Function statements require a function name
function a (v) { return e ; } a() -> Uncaught ReferenceError: e is not defined
Am I missing something?
[1] https://en.wikipedia.org/wiki/Free_variables_and_bound_varia...
And a very expressive language it is too: a BLC self interpreter [4] can be as small as the 170 bits shown above (encoding the term in De Bruijn notation, with a lambda diagram [3]), 10 times smaller than the 7 lines of R5RS Scheme.

> This code will read a program from stdin, parse it, evaluate it and print the result.

Except that it leaves all the actual parsing to the "read" library function. In contrast, the BLC code does all parsing itself. One of the neatest tricks is how it represents the environment as a list built in such a way that a list of bits like "1110" (the code for De Bruijn index 3) can index the environment by simply applying it to the environment.

[1] https://www.mdpi.com/1099-4300/28/5/494 "The Largest Number Representable in 64 Bits"
[2] https://gist.github.com/tromp/86b3184f852f65bfb814e3ab0987d8...
[3] https://tromp.github.io/cl/diagrams.html
[4] https://github.com/tromp/AIT/blob/master/ait/int.lam
https://www.t3x.org/clc/code.html
https://www.t3x.org/zsp/index.html
Half of CS and discrete math (sets, permutations...) becomes self-evident with simple code.
Coming from Tcl/Tk, I used to have nightmares even with simple code because of upvar.
See this book? https://discrete.openmathbooks.org/dmoi3/dmoi.html