11 comments

  • babarock 7 hours ago
    Matt Might's website has taught me so much over the years, going all the way back to ... I want to say 2008-2009? From programming languages to Unix-fu to a huge amount of topics in-between. I'm super glad to see his writing still being shared. One of my favorite corners of the web.
    • mattmight 5 hours ago
      Thank you for the kind words! I keep hoping I can find the time to do more writing again. And, yes, I started that blog in 2008.
  • utopiah 11 hours ago
    IMHO this and building an ALU with LEDs and logic gates should be part of ... well, honestly, any curriculum, even if you don't want to study CS. Doing it just once in your life is enough to understand that you could do it.

    It was a "nerd" exploration a few decades ago, but nowadays so many of the things we do, from buying a croissant to voting, are based on hardware and software. People should have a sense that yes, it's complex, but it's also NOT magic.

  • RandomTeaParty 10 hours ago
    "Implement lambda calculus in languages that are pretty much lambda calculus"

    How large would an implementation be in more usual languages?

    • tromp 9 hours ago
      One of the smallest implementations is my heavily obfuscated https://www.ioccc.org/2012/tromp/ :

                 Int L[A],m,b,*D=A,
                  *c,*a=L,C,*U=L,u;s
                   (_){u--&&s(a=*a);}
                    char*B,I,O;S(){b=b
                     --?b:m|read(0,&I,1
                      )-1;return~I>>b&1;
                       }k(l,u){for(;l<=u;
                        U-L<A?*U++=46^l++[
                         "-,&,,/.--/,:-,'/"
                         ".-,-,,/.-,*,//..,"
                        ]:exit(5));}p(Int*m){
                       return!*U?*m=S()?U++,!S
                      ()?m[1]=p(++U),2:3:1,p(U)
                     :S()?U+=2:p(U[1]++),U-m;}x(
                    c){k(7*!b,9);*U++=b&&S();c&&x
                   (b);}d(Int*l){--l[1]||d(l[d(*l),
                  *l=B,B=l,2]);}main(e){for(k(10,33
                 ),a[4]-=m=e-2&7,a[23]=p(U),b=0;;e-2
                ?e?e-3?s(D=a),C=a  [3],++1[a=a[2]],d(
               D):c?D=c,c=*D,*D=    a,a=D:exit(L[C+1])
              :C--<23?C=u+m&1?O      =O+O|C&1,9:write(m
             ||(O=C+28),&O,1)+        1:(S(),x(0<b++?k(0,
            6),U[-5]=96:0)):(          D=B?B:calloc(4,X))
           ?B=*D,*D=c,c=D,D[            2]=a,a[++D[1]]++,D
          [3]=++C+u:exit(6)              )e=L[C++],u=L[C];}
      
      A less obfuscated and highly performant implementation, https://github.com/tromp/AIT/blob/master/uni.c, based on combinatory graph reduction, takes 446 lines.
      • sph 3 hours ago
        Even the unobfuscated version is a work of art. You rarely see such concise and honestly beautiful C code.
    • nils-m-holm 8 hours ago
      Here are some starting points for exploration:

      http://t3x.org/lfn/

      http://t3x.org/klisp/

      http://t3x.org/klisp/22/

      • anthk 7 hours ago
        EForth and Subleq can also be thought of as systems emerging from axioms (the Subleq operations). From twenty macro instructions in EForth, it can write the core Subleq code and then bootstrap itself.

        But the Lisp way is far more elegant.

  • catwell 10 hours ago
    This has been my Twitter / X banner for at least 10 years :)

    https://x.com/pchapuis/header_photo

  • bjoli 11 hours ago
    I think it is a lovely experience just because it forces you to think about which abstractions are the correct ones. I think many people have had the feeling that they would love to change one (or many) aspects of a programming language.

    I have been playing with an s-expr based language that compiles to F#, and it has made me realize how many very lovely choices Rich Hickey made for Clojure. I have never written Clojure for more than fun, but the more I think about my own toy language, the more highly I think of Rich Hickey. Many times because of the choices he made, but even more because of how he compromised to be able to interop with Java.

  • voidUpdate 10 hours ago
    > "If you've programmed in JavaScript, this form is equivalent to: function (v) { return e ; } "

    function (v) { return e ; } -> Uncaught SyntaxError: Function statements require a function name

    function a (v) { return e ; } a() -> Uncaught ReferenceError: e is not defined

    Am I missing something?

    • jraph 9 hours ago
      This is a statement vs expression thing. It works as an expression. Try:

          f = function (v) { return e ; }
      
      or

          (function (v) { return e ; })
      
      JavaScript doesn't allow a (nameless) function expression (using the function keyword) in a position where a function statement is expected.

      (and that particular function can also be written v => e)

      As for the e being undefined when you execute the function, you should see it as a free variable [1], supposed to be defined elsewhere or replaced with something else.

      [1] https://en.wikipedia.org/wiki/Free_variables_and_bound_varia...
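      A hypothetical Python analogue (not from the thread) shows the same idea: e is free in the lambda, so it has to be supplied by the enclosing scope:

```python
# Hypothetical analogue of (lambda (v) e): 'e' is a free variable,
# not bound by the lambda's parameter, so it is resolved in the
# enclosing scope when the function runs.
e = 42
f = lambda v: e

print(f(0))  # prints 42, whatever the argument is
```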

      • mattmight 5 hours ago
        Good point -- JavaScript's syntax has grown up since I wrote that article!
        • jraph 2 hours ago
          Indeed! And since your article is not particularly about JS and can be read by anybody, including people not familiar with (modern) JS, I'd argue that the form you used is way clearer than the arrow function form and would probably still be a better choice should you write this article today :-)
    • emigre 10 hours ago
      It's meant as an example. The function receives something, which we call v, and returns something else, which we call e. It's not meant to be taken literally as the variable names - otherwise, you are right, e is undefined in that example.

      He is just showing how the syntax of a Scheme function corresponds to the structure of a JavaScript function.

    • ramon156 10 hours ago
      It's just an example. It doesn't matter what the function name is, and it doesn't matter where e comes from. No need to include it.
  • tromp 10 hours ago
    > Alonzo Church developed the lambda calculus in 1929.

    His first publication that showed the elements of the lambda calculus was the 1932 paper "A set of postulates for the foundation of logic", as I cited in my recent paper [1]. It's quite possible he worked on it prior to 1932, but I don't know of any credible evidence on that (would be very interested to learn about any).

    > Wait! How the heck is this a "programming" language?

    > At first glance, this simple language seems to lack both recursion and iteration, not to mention numbers, booleans, conditionals, data structures and all the rest. How can this language possibly be general-purpose?

    What most stops lambda calculus from being a programming language is that it doesn't directly support I/O. However, one can adopt some very simple conventions for representing bits, lists of bits (bytes), and lists of bytes, and for letting a lambda term operate on these [2] which make the so-called Binary Lambda Calculus (BLC) a programming language.

    And a very expressive language it is, too: a BLC self-interpreter [4] can be as small as these 170 bits

        01000110100001000
        00001100000010111
        00110000111111100
        00101110011111110
        00000111100000010
        11101110011011110
        01111111100001111
        11110000101111010
        01110100101111101
        00101101010011010
    
    which encodes the term
    
        (λ11)(λ(λλλ1(λλ2(1(λ6(λ2(6(λλ3(λλ23(14))))(7(λ7(λ31(21)))))))(41(111))))(11))
    
    in De Bruijn notation, with lambda diagram [3]

        ┬─┬ ────────────────────────────────────────────┬─┬
        └─┤ ──────┬───────────────┬──────────────────── ├─┘
          │ ──────┼───┬───────────┼─┬─────────┬──────── │
          │ ┬─────┼───┼───────────┼─┼─────────┼──────── │
          │ │ ┬───┼───┼───────────┼─┼─────────┼──────── │
          │ │ ┼─┬─┼───┼───────────┼─┼─────────┼─┬─┬─┬─┬ │
          │ │ │ │ ┼─┬─┼───────────┼─┼──────── └─┤ └─┤ │ │
          │ │ │ │ │ ┼─┼─┬─────────┼─┼─┬──────   │   ├─┘ │
          │ │ │ │ │ │ │ ┼───────┬ │ ┼─┼───┬──   ├───┘   │
          │ │ │ │ │ │ │ ┼───┬───┼ │ │ ┼─┬─┼─┬   │       │
          │ │ │ │ │ │ │ │ ┬─┼───┼ │ │ └─┤ ├─┘   │       │
          │ │ │ │ │ │ │ │ ┼─┼─┬─┼ │ │   ├─┘     │       │
          │ │ │ │ │ │ │ │ └─┤ ├─┘ │ ├───┘       │       │
          │ │ │ │ │ │ │ │   ├─┘   ├─┘           │       │
          │ │ │ │ │ │ │ ├───┘     │             │       │
          │ │ │ │ │ │ ├─┘         │             │       │
          │ │ │ │ │ └─┤           │             │       │
          │ │ │ │ │   ├───────────┘             │       │
          │ │ │ │ ├───┘                         │       │
          │ │ │ ├─┘                             │       │
          │ │ └─┤                               │       │
          │ │   ├───────────────────────────────┘       │
          │ └───┤                                       │
          │     ├───────────────────────────────────────┘
          └─────┘
    
    10 times smaller than the 7 lines of R5RS Scheme

        (define (eval e env) (cond
          ((symbol? e)       (cadr (assq e env)))
          ((eq? (car e) 'λ)  (cons e env))
          (else              (apply (eval (car e) env) (eval (cadr e) env)))))
        (define (apply f x)
          (eval (cddr (car f)) (cons (list (cadr (car f)) x) (cdr f))))
        (display (eval (read) '())) (newline)
    
    > This code will read a program from stdin, parse it, evaluate it and print the result.

    Except that it leaves all the actual parsing to the "read" library function.

    In contrast, the BLC code does all parsing itself. One of the neatest tricks is how it represents the environment as a list built with

          cons' =  \x\y\zx\zy. zx x (zy y)
    
    which allows a list of bits like "1110" (the code for de Bruijn index 3) to index the environment by simply applying it to the environment.
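    For comparison, the same closure-based evaluation strategy fits in a few lines of Python as well, even with a hand-written s-expression parser included (a sketch of mine, not from the article; the function names are illustrative):

```python
# Sketch: a closure-based lambda-calculus evaluator in Python,
# including a tiny hand-written s-expression parser.

def tokenize(s):
    return s.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    t = tokens.pop(0)
    if t == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return lst
    return t

def evaluate(e, env):
    if isinstance(e, str):                 # variable reference
        return env[e]
    if e[0] == "lambda":                   # (lambda (v) body) -> Python closure
        return lambda x: evaluate(e[2], {**env, e[1][0]: x})
    f = evaluate(e[0], env)                # application: eval operator, then operand
    return f(evaluate(e[1], env))

# The identity function applied to another identity function:
prog = parse(tokenize("((lambda (x) x) (lambda (y) y))"))
ident = evaluate(prog, {})
assert ident("marker") == "marker"
```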

    [1] https://www.mdpi.com/1099-4300/28/5/494 "The Largest Number Representable in 64 Bits"

    [2] https://gist.github.com/tromp/86b3184f852f65bfb814e3ab0987d8...

    [3] https://tromp.github.io/cl/diagrams.html

    [4] https://github.com/tromp/AIT/blob/master/ait/int.lam

  • sudhir0112 4 hours ago
    Interesting topic!
  • ErroneousBosh 5 hours ago
    Isn't this just implementing Lisp in Lisp, somehow?

    Also maybe I'm just not a very good programmer, but I've never seen the point of lambda calculus. Everything written in Lisp or Scheme just seems to be an obfuscated and contrarian way of writing something that would be much simpler in almost any other language.

    • andoando 30 minutes ago
      Imagine you went back 100 years and someone said, "Come up with a mathematical system that can express any sequence of logical steps." Do you imagine what you would deliver would be a few primitives and a few simple rules, with a "here you go, this is fully complete!"? It's actually quite remarkable that Church/Turing didn't start off from primitives like if statements, loops, etc.

      Lambda calculus is from the 1930s and predates computers; its point is that it is a bare-bones model of computation. It doesn't make much sense to compare it to modern languages in efficacy, as that would be to imagine someone came up with lambda calculus in 2010 alongside Java, C, Python, etc.

      The super cool thing about it is that it is capable of expressing ALL the computation you know today from those few primitives. An "if" statement, for example, is λb.λx.λy. b x y
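      That encoding runs as-is in Python, with lambda as the only construct (the names TRUE/FALSE/IF are labels I'm adding for readability):

```python
# Church booleans: a boolean is a function that picks one of two arguments.
TRUE  = lambda x: lambda y: x
FALSE = lambda x: lambda y: y

# "If" is then just application: IF b x y == b x y
IF = lambda b: lambda x: lambda y: b(x)(y)

assert IF(TRUE)("then")("else") == "then"
assert IF(FALSE)("then")("else") == "else"
```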

    • mattmight 4 hours ago
      Original article author here.

      One of the interesting things about the lambda calculus is its universality: by itself, it's a complete foundation for computation.

      Here's a different old post of mine showing how to build the rest of the programming language, all in a minuscule subset of Python that is the pure lambda calculus:

      https://matt.might.net/articles/python-church-y-combinator/

      You can even extract recursion out of the Y combinator or the more primitive U combinator -- out of nothing but lambdas!

      So, it's lambdas all the way down.
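      A Python sketch of that extraction (the Z combinator below is the strict-evaluation variant of Y; the linked post derives these properly):

```python
# U combinator: recursion via self-application.
U = lambda f: f(f)
fact = U(lambda self: lambda n: 1 if n == 0 else n * self(self)(n - 1))
assert fact(5) == 120

# Z combinator: the strict-evaluation variant of Y, which abstracts
# the self-application away so the recursive call looks ordinary.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))
fact2 = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
assert fact2(5) == 120
```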

      Another interesting thing about the lambda calculus is that it wasn't intended to be a programming language. When Alonzo Church created it, there were no computers to program.

      Alonzo Church was trying to solve problems in the foundations of mathematics.

      But, untyped lambda calculus has a "bug" that makes it problematic for mathematics -- the self application that enables recursion is a problem if you're a logician who cares about soundness, but it's fantastic if you're a programmer.

      I don't think of functional languages as obfuscating. I think of them as terse and expressive. They let me most directly encode the model in my head as running code.

      • ErroneousBosh 3 hours ago
        Thanks, I'll dig into that post later!

        I think what I don't get is a mental model of what they do, and how that maps onto problems I write code to solve. It's possible it's just that s-expressions are good at solving a problem I don't have and don't really understand.

    • klibertp 4 hours ago
      You should be familiar with the terms "essential" and "accidental complexity", right?

      Lambda calculus, primitive recursion, Brainf*ck (Turing machine) are all models that discard all of the accidental complexity - all of it: syntax, higher-level constructs, abstractions, etc. What's left is the bare computation itself, amenable to analysis.

      It's not meant to be practical - it's a research vehicle that shows what's possible and what's not possible in the realm of ideas, without looking at any real-world constraints (you would, in practice, want to see the result of your computation in less than 100 years - that's not the consideration when using one of the models).

      So yeah, it's not something you use, it's something you learn to arm your brain with tools that will help you debug and write complex code that works. It's really worth spending a year or two deep-diving into these "pure theory CS" things; considering that you're going to be writing code for 20+ years (AI allowing), spending 10% of that time learning the foundations on which everything else rests is not that big of a task.

      • brazzy 3 hours ago
        I'm not sure the concepts of accidental vs. essential complexity really applies here. Is it really "accidental complexity" that is being removed when the result is harder to understand and less practically usable?
        • klibertp 3 hours ago
          "Essential" here means something that cannot, in any way, be further removed from consideration. In this view, things that simplify understanding and make the program more practical are not essential - they are add-ons on top of the most primitive computational model(s). You can drastically simplify the reading of lambda calculus by giving it direct support for integers, for example - however, you have now polluted the model with an additional element that you need to describe and always keep in mind.

          Using Church encoding lets you get integers without introducing any new elements to the model, keeping the model simpler - but yes, the implementations of programs within that model will get much more complex to read.
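          For instance, Church numerals in Python (a sketch; to_int is only an external viewing aid, not part of the model):

```python
# Church numerals: the number n is "apply f to x, n times".
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
ADD  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Viewing aid only -- not part of the model itself.
    return n(lambda k: k + 1)(0)

TWO   = SUCC(SUCC(ZERO))
THREE = SUCC(TWO)
assert to_int(ADD(TWO)(THREE)) == 5
```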
    • embedding-shape 5 hours ago
      > Everything written in Lisp or Scheme just seems to be an obfuscated and contrarian way of writing something that would be much simpler in almost any other language.

      Probably just a "familiar with what you know" thing, since I and others have the straight-up opposite experience: anything that isn't s-expressions seems needlessly obfuscated, with lots of special characters and syntax all over the place instead of just one syntax for everything, which makes programs so much simpler.

      But you're hardly alone with that sentiment, just like I'm also not alone, so either just "brain wired that way" or something similar. The discussion is as old as the internet, highly subjective and impossible to measure, so around we go.

      • ErroneousBosh 3 hours ago
        So what kind of things do you use s-expressions for? Is there something they're really good at that I don't know about yet?
        • embedding-shape 3 hours ago
          Everything? The full projects are in Lisp syntax (s-expressions), not the Algol-like (C-like) syntax most people are familiar with. Personally I mostly use Clojure, but lots of Lisp-like languages offer the same (simpler) programming experience.
    • anthk 2 hours ago
      Simpler? Obfuscated? Lisp can be self-defining:

      https://www.t3x.org/zsp/index.html

      Half of CS and discrete math (sets, permutations...) becomes self-evident with simple code.

      Tcl/Tk, on the other hand, gave me nightmares even with simple code using upvar.

      See this book? https://discrete.openmathbooks.org/dmoi3/dmoi.html

      With Zenlisp or any other Lisp you could basically do all the exercises as if they were a walk in the park, the definitions of the functions becoming self-evident, as if they were bricks.

      A classical factorial example in Zenlisp, as numbers are not actual 'numbers' but Lisp lists:

          (require 'nmath)
      
          (define (fac n)
            (fi '#1 '#1 n))
      
          (define (fi a b n)
            (or (and (> b n) a)
             (fi
              (* a b)
              (+ b '#1)
              n)))
      
           (fac '#10)
           '#3628800
      
      Here '#3628800 is just (3 6 2 8 8 0 0), as Zenlisp is written almost from the ground up to teach you how everything can be built from the Peano axioms or something very close (set theory and the like).

      After tracing the 'fi' function, this is how it works recursively:

          + (fi #1 #1 #10)
          + (fi #1 #2 #10)
          + (fi #2 #3 #10)
          + (fi #6 #4 #10)
          + (fi #24 #5 #10)
          + (fi #120 #6 #10)
          + (fi #720 #7 #10)
          + (fi #5040 #8 #10)
          + (fi #40320 #9 #10)
          + (fi #362880 #10 #10)
         + (fi #3628800 #11 #10)
      
      
      In every iteration, the input values for fi are:

      a = a * b

      b = b + 1

      n = always 10

      When b hits 11, which is bigger than 10, fi returns 'a'. So the factorial unfolds as 10! = 10·9! = 10·9·8! ... and so on.
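      The same accumulator recursion in Python, with ordinary machine integers standing in for Zenlisp's list-based numerals (a transcription, not Zenlisp itself):

```python
def fi(a, b, n):
    # a accumulates the product, b counts upward; stop once b exceeds n.
    return a if b > n else fi(a * b, b + 1, n)

def fac(n):
    return fi(1, 1, n)

assert fac(10) == 3628800
```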

      Read nmath.l and you'll discover how you can do arithmetic (and far more) from the basics. Imath.l implements complex numbers with just atoms (core units in Lisp). Rmath.l operates on the whole R domain.

    • empath75 3 hours ago
      Lambda calculus is a mathematical formalism that captures a lot of what we care about in terms of computation with a few primitives and some rewrite rules. It is not a programming language, by itself.
      • t0mpr1c3 37 minutes ago
        It's Turing complete. What else do you want?