I find multiple "strange" flaws in the article, despite my appreciation of Ada _and_ of the article as an essay:
* The article claims only Ada has true separation of implementation from specification (the interface). But as far as I can reason, JavaScript, for example, is perfectly able to define "private" elements (not exported by an ES6 module) that remain usable inside the module that declares them. If this isn't the "syntactical" (and semantic) separation the article ascribes to Ada, what difference is the article trying to point out?
* Similarly, Java is mentioned, where `private` apparently (according to the article) makes the declaration "visible to inheritance, to reflection, and to the compiler itself when it checks subclass compatibility". All of this is false if I remember my Java correctly: a private declaration is _not_ visible to inheritance, and consequently the compiler can ignore it in a subclass, which works much the same as it would without it; the "compatibility" is a guarantee by much the same consequence.
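The first bullet's JavaScript claim can be sketched as follows (hypothetical names throughout; an IIFE stands in for module scope, since a single snippet can't span two files):

```javascript
// An ES module can keep helpers "private" simply by not exporting them.
// Here the IIFE plays the role of the module file: only the returned
// object corresponds to what `export` would make visible.
const api = (() => {
  // module-private: never exported, so importers cannot name these
  const secretFactor = 7;
  function scale(n) { return n * secretFactor; }

  // the module's public surface; in a real file only `triple` has `export`
  return { triple: (n) => scale(n) / secretFactor * 3 };
})();

console.log(api.triple(5));           // 15
console.log("secretFactor" in api);   // false -- the name never escapes
```

Whether this matches Ada's notion of a specification that is "syntactically absent" from clients is exactly the question being raised.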
I am still reading the article, but having discovered the above points, it detracts from my taking it as seriously as I set out to -- wanting to identify value in Ada that we "may have missed" -- a view the article very much wants to front.
Ada was also ignored because the typical compiler cost tens of thousands of dollars. No open-source or free compiler existed during the decades when popular languages could be had for free.
Ada’s failure to escape its niche is overdetermined.
Given the sophistication of the language and the compiler technology of the day, there was no way Ada was going to run well on 1980s microcomputers. Intel built the i432 “mainframe on a chip” with a bunch of Ada concepts baked into the hardware for performance, and it was still as slow as a dog.
And as we now know, microcomputers later ate the world, carrying along their C and assembly legacy for the better part of two decades, until they got fast enough and compiler technology got good enough that richer languages were plausible.
The first validated compiler for Ada that ran on the IBM PC was released in 1983.
The third validated compiler ran on the Western Digital “Pascal MicroEngine” running the UCSD p-system with 64K memory. The MicroEngine executed the byte code from the p-system natively, which was an interesting approach.
I think you should do more research on this subject.
I’m not saying it wasn’t possible, I’m saying the larger ecosystem was never going to embrace a language that was as heavyweight as Ada. In 1983, most PC system software was written in assembly!
I sometimes wonder what "Turbo Ada" would have looked like, but I think it would probably have looked like later versions of Borland Pascal. Things like generics and exceptions would have taken some of the "turbo" out of the compiler and runtime -- the code generator didn't even get a non-peephole optimizer until 32-bit Delphi; anything more would have been too slow.
It might be nice to have Ada's tasks driven by DOS interrupts, though. I think GNAT did this.
I have not seen it, but there is something close to what you ask about: Turbo Modula-2 (an implementation of Modula-2 written by Martin Odersky), as both Modula-2 and Pascal were languages invented by Niklaus Wirth that look very similar to Ada:
"Shortly before we finished our compiler, Borland came out with Turbo Pascal, and they were considering going into the Modula-2 market as well. In fact, Borland decided to buy our Modula-2 compiler to be sold under the name of Turbo Modula-2 for CP/M alongside an IBM PC version they wanted to develop. We offered to do the IBM PC version for them, but they told us they had it already covered. Unfortunately that version took them much longer than planned. By the time it came out, three or four years later, their implementor team had split from the company, and it became known as TopSpeed Modula-2. In the absence of an IBM-PC version, Borland never put any marketing muscle behind Turbo-Modula-2, so it remained rather obscure."
-- https://www.artima.com/articles/the-origins-of-scala
I’ve never directly played with Ada but my understanding is that it was very much both.
Ada includes a number of critical abstractions that require either dynamic runtime code (slow runtime) or the proverbial sufficiently smart compiler (slow compile-time).
These were for good reasons, like safety and the need to define concurrent systems within the language. But they were too heavyweight for the commodity hardware of the era.
Nowadays, languages like Go, C++, Java, Rust, … have no trouble with similar abstractions because optimizers have gotten really good (particularly with inlining) and the hardware has cycles to spare.
I had to take some course that was something like "Programming Language Theory". As a result I had to look at the specifications for dozens of different programming languages. I remember looking at the features of some languages and scratching my head trying to figure out how some of this would ever be practically implemented by a compiler. Later on I found out lots of stuff is just implemented by a runtime anyway, which led me to realize that those fancy language features are often better as a library.
I took a course exactly like that. I wonder if we went to the same school, or it’s due to curriculum standardization. The professor was particularly enthusiastic about Ada, so I had assumed the course was largely his creation.
A huge factor. I used Ada for years, and the fact that everyone I worked with did hobby projects in other languages didn’t help it. And most of us liked Ada.
It had other warts: the string handling wasn’t great, which was a huge problem. It was slow, too, at a time when that mattered more (we had C and Ada in our code base). I remember the concurrency not using the OS's facilities, so the one place we used it was a pain. HP-UX had amazing quasi-real-time extensions, so we just ran a bunch of processes.
Given some of the other issues, I’m not sure it would have mattered, but it certainly didn’t even allow the experiment to be run. I would not have wanted to compile Ada in the 1980s on that hardware. Given all the checking, the compiler must have been horribly slow (imagine compiling Rust on that same 1980s hardware).
I was a student between 1990 and 1993 and Ada was the main language. Compilation speed was not an issue. I remember that Eiffel was very slow to compile, but not Ada. Between 1994 and 1999, I worked with Ada on VAX machines. A full recompilation took 2 hours because the machine was slow, not because of the language. Other languages were similarly slow (Pascal, C). C was slow because of the lack of precompiled headers (many headers had to be parsed many times). With Ada (Alsys Ada), there were "libraries": black-box directories containing object code and already-parsed package specifications.
Between 1999 and 2002, I handled projects in Ada, C++ and Java. C++ was slightly slower than Ada (slow linking). Java was a lot faster.
Nowadays, Ada compilation is faster than C++.
Let's just be honest: even if there had been a free compiler in 1985 or earlier, there's no way that someone like Linus Torvalds or an RMS would have written various groundbreaking pieces of software in Ada. It was just in an entirely different headspace.
I was around then, and culturally there just wasn't this (legitimate) concern with safety in the more "hacker" and Unix community generally. C won headspace at the time precisely because it was minimal and close to the metal while providing the minimum of abstraction people wanted. Which was on the whole fine because the blast radius for mistakes was lower and the machines were simpler.
> while providing the minimum of abstraction people wanted
Yes, I think this is key. I wasn't around in 1985, but on every attempt to write something in Ada I've found myself fighting its standard library more than using it. Ada's stdlib is an intersection of common features found in previous century's operating systems, and anything OS-specific or any developments from the last 30 years seem to be conspicuously absent. That wouldn't be so much of a problem if you could just extend the stdlib with OS-specific features, but Ada's abstractions are closed instead of leaky.
I'm sure that this is less of a problem on embedded systems, unikernels or other close-to-hardware software projects where you have more control over the stdlib and runtime, but as much as I like Ada's type system and its tasking model, I would never write system applications in Ada because the standard library abstractions just get in the way.
To illustrate what I mean, look at the Ada.Interrupts standard library package [0] for interrupt handling, and how it defines an interrupt handler:
type Parameterless_Handler is
access protected procedure
with Nonblocking => False;
That's sufficient for hardware interrupts, as that's pretty much how a hardware interrupt works: you have an entry point address, and that's it. But on Linux the same package is used for signal handling, and a parameterless procedure is in no way compatible with the rich siginfo_t struct that the kernel offers. Moreover, because the handler is parameterless, you need to attach a separate handler to each signal just to know which signal was raised. To add insult to injury, the GNAT runtime always spawns a signal-handler thread with an empty sigprocmask before entering the main subprogram, so it's not possible to use signalfd to work around this issue either.
Ada's stdlib file operations suffer from closed enumerations: the file operations Create and Open take a File_Mode argument, and that argument is defined as [1]:
type File_Mode is (In_File, Inout_File, Out_File); -- for Direct_IO
type File_Mode is (In_File, Out_File, Append_File); -- for Stream_IO
That's it. No provisions for Posix flags like O_CLOEXEC or O_EXCL nor BSD flags like O_EXLOCK, and since enum types are closed in Ada there is no way to add those custom flags either. All modern or OS-specific features like dirfd on Linux or opportunistic locking on Windows are not easily available in Ada because of closed definitions like this.
Another example is GNAT.Sockets (not part of Ada stdlib), which defines these address families and socket types in a closed enum:
type Family_Type is (Family_Inet, Family_Inet6, Family_Unix, Family_Unspec);
type Mode_Type is (Socket_Stream, Socket_Datagram, Socket_Raw);
Want to use AF_ALG or AF_KEY for secure cryptographic operations, or perhaps SOCK_SEQPACKET or a SOL_BLUETOOTH socket? Better prepare to write your own Ada sockets library first.
Not really; the state of compilers pretty much sucked back then. GCC was the only real free compiler in the 80s, and it wasn't really ready for prime time until the late 80s. You were paying lots of money for a compiler no matter what language you chose. And if you were targeting a new language, the compiler was sure to suck.
Even in the late 90s Jamie Zawinski had a rant against C++. His argument for not using it? The compilers suck! C++ was the main "competitor" of Ada and it was a decade or more behind Ada through most of the time.
The "killer feature" of C++ against Ada (when it came to fighting against compiler maturity) was really that you could pretend to be writing C++ code but really just keep writing C-with-classes.
If Ada had put a modula or pascal compatibility mode in the language and produced a reference compiler that was based on a stable compiler in one of those languages, the history may have been different because people could have just written "PascAda" while waiting for the compilers to catch up.
GNAT has existed since at least the mid-90s, and in that time period plenty of companies used non-OSS compilers.
In that era, the largest blocker for Ada was that it was viewed as having a lot of overhead for things that weren't generally seen as useful (safety guarantees). The reputation was that it only mattered if you were working on military stuff, etc.
True, but at that time it was already too late. C/C++ had won.
Moreover, for a very long time GNAT had been quite difficult to build, configure and coexist with other gcc-based compilers, far more difficult than building and configuring the tool chain for any other programming language. (i.e. you could fail to get a working environment, without any easy way to discover what went wrong, which never happened with any other programming language supported by gcc)
I have no idea which was the reason for this, because whichever was the reason it had nothing to do with any intrinsic property of the language.
I do not remember when it finally became easy to use Ada with gcc, but this might have happened only a decade ago, or even more recently.
I always found it funny when Rust came about. I can't help but feel (and maybe I'm misremembering from when I first deep-dove into Ada) that Ada was our first "Rust-like" language. Maybe Delphi/Pascal is the only other really close one that became mainstream enough before Rust did?
Rust emerged from the language-enthusiast community, not a formal industry committee, and in some ways that was its superpower.
I and many others have looked at Ada with some appreciation for decades. But the actual "community" around the language was foreign to me; government, defense contractors, etc places that frankly wouldn't even hire me.
It's got appealing constructs, and I grew up with the Wirth languages so I wasn't put off by its syntax and style... and I even sometimes considered rewriting my OSS C++ pieces in it because I was so desperate for something better. But it was just a self-limiting box.
I agree, as someone who is fascinated by it. I worked for a defense contracting company, and not even they used it. It's such a strange gem of a language, so much potential lost.
The claim that it was designed for Ada was just marketing hype, like today's attempts to sell processors "designed for AI".
The concept of iAPX 432 had been finalized before Ada won the Department of Defense competition.
iAPX 432 was designed based on the idea that such an architecture would be more suitable for high level languages, without having at that time Ada or any other specific language in mind.
The iAPX designers thought that the most important feature that would make the processor better suited for high-level languages would be to not allow the direct addressing of memory but to control the memory accesses in such a way that would prevent any accesses outside the intended memory object.
The designers made many other mistakes, but an important one was that the object-based memory-access control they implemented was far too complex in comparison with what could be implemented efficiently in the available technology. Thus they could not implement everything in one chip and had to split the CPU across multiple chips, which created additional challenges.
Eventually, the "32-bit" iAPX432 was much slower than the 16-bit 80286, despite the fact that 80286 had also been contaminated by the ideas of 432, so it had a much too complicated memory protection mechanism, which has never been fully used in any relevant commercial product, being replaced by the much simpler paged memory of 80386.
The failure of 432 and the partial failure of 286 (a very large part of the chip implemented features that have never been used in IBM PC/AT and compatibles) are not failures of Ada, but failures of a plan to provide complex memory access protections in hardware, instead of simpler methods based on page access rights and/or comparisons with access limits under software control.
Now there are attempts to move some parts of memory-access control back to hardware, like CHERI on ARM, but I do not like them. I prefer simpler methods, like the conditional traps of IBM POWER, which allow cheaper checking of out-of-bounds accesses without any of the disadvantages of approaches like CHERI, which need special pointers that consume resources permanently, not only where they are needed.
The 286 worked perfectly fine. If you take a 16-bit unix and you run it on a 286 with enough memory then it runs fine.
Where it went wrong is in two areas: 1) as far as I know, the 286 does not correctly restart all instructions that reference a segment that is not present, so swapping doesn't really work as well as people would like.
2) The big problem, however, was that in the PC market, 808[68] applications had access to all (at most 640 KB of) memory. Compilers (including C compilers) had "far" pointers, etc. that would allow programs to use more than 64 KB of memory. There was no easy way to do this in 286 protected mode. Also, a lot of programs were essentially written for CP/M. Microsoft and IBM started working on OS/2, but progress was slow enough that soon the 386 became available.
The 386 of course had the complete 286 architecture, which was also extended to 32-bit. Even when flat memory is used through paging, segments have to be configured.
When I was a young lad, must have been 20, I came across some programming books, including one on programming in Ada.
I read so much of it but never wrote a line of code in it, despite trying. Couldn't get the build environment to work.
But the idea of contracts in that way seemed so logical. I didn't understand the difference this article underlines, though. I learned Java and thought interfaces were the same.
I like the article overall but the continually repeated 'Language X didn't have that until <YEAR>' is very grating after the first ten or so.
I also wish there were concrete code examples. Show me what you are talking about rather than just telling me how great it is. Put some side by side comparisons!
This is not a writeup of "Ada is better than everything else". The author is explaining how Ada achieved safety/reliability goals that your favorite language independently evolved much later on. That is why they kept bringing up year-of-arrival for comparison.
Examples would be a nice bonus but I think the author eschewed such because they weren't interested in writing a tutorial. They had a very specific point to make and stuck to it, resulting in a very informative but concise article that reads well because of its highly disciplined authorship.
You could do the same in reverse as well. Many of the features listed in the first paragraph existed before in other languages, though probably not all of them in a single language. In fact, I believe the design process (sensibly) favored best practices of existing languages rather than completely new and unproven mechanisms.
So there was considerable borrowing from PASCAL, CLU, MODULA(-2), CSP. It's possible that the elaborate system for specifying machine representations of numbers was truly novel, but I'm not sure how much of a success that was.
There are features common to Ada and Modula, but those have been taken by both languages from Xerox Mesa.
The first version of Modula was designed with the explicit goal of making a simple small language that provided a part of the features of Xerox Mesa (including modules), after Wirth had spent a sabbatical year at Xerox.
Nowadays Modula and its descendants are better known than Mesa, because Wirth and others have written some good books about them and because Modula-2 was briefly widely available for some microcomputers. Many decades ago, I had a pair of UV-EPROM memories (i.e. for a 16-bit data bus) that contained a Modula-2 compiler for Motorola MC68000 CPUs, so I could use a computer with such a CPU for programming in Modula-2, much as many early PCs could be used with their built-in BASIC interpreters. However, after switching to an IBM PC/AT-compatible PC, I never used the language again.
However, Xerox Mesa was a much superior language and its importance in the history of programming languages is much greater than that of Modula and its derivatives.
Ada has taken a few features from Pascal, but while those features were first implemented in Pascal, they had been proposed much earlier by others, e.g. the enumerated types of Pascal and Ada had been first proposed by Hoare in 1965.
When CLU is mentioned, usually Alphard must also be mentioned, as those were 2 quasi-simultaneous projects at different universities that had the purpose of developing programming languages with abstract data types. Many features have appeared first in one of those languages and then they have been introduced in the other after a short delay. Among the features of modern programming languages that come from CLU and Alphard are for-each loops and iterators.
Mesa was my first language; I used it out of college for the seven years that I worked on the Xerox Star document editor. It was the job where I learned more in 6 months than I did in 4 years of college or in my entire working career afterward.
It was by far the best language I used in my entire working career, in which I had to endure such languages as PL/1 (and PL/S), C, C++, Java, JavaScript and PHP. While Java as a language was not too bad, it still paled in features and usability compared to Mesa, and it too was influenced by Mesa.
But as was true at Xerox, it was the complete network that was revolutionary in the early 80’s. The fact that I could source-debug any machine remotely on the corporate worldwide network of over 5000 machines, and that the source code would be automatically downloaded to my machine (meaning I could easily debug from any nearby random machine), was just something I could never “easily” do elsewhere.
Mesa was missing a few things (which Cedar solved and which were used generally only within Xerox PARC, partially because at the time Cedar really only ran on Dorado-class machines), such as garbage collection; and in the case of Star it would have been much better if the language had supported OOP. For Star we had a system called Traits to support objects, but it had some serious issues IMHO (which would be fodder for a separate post).
When talking about Mesa you also need to talk about Tajo, its development environment built on top of the OS Pilot (Star also used Pilot). Both systems supported a mouse and a large bitmapped monitor and had overlapping windows (although most of Star had automatic non-overlapping windows, which was a UI usability decision).
There is also more, because the network was very important: print servers, file servers, mail servers, cloud-like storage for all of Star’s user files/desktop. All this in the very early 80’s was unheard of elsewhere. It’s very similar to what Steve Jobs missed when he saw Smalltalk, where he only really saw a new UI and missed much more that was demoed.
It was a magic place to work at the time. I left in the very late 80s for Apple, and it was a huge step backwards at the time (Apple did amazing stuff with its limited tools, but it made working not fun).
Some information may also exist in other subdirectories of "pdf/xerox".
There have been many references to Mesa in the research articles and books published towards the end of the seventies and during the eighties, but those are hard to find today, as most of them may not have been digitized. Even if they were digitized, it is hard to search through them to find the relevant documents, because you would not know from the title whether Mesa is discussed along with other programming languages.
In general, bitsavers.org is probably the most useful Internet resource about old computing hardware and software, because no secondary literature matches the original manuals of the computer vendors, which in the distant past had an excellent quality, unlike today.
Ada provides many features of Mesa, but not all of them and I regret that some Mesa features are missing from the languages that are popular today.
The loops with double exits of Python (i.e. with "else") have been inspired by Mesa, but they provide only a small subset of the features available in Mesa loops.
I prompted Claude for some demo code to appreciate the language and it did a good job. Definitely some pretty neat stuff in there that exposed some unrealized FOMO. Of course I knew of Ada for decades, but I never got into it.
"JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers."
Then:
"in Ada, the implementation of a private type is not merely inaccessible, it is syntactically absent from the client's view of the world."
Am I missing something? A JavaScript module is perfectly able to declare a private element by simply not exporting it, accomplishing what the author ascribes to Ada as "not merely inaccessible... syntactically absent from the client's view of the world". The same would go for some of the other languages the author somewhat carelessly lumps together with JavaScript.
I loved the article, and I have always had curiosity about Ada -- beyond some of the more modern languages in fact -- but I just don't see where Ada separates interface from implementation in a manner that's distinctly better or different from e.g. JavaScript modules.
Assuming we’re talking about TypeScript here, because JavaScript doesn’t have exportable types… Any instance in JavaScript, whether or not its type is exported, is just an object like any other, that any other module is free to enumerate and mess with once it receives it. In Ada there are no operations on an instance of a private type except the ones provided by the source module.
In other words, if module X returns a value x of unexported type T to module Y, code in module Y is free to do x.foo = 42.
To preempt the obvious: yes, I know _everything_ (nearly) in JavaScript is an object, but a module exporting a `Function` can expect the caller to use the function, not enumerate it for methods. And the function can use a declaration in the module that wasn't exported, with the caller none the wiser about it.
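A small sketch of the distinction being drawn above (hypothetical names; comments stand in for the two module files, since a single snippet can't span them):

```javascript
// Even when a class is never exported, every instance handed to another
// module is an ordinary object whose representation the receiver can
// enumerate and mutate.

// --- "module X": the class itself is never exported ---
class Counter {
  constructor() { this.n = 0; }
  bump() { this.n += 1; }
}
function makeCounter() { return new Counter(); } // only this gets exported

// --- "module Y": receives an instance without ever seeing the type ---
const c = makeCounter();
c.bump();
console.log(Object.keys(c)); // [ 'n' ] -- the representation is visible
c.n = 42;                    // and freely mutable; nothing enforces the interface
console.log(c.n);            // 42
```

This is the gap the Ada comparison is pointing at: hiding the *name* of a type is not the same as hiding its *representation* from clients.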
The US Air Force intended to use Ada, but had to use JOVIAL instead because Ada took so long to be developed. Most people have never heard of JOVIAL, but it still exists in the USAF as a legacy.
I worked with JOVIAL as part of my first project as a programmer in 1981, even though we didn't even have a full JOVIAL compiler there yet (one existed elsewhere). I remember all the talk about the future being Ada, but it was only an incomplete specification at the time.
JOVIAL had been in use within the US Air Force for more than a decade before the first initiative for designing a unique military programming language, which resulted in Ada.
JOVIAL had been derived from IAL (December 1958), the predecessor of ALGOL 60. However JOVIAL was defined before the final version of ALGOL 60 (May 1960), so it did not incorporate a part of the changes that had occurred between IAL and ALGOL 60.
The timeline of Ada development has been marked by increasingly specific documents elaborated by anonymous employees of the Department of Defense, containing requirements that had to be satisfied by the competing programming language designs:
1975-04: the STRAWMAN requirements
1975-08: the WOODENMAN requirements
1976-01: the TINMAN requirements
1977-01: the IRONMAN requirements
1977-07: the IRONMAN requirements (revised)
1978-06: the STEELMAN requirements
1979-06: "Preliminary Ada Reference Manual" (after winning the competition)
Already the STRAWMAN requirements from 1975 contained some features taken from JOVIAL, which the US Air Force used and liked, so they wanted that the replacement language should continue to have them.
However, starting with the IRONMAN requirements, some features originally taken verbatim from JOVIAL were replaced by greatly improved original features. For example, function parameters specified as in JOVIAL were replaced by the requirement to specify the behavior of the parameters regardless of their implementation by the compiler: the programmer specifies behaviors like "in", "out" and "in out", and the compiler freely chooses how to pass the parameters, e.g. by value or by reference, depending on which method is more efficient.
This is a huge improvement over how parameters are specified in languages like C or C++ and in all their descendants. The most important defects of C++, which caused low performance for several decades and which are responsible for much of the current complexity of C++, stem from the inability of C++ to distinguish between "out" parameters and "in/out" parameters. This misfeature is the reason for a lot of unnecessary things in C++: constructors as something different from normal functions (and which cannot signal errors other than by exceptions), copy constructors as distinct from assignment, the "move" semantics introduced in C++11 to solve the performance problems that plagued C++ previously, etc.
There are a lot of autodidact researchers out there who existed before AI, and, AI-assisted, their research and output has sped up significantly. To call them a bot is, to put it mildly, to miss the forest for the trees.
My work on DoD Ada projects tended to focus on DOD-STD-2167 (mid to late 1980s).
Sadly, the review meetings focused on document structure instead of thoughtful software design and analysis. Ada didn't help; it was cumbersome to get working well, and Ada experience in the contracting agencies was low. The waterfall approach made the projects slow to implement.
The evidence is that the article’s writing is terrible. It repeats the same rhetorical devices over and over, dressing up a series of facts in false profundity, because there’s no actual authorial insight here. It’s just “write a well-researched article that demonstrates how ahead of its time the Ada language was” + matmul.
Neither of those standards are what I’m talking about.
Obviously this article was highly pleasing to the hn audience as it’s currently sitting at #1. It’s still garbage, because it doesn’t have any interesting ideas behind it.
Certainly not commensurate with its length.
I really don't want this to be AI writing because I enjoyed it, but as other commenters have pointed out, the rate of publishing (according to the linked Twitter account) is very rapid. I'm worried that I can't tell.
That does look a little suspicious. There do exist AI-based tools now that can take other people's blogs and rewrite them with other words. Those are all the rage over on Reddit subs on blogging for ad revenue ...
>the rate of publishing (according to the linked Twitter account) is very rapid.
I've written almost 50 blog posts in the last 3 years. All in draft, never published, mostly because of crippling imposter syndrome and fear of criticism. But every now and then I wake up full of confidence and think "this is it. today I'll click publish I don't give a fuck. All in". Never happens. Maybe this author was in the same boat until a month ago. I know there's a high chance it's just a bot, but I can understand, if it's not, how devastating it has to be to overcome the fear of showing your thoughts to the world only to be labeled a bot. If it's not already obvious, English is not my first language and I've used LLMs to check my grammar and improve the style. Maybe all my posts smell like ChatGPT now and this just adds to the fear of being dismissed as slop.
LLMs do not currently improve the style of typical HN writing. Maybe someday they will; this article is less painfully bad than those of a few months ago.
The main problem with this article is that it appears to have been basically written out of whole cloth by the LLM, there’s no novel insight here about Ada beyond what you could fit in a short prompt + the Wikipedia article.
I think so. Who writes something and why are important context for what we do with the information. It's an issue with the lack of disclosure, not AI in general.
Most longform readers will assume an author has deep expertise and spent a lot of time organizing their thoughts, which lends their ideas some legitimacy and trust. For a small blog, an 8,000 word essay is a passion project.
But if AI is detected in the phrasing and not disclosed, it raises a lot of questions. Did AI write the whole thing, or just light edits? Are the facts AI-generated too, and not from personal experience? What motivated someone to produce this content if they were going to automate parts of its creation; why would they value the output more than the process?
imo, the real value of Ada/SPARK today is that it enforces a clear split between specification and implementation, which is exactly what your LLM needs.
You define the interface, types, and pre/post conditions you want in the .ads file, then let the agent loose writing the .adb body file. The language’s focus on readability means your agent has no problem reading and cross-referencing specs. The compiler and proof tools verify that the body implements the spec.
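As a sketch of what that split looks like (a hypothetical stack package, not from the article), the contract lives in the spec and the agent only ever touches the body:

```ada
--  stack.ads: the specification the human (or the prover) owns.
package Stack is
   Capacity : constant := 100;
   function Is_Empty return Boolean;
   function Is_Full  return Boolean;
   procedure Push (X : Integer)
     with Pre  => not Is_Full,
          Post => not Is_Empty;
   procedure Pop (X : out Integer)
     with Pre  => not Is_Empty,
          Post => not Is_Full;
end Stack;

--  stack.adb: the body the agent fills in; the compiler checks it
--  against the spec, and the contracts are checked or proved.
package body Stack is
   Data : array (1 .. Capacity) of Integer;
   Top  : Natural := 0;

   function Is_Empty return Boolean is (Top = 0);
   function Is_Full  return Boolean is (Top = Capacity);

   procedure Push (X : Integer) is
   begin
      Top := Top + 1;
      Data (Top) := X;
   end Push;

   procedure Pop (X : out Integer) is
   begin
      X   := Data (Top);
      Top := Top - 1;
   end Pop;
end Stack;
```

With SPARK, gnatprove can attempt to discharge the pre/post conditions statically; with plain Ada, GNAT's -gnata switch turns them into run-time checks.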
> The verbosity was deliberate — Ichbiah wanted programs to be readable by people other than their authors, and readability over time favours explicitness — but it was experienced as bureaucratic and un-hacker-like, and the programming culture that formed in the 1980s and 1990s was organised around the proposition that conciseness was sophistication. Ada was the language of procurement officers. C was the language of people who understood machines. The cultural verdict was delivered early and never substantially revisited.
Not really. That was written by someone who doesn't really know the language and is writing from a position of hearsay.
Ada is "verbose" in that it has fairly rigorous type specification. It was verbose in comparison to languages that had weak or primitive typing. A lot of the "bureaucracy" in the language is being very specific about types to catch bugs.
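A tiny, hypothetical example of that specificity: a range-constrained subtype turns what a weakly typed language would silently accept into a checked error.

```ada
procedure Demo is
   --  Out-of-range values are caught instead of silently wrapping.
   subtype Percent is Integer range 0 .. 100;
   P : Percent := 90;
begin
   P := P + 15;  --  raises Constraint_Error at run time: 105 is not a Percent
end Demo;
```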
Ada 83 did have a problem in that it lacked [interfaces]. This could sometimes limit code reuse.
Ada was designed to be "readable" but so was Pascal and many other languages (and, more recently for instance, Python). "Readability" in those days mainly meant preferring keywords over operators and allowing for infix notation with proper order-of-operations.
I misspoke; I actually meant interfaces. Ada forbade multiple inheritance to avoid the diamond problem, but Java implemented interfaces as a solution and Ada later adopted them.
I remember learning Ada at uni in the 90s and not loving it because of the syntax and it being slow to work with. I also remember the Ariane 5 rocket crash in the late 90s being blamed on a software bug, and the software being written in Ada. Now I understand that it was not a pure software issue, but still, all that safety did not prevent the major disaster that it was.
The Ariane 5 crash was caused by re-using a module from the Ariane 4 in the new rocket without testing. Management declared the module to be "proven", but it was only designed and proven within the flight envelope of the Ariane 4.
The C people tried to blame the crash on Ada's use of exceptions. At the time, exceptions were controversial. The actual crash came after an exception was fired and the C folks insisted that C would have just ignored the error state and carried on. Except that the exception was actually the software manifestation of a hardware signal that would have crashed a C program as well.
Ada had a lot of haters, mainly because it was imposed top-down in a lot of organizations. But also because there was a lot of money behind C and other technologies. C++ was vaporware at the time and was able to promise to be a better version of everything Ada was (just you wait!).
Personally Ada was the coolest language I've ever learned and I still love playing with it.
The origin of all sum types is in "Definition of new data types in ALGOL x", published by John McCarthy in October 1964, who introduced the keyword UNION for such types (he proposed "union" for sum types, "cartesian" for product types, and also operator overloading for custom types).
John McCarthy, the creator of LISP, had also many major contributions to ALGOL 60 and to its successors (e.g. he introduced recursive functions in ALGOL 60, which was a major difference between ALGOL 60 and most existing languages at that time, requiring the use of a stack for the local variables, while most previous languages used only statically-allocated variables).
The "union" of McCarthy and of the languages derived from his proposal is not the "union" of the C language, which has used the McCarthy keyword, but with the behavior of FORTRAN "EQUIVALENCE".
The concept of "union" as proposed by McCarthy was first implemented in the language ALGOL 68, then, as you mention, some functional languages, like Hope and Miranda, have used it extensively, with different syntactic variations.
Definitely: if you don't have the C-style "union" user-defined type, you should use this keyword for your sum types. Many languages don't have this feature - which is an extremely sharp blade intended only for experts - and that's fine. You don't need an Abrams tank to take the kids to school, beginners should not learn to fly in the F-35A, and the language for writing your CRUD app does not need C-style unions.
If Rust didn't have (C-style) unions, then its enum should have been named union instead. But it does, so they needed a different name. As we work our way through the rough edges of Rust, maybe this will stick out more and annoy me, but given that Rust 1.95 just finally stabilized core::range::RangeInclusive, the fix for the wonky wheel that is core::ops::RangeInclusive, we're not going to get there any time soon.
I've written a few small projects in Ada, and it's a better language than it gets credit for.
Yes, it's verbose. I like verbosity; it forces clarity. Once you adjust, the code becomes easier to read, not harder. You spend less time guessing intent and more time verifying it. Or verify it, ignore what you verified, then go back and remind yourself you're an idiot when you realize the code you ignored was right. That might just be me.
In small, purpose-built applications, it's been pleasant to code with. The type system is strict but doesn't yell at you a lot. The language encourages you to be explicit about what the program is actually doing, especially when you're working close to the hardware, which is a nice feature.
It has quirks, like anything else. But most of them feel like the cost of writing better, safer code.
Ada doesn't try to be clever. It tries to be clear, even if it is as clear as mud.
Ada is a language that had a lot of useful features much earlier than any of the languages that are popular today, and some of those features are still missing from the languages easily available today.
In the beginning, Ada was criticized mainly for 2 reasons: it was claimed to be too complex, and it was criticized for being too verbose.
Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada. In many cases this is because they started as simpler languages to which extra features were added later; since the need for such features had not been anticipated during the initial language design, adding them later was difficult, increasing the complexity of the updated language.
The criticism about verbosity is correct, but it could easily be solved by preserving the abstract Ada syntax and just replacing many tokens with less verbose symbols. This can easily be done with a source preprocessor, but this is avoided in most places, because then the source programs have a non-standard appearance.
It would have been good if the Ada standard had been updated to specify a standardized abbreviated syntax besides the classic syntax. This would not have been unusual, because several old languages have specified abbreviated and non-abbreviated syntactic alternatives, including languages like IBM PL/I or ALGOL 68. Even the language C had a more verbose syntactic alternative (with trigraphs), which has almost never been used, but nonetheless all C compilers had to support both the standard syntax and its trigraph alternative.
However, the real defect of Ada has been neither complexity nor verbosity, but expensive compilers and software tools, which have ensured its replacement by the free C/C++.
The so-called complexity of Ada has always been mitigated by the fact that besides its reference specification document, Ada always had a design rationale document accompanying the language specification. The rationale explained the reasons for the choices made when designing the language.
Such a rationale document would have been extremely useful for many other programming languages, which frequently include some obscure features whose purpose is not obvious, or which look like mistakes, even if sometimes there are good reasons for their existence.
When Ada was introduced, it was marketed as a language similar to Pascal. The reason is that at that time Pascal had become the language most frequently used for teaching programming in universities.
Fortunately the resemblances between Ada and Pascal are only superficial. In reality the Ada syntax and semantics are much more similar to earlier languages like ALGOL 68 and Xerox Mesa, which were languages far superior to Pascal.
The parent article mentions that Ada includes in the language specification the handling of concurrent tasks, instead of delegating such things to a system library (task = term used by IBM since 1964 for what now is normally called "thread", a term first used in 1966 in some Multics documents and popularized much later by the Mach operating system).
However, I do not believe that this is a valuable feature of Ada. You can indeed build any concurrent applications around the Ada mechanism of task "rendez-vous", but I think that this concept is a little too high-level.
It incorporates 2 lower level actions, and for the highest efficiency in implementations sometimes it may be necessary to have access to the lowest level actions. This means that sometimes using a system library for implementing the communication between concurrent threads may provide higher performance than the built-in Ada concurrency primitives.
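For readers unfamiliar with the mechanism, a minimal rendezvous looks roughly like this (names hypothetical); the point above is that it fuses two lower-level actions, synchronization and data transfer, into one high-level operation:

```ada
procedure Rendezvous_Demo is
   task Buffer is
      entry Put (X : Integer);
   end Buffer;

   task body Buffer is
      Value : Integer := 0;
   begin
      accept Put (X : Integer) do
         --  The caller is suspended only while this block runs:
         --  synchronization and data transfer happen together.
         Value := X;
      end Put;
      --  Buffer continues independently with Value ...
   end Buffer;
begin
   Buffer.Put (42);  --  blocks until Buffer reaches the matching accept
end Rendezvous_Demo;
```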
Verbosity is a feature, not a bug. Programming is a human activity and thus should use human language and avoid encoded forms that require decoding to understand. The use of abbreviations should be avoided, as it obfuscates the meaning and purpose of code from a reader.
The programming community is strongly divided between those who believe that verbosity is a feature and not a bug and those who believe that verbosity is a bug and not a feature.
A reconciliation between these 2 camps appears impossible. Therefore I think that the ideal programming language should admit 2 equivalent representations, to satisfy both kinds of people.
The pro-verbose camp argues that they cannot remember many different symbols, so they prefer long texts using keywords resembling a natural language.
The anti-verbose camp, to which I belong, argues that they can remember mathematical symbols and other such symbols, and that for them it is much more important to see on a screen as much of the program as possible, to avoid the need of moving back and forth through the source text.
Both camps claim that what they support is the way to make the easiest to read source programs, and this must indeed be true for themselves.
So it seems that it is impossible to choose rules that can ensure the best readability for all program readers or maintainers.
My opinion is that source programs must not be stored and edited as text, but as abstract syntax trees. The program source editors and viewers should implement multiple kinds of views for the same source program, according to the taste of the user.
It is not that I cannot remember the symbols - I don't want to; I want the language to plainly explain itself to me. Furthermore, every language has its own set of unique symbols. As a new reader of a language, you first have to familiarize yourself with the new symbols. I remember my first few times reading Rust... It still makes my head spin. I had to keep looking up what everything did. If the plain keyword doesn't directly tell you what it's doing, at least it hints at it.
To be clear Ada specifically talks about all this in the Ada reference manual in the Introduction. It was specifically designed for readers as opposed to writers for very good reasons and it explains why. It's exactly one of the features other languages will eventually learn they need and will independently "discover" some number of years in the future.
Rust has a complex semantics, not a complicated syntax. The syntax was explicitly chosen to be quite C/C++ like while streamlining some aspects of it (e.g. the terrible type-ascription syntax, replaced with `let name: type`).
I agree that the use of symbols becomes a problem when you use many programming languages and each of them uses different symbols.
This has never been solved, but it could have been if there had been a standard about the use of symbols in programming languages and all languages had followed it.
Nevertheless, for some symbols this problem does not arise, e.g. when traditional mathematical symbols are used, which are now available in Unicode.
Many such symbols have been used for centuries and I hate their replacements that had to be chosen due to the constraints of the ASCII character set.
Some of the APL symbols are straightforward extensions of the traditional mathematical notation, so their use also makes sense.
Besides the use of mathematical symbols in expressions, classic or Iverson, the part where I most intensely want symbols, not keywords, is for the various kind of statement brackets.
I consider the use of a single kind of statement brackets as being very wrong for program readability. This was introduced in ALGOL 60 (December 1958) as the pair "begin" and "end". Other languages followed ALGOL 60. CPL replaced the statement brackets with paragraph symbols (August 1963), and then the language B (the predecessor of C) transitioned to ASCII, so it replaced the CPL symbols with curly braces, sometime around 1970.
A better syntax was introduced by ALGOL 68, which is frequently referred to as "fully bracketed syntax".
In such a syntax different kinds of brackets are used for distinct kinds of program structures, e.g. for blocks, for loops and for conditional structures. This kind of syntax can avoid any ambiguities and it also leads to a total number of separators, parentheses, brackets and braces that is lower than in C and similar languages, despite being "fully bracketed". (For instance in C you must write "while (condition) {statements;}" with 6 syntactic tokens, while in a fully bracketed language you would write "while condition do statements done", with only 3 syntactic tokens)
If you use a fully bracketed syntax, the number of syntactic tokens is actually the smallest that ensures a non-ambiguous grammar, but if the tokens are keywords the language can still appear as too verbose.
The verbosity can be reduced a lot if you use different kinds of brackets provided by Unicode, instead of using bracket pairs like "if"/"end if", "loop"/"end loop" or the like.
For instance, one can use curly braces for blocks, angle brackets for conditional expressions or statements, double angle brackets for switch/case, bag delimiters for loops, and so on. One could choose to use different kinds of brackets for inner blocks and for function bodies, and also different kinds of brackets for type definitions.
In my opinion, the use of many different kinds of brackets is the main feature that can reduce program verbosity in comparison with something like Ada.
Moreover, the use of many kinds of brackets is pretty much self describing, like also in HTML or XML. When you see the opening bracket, you can usually recognize what kind of pattern starts, e.g. that it is a function body, a loop, a block, a conditional structure etc., and you also know how the corresponding closing bracket will look. Thus, when you see a closing bracket of the correct shape you can know what it ends, even when you had not known previously the assignment between different kinds of brackets and different kinds of program structures.
In languages like C, it is frequently annoying when you see many closing braces and you do not know what they terminate. Your editor will find the matching brace, but that wastes precious time. You can comment the closing braces, but that becomes much more verbose than even Ada.
So for me the better solution is to use graphically-distinct brackets. Unicode provides many suitable bracket pairs. There are programming fonts, like JetBrains Mono, which provide many Unicode mathematical symbols and bracket pairs.
When I program for myself, I use such symbols and I use a text preprocessor before passing the program to a compiler.
I agree. I've never understood or accepted the claim that Ada is verbose. It's simply clear and expressive. If there were some alternative concise syntax for "Ada" then I would not want to use it (because it would not be Ada).
Because that is a joke: it proposes replacements only for a small set of Ada tokens, and it is not clear how the proposal could be cleanly extended to the full set of Ada tokens.
Nevertheless, it is possible to define a complete 1-to-1 mapping of all Ada syntactic tokens to a different set of tokens.
The resulting language will have exactly the same abstract syntax as Ada, so it is definitely exactly the same language, only with a different appearance.
For a seasoned Ada programmer, changing the appearance of the language may be repugnant, but for a newbie there may be no difference between two alternative sets of tokens. This is especially true when the programmers are not native English speakers: they feel no particular loyalty to words like "begin" and "loop", so they may not see any advantage in using them instead of some kind of brackets that would replace them.
I think there is a significant difference between choosing to use words (from some language) versus using brackets like {}, () and []. With nested brackets there are often debates over placement and it is usually less clear what scope is being ended by the closing bracket.
Verbosity is a feature for small self-contained programs, and a bug for everything else. As long as you're using recognizable mnemonics and not just ASCII line noise or weird unreadable runes (as with APL) terseness is no obstacle at all for a good programmer.
> Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada
I don’t think you really understand what you’re saying here. I have worked on an Ada compiler for the best part of a decade. It’s one of the most complex languages there is, up there with C++ and C#, and probably Rust.
Mind you, that suggests that the sentence is at least half-true even if "much more complex" is a big overstatement, since Rust, "modern" C++ and the later evolutions of C# are all relatively recent. (What would have compared to Ada in complexity back in the day? Common Lisp, Algol 68?)
As a matter of general interest, what features or elements of Ada make it particularly hard to compile, or compile well? (And are there parts which look like they might be difficult to manage but aren't?)
What do you mean by Ada's complexity?
E.g. C++ is really complex because of a lot of features which interact badly with one another.
Is this true for the Ada language/compiler? Or do you mean the whole complexity of the ideas included in Ada - complex like the proof of the Poincaré conjecture is for an unprepared person?
The hardware description languages, even if they have a single language specification, are divided into 2 distinct subsets, one used for synthesis, i.e. for hardware design, and one used for simulation, i.e. for hardware verification.
The subset required for hardware synthesis/design, cannot be unified completely with a programming language, because it needs a different semantics, though the syntax can be made somewhat similar, as with VHDL that was derived from Ada, while Verilog was derived from C. However, the subset used for simulation/verification, outside the proper hardware blocks, can be pretty much identical with a programming language.
So in principle one could have a pair of harmonized languages, one a more or less typical programming language used for verification and a dedicated hardware description language used only for synthesis.
The current state is not too far from this, because many simulators have interfaces between HDLs and some programming languages, so you can do much verification work in something like C++, instead of SystemVerilog or VHDL. For instance, using C++ for all verification tasks is possible when using Verilator to simulate the hardware blocks.
I am not aware of any simulator that would allow synthesis in VHDL coupled with writing test benches in Ada, which is a better fit for VHDL than C++ is, but it could be done.
It's an intriguing idea. Having experience with software but almost none (only hobbyist) in hardware, I imagine it'd require a strong type system and mathematical foundation. Perhaps something like Agda, a language that is a proof assistant and theorem prover, with which one can write executable programs. https://en.wikipedia.org/wiki/Agda_(programming_language)
I wonder if an escape hatch like Rust's unsafe{} would be enough... a hardware{}. The real complexity likely lies in how to integrate the synthesis tools with the compiler and debugger. The timing model. A memory model like Rust's would certainly aid in assuring predictable behavior, but I'm not certain it would be sufficient.
In the past you could easily use Ada or anything else from Linux under Cygwin.
Nowadays, you should be able to use anything from Linux under WSL.
In the past using Ada was more painful, because you had to use some old version of gcc, which could clash with the modern gcc used for C/C++/Fortran etc.
However, during the last few years these problems have disappeared. If you build any current gcc version, you must just choose the option of having ada among the available languages and all will work smoothly.
AmbientTalk did this. I used it for a demo where I dragged a mp3 player's UI button to another machine, where pressing play would play it back on the originator's speakers. Proper actor programming in the veins of E and Erlang.
I wish more people knew about the Burroughs Large Systems[0] machines. I haven't written any code for them, but I got turned-on to them by a financial Customer who ran a ClearPath Series A MCP system (and later one of the NT-based Clearpath machines with the SCAMP processor on a card) back in the late 90s, and later by a fellow contractor who did ALGOL programming for Unisys in the mid-70s and early 80s. It seems like an architecture with an uncompromising attitude toward security, and an utterly parallel universe to what the rest of the industry is (except for, perhaps, the IBM AS/400, at least in the sense of being uncompromising on design ideals).
There is none as far as affine types go, even if there is a parallel to be made with limited types, but they don’t serve the same purpose.
The way Ada generally solves the same problem is by allowing much more in terms of what you can give a stack lifetime to, return from a function, and pass by parameters to functions.
It also has the regular "smart pointer" mechanisms that C++ and Rust have, likewise with relatively crappy ergonomics.
Wonderful article and a good fit with HN’s motto of “move slowly and preserve things” as opposed to Silicon Valley’s jingoistic “move fast and break things”.
It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and instead restart? In doing so, we often make mistakes that could have been avoided if we’d taken the time or had the curiosity/humility to learn from others. This seems particularly prevalent in software: “standing on the feet of giants” is the default rather than the exception.
That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight and admiration for those involved in researching, designing and building the language. Resolved to find and read the referenced “steelman” and language design rationale papers.
> Why do we, as a species, ignore hard-won experience and instead restart?
Humanity moves from individual to society, not the reverse.
Some knowledge moves from the plural to the singular, top to bottom, but the regular existential mode is bottom-up, a point TFA makes in the context of programming languages.
Children and ideas grow from babe to adult. They do not spring full grown from the brow of Zeus other than in myth.
Thanks, that’s helpful. My wife is a teacher and talks about knowledge being recreated, not relearned: IOW it’s new to the learner even if known by the teacher. Hadn’t put those things together before.
Erm, well, the comment wasn’t AI generated, it was by me - a warts and all human. The sibling comments say TFA is AI generated and I’ll be the first to admit I didn’t spot that. Still found it interesting though.
> JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers.
What?
#1 JavaScript doesn't have formal types. What does it even mean by "representation"?
#2 You can just define a variable and not export it. You can't import a variable that isn't exported.
There are several little LLM hallucinations like this throughout the article. It's distracting and annoying.
Edit: Look, I know that complaining about downvotes is annoying, but I find this genuinely perplexing. Could someone just explain what the hell that paragraph was supposed to mean instead of downvoting me?
* The article claims only Ada has true separation of implementation vs specification (the interface), but as far as I am able to reason, also e.g. JavaScript is perfectly able to define "private" elements (not exported by an ES6 module) while being usable in the module that declares them -- if this isn't "syntactical" (and semantical) separation like what is prescribed to Ada, what is the difference(s) the article tries to point out?
* Similarly, Java is mentioned where `private` apparently (according to the article) makes the declaration "visible to inheritance, to reflection, and to the compiler itself when it checks subclass compatibility" -- all of which is false if I remember my Java correctly -- a private declaration is _not_ visible to inheritance and consequently the compiler can ignore it / fast-track in a subclass since it works much the same as it has, in the superclass, making the "compatibility" a guarantee by much the same consequence
I am still reading the article, but having discovered the above points, it detracts from my taking it as seriously as I set out to -- wanting to identify value in Ada that we "may have missed" -- a view the article very much wants to front.
I think that is the biggest factor of all.
Given the sophistication of the language and the compiler technology of the day, there was no way Ada was going to run well on 1980’s microcomputers. Intel built the i432 “mainframe on a chip” with a bunch of Ada concepts baked into the hardware for performance, and it was still as slow as a dog.
And as we now know, microcomputers later ate the world, carrying along their C and assembly legacy for the better part of two decades, until they got fast enough and compiler technology got good enough that richer languages were plausible.
The third validated compiler ran on the Western Digital “Pascal MicroEngine” running the UCSD p-system with 64K memory. The MicroEngine executed the byte code from the p-system natively, which was an interesting approach.
I think more research is warranted by you on this subject.
It might be nice to have Ada's tasks driven by DOS interrupts, though. I think GNAT did this.
"Shortly before we finished our compiler, Borland came out with Turbo Pascal, and they were considering going into the Modula-2 market as well. In fact, Borland decided to buy our Modula-2 compiler to be sold under the name of Turbo Modula-2 for CP/M alongside an IBM PC version they wanted to develop. We offered to do the IBM PC version for them, but they told us they had it already covered. Unfortunately that version took them much longer than planned. By the time it came out, three or four years later, their implementor team had split from the company, and it became known as TopSpeed Modula-2. In the absence of an IBM-PC version, Borland never put any marketing muscle behind Turbo-Modula-2, so it remained rather obscure." -- https://www.artima.com/articles/the-origins-of-scala
Ada includes a number of critical abstractions that require either dynamic runtime code (slow runtime) or the proverbial sufficiently smart compiler (slow compile-time).
These were for good reasons, like safety and the need to define concurrent systems within the language. But they were too heavyweight for the commodity hardware of the era.
Nowadays, languages like Go, C++, Java, Rust, … have no trouble with similar abstractions because optimizers have gotten really good (particularly with inlining) and the hardware has cycles to spare.
It had other warts: the string handling wasn’t great, which was a huge problem. It was slow too, in a time when that mattered more (we had C and Ada in our code base). I remember the concurrency not using the OS’s facilities, so the one place we used it was a pain. HP-UX had amazing quasi-real-time extensions, so we just ran a bunch of processes.
I was around then, and culturally there just wasn't this (legitimate) concern with safety in the more "hacker" and Unix community generally. C won headspace at the time precisely because it was minimal and close to the metal while providing the minimum of abstraction people wanted. Which was on the whole fine because the blast radius for mistakes was lower and the machines were simpler.
Yes, I think this is key. I wasn't around in 1985, but on every attempt to write something in Ada I've found myself fighting its standard library more than using it. Ada's stdlib is an intersection of common features found in previous century's operating systems, and anything OS-specific or any developments from the last 30 years seem to be conspicuously absent. That wouldn't be so much of a problem if you could just extend the stdlib with OS-specific features, but Ada's abstractions are closed instead of leaky.
I'm sure that this is less of a problem on embedded systems, unikernels or other close-to-hardware software projects where you have more control over the stdlib and runtime, but as much as I like Ada's type system and its tasking model, I would never write system applications in Ada because the standard library abstractions just get in the way.
To illustrate what I mean, look at the Ada.Interrupts standard library package [0] for interrupt handling, and how it defines an interrupt handler:
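The declaration in question (reproduced here from the Ada reference manual's Ada.Interrupts package; see the link below for the authoritative text) is an access to a parameterless protected procedure:

```ada
type Parameterless_Handler is access protected procedure;

procedure Attach_Handler
  (New_Handler : in Parameterless_Handler;
   Interrupt   : in Interrupt_Id);
```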
That's sufficient for hardware interrupts, as that's pretty much how a hardware interrupt is instrumented: you have an entry point address, and that's it. But on Linux the same package is used for signal handling, and a parameterless procedure is in no way compatible with the rich siginfo_t struct that the kernel offers. Moreover, because the handler is parameterless you need to attach a separate handler to each signal to even know which signal was raised. To add insult to injury, the gnat runtime always spawns a signal handler thread with an empty sigprocmask before entering the main subprogram, so it's not possible to use signalfd to work around this issue either.

Ada's stdlib file operations suffer from closed enumerations: the file operations Create and Open take a File_Mode argument, and that argument is defined as [1]:
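The File_Mode declaration referenced here (as in Ada.Text_IO; Ada.Streams.Stream_IO's has the same shape) is just a three-value enumeration:

```ada
type File_Mode is (In_File, Out_File, Append_File);
```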
That's it. No provisions for Posix flags like O_CLOEXEC or O_EXCL nor BSD flags like O_EXLOCK, and since enum types are closed in Ada there is no way to add those custom flags either. All modern or OS-specific features like dirfd on Linux or opportunistic locking on Windows are not easily available in Ada because of closed definitions like this.

Another example is GNAT.Sockets (not part of the Ada stdlib), which defines these address families and socket types in a closed enum:
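From memory the declarations look roughly like this (the exact member lists vary between GNAT releases, and newer ones add a few values such as Family_Unix, but the point stands: the enumerations are closed):

```ada
--  Sketch of the closed enumerations in GNAT.Sockets
--  (member lists vary slightly between GNAT versions):
type Family_Type is (Family_Inet, Family_Inet6);
type Mode_Type   is (Socket_Stream, Socket_Datagram);
```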
Want to use AF_ALG or AF_KEY for secure cryptographic operations, or perhaps SOCK_SEQPACKET or a SOL_BLUETOOTH socket? Better prepare to write your own Ada sockets library first.

[0] https://docs.adacore.com/live/wave/arm22/html/arm22/arm22-C-...
[1] https://docs.adacore.com/live/wave/arm22/html/arm22/arm22-A-...
Even in the late 90s Jamie Zawinski had a rant against C++. His argument for not using it? The compilers suck! C++ was the main "competitor" of Ada and it was a decade or more behind Ada through most of the time.
The "killer feature" of C++ against Ada (when it came to fighting against compiler maturity) was really that you could pretend to be writing C++ code but really just keep writing C-with-classes.
If Ada had put a modula or pascal compatibility mode in the language and produced a reference compiler that was based on a stable compiler in one of those languages, the history may have been different because people could have just written "PascAda" while waiting for the compilers to catch up.
In that era, the largest blocker for Ada was that it was viewed as having a lot of overhead for things that weren't generally seen as useful (safety guarantees). The reputation was that it only mattered if you were working on military stuff, etc.
Moreover, for a very long time GNAT had been quite difficult to build, configure and coexist with other gcc-based compilers, far more difficult than building and configuring the tool chain for any other programming language. (i.e. you could fail to get a working environment, without any easy way to discover what went wrong, which never happened with any other programming language supported by gcc)
I have no idea what the reason for this was, but whatever it was, it had nothing to do with any intrinsic property of the language.
I do not remember when it finally became easy to use Ada with gcc, but this might have happened only a decade ago, or even more recently.
I and many others have looked at Ada with some appreciation for decades. But the actual "community" around the language was foreign to me; government, defense contractors, etc places that frankly wouldn't even hire me.
It's got appealing constructs, and I grew up with the Wirth languages so I wasn't put off by its syntax and style... and I even sometimes considered rewriting my OSS C++ pieces in it because I was so desperate for something better. But it was just a self-limiting box.
https://en.wikipedia.org/wiki/Intel_iAPX_432
The concept of iAPX 432 had been finalized before Ada won the Department of Defense competition.
iAPX 432 was designed based on the idea that such an architecture would be more suitable for high level languages, without having at that time Ada or any other specific language in mind.
The iAPX designers thought that the most important feature that would make the processor better suited for high-level languages would be to not allow the direct addressing of memory but to control the memory accesses in such a way that would prevent any accesses outside the intended memory object.
The designers have made many other mistakes, but an important mistake was that the object-based memory-access control that they implemented was far too complex in comparison with what could be implemented efficiently in the available technology. Thus they could not implement everything in one chip and they had to split the CPU in multiple chips, which created additional challenges.
Eventually, the "32-bit" iAPX432 was much slower than the 16-bit 80286, despite the fact that 80286 had also been contaminated by the ideas of 432, so it had a much too complicated memory protection mechanism, which has never been fully used in any relevant commercial product, being replaced by the much simpler paged memory of 80386.
The failure of 432 and the partial failure of 286 (a very large part of the chip implemented features that have never been used in IBM PC/AT and compatibles) are not failures of Ada, but failures of a plan to provide complex memory access protections in hardware, instead of simpler methods based on page access rights and/or comparisons with access limits under software control.
Now there are attempts to move again some parts of the memory access control to hardware, like CHERI on ARM, but I do not like them. I prefer simpler methods, like the conditional traps of IBM POWER, which allow cheaper checking of out-of-bounds accesses without any of the disadvantages of approaches like CHERI, which needs special pointers that consume resources permanently, not only where they are needed.
Where it went wrong is in two areas: 1) as far as I know, the 286 does not correctly restart all instructions if they reference a segment that is not present, so swapping doesn't really work as well as people would like.
The big problem however was that in the PC market, 808[68] applications had access to all (at most 640 KB) memory. Compilers (including C compilers) had "far" pointers, etc. that would allow programs to use more than 64 KB of memory. There was no easy way to do this in 286 protected mode. Also because a lot of programs were essentially written for CP/M. Microsoft and IBM started working on OS/2 but progress was slow enough that soon the 386 became available.
The 386 of course had the complete 286 architecture, which was also extended to 32-bit. Even when flat memory is used through paging, segments have to be configured.
https://datamuseum.dk/wiki/Rational/R1000s400
When I was a young lad, must have been 20, I came across some programming books, including one on programming in Ada.
I read so much of it but never wrote a line of code in it, despite trying. Couldn't get the build environment to work.
But the idea of contracts in that way seemed so logical. I didn't understand the difference this article underpins though. I learned Java and thought interfaces were the same.
Great article, great language.
I also wish there were concrete code examples. Show me what you are talking about rather than just telling me how great it is. Put some side by side comparisons!
Examples would be a nice bonus but I think the author eschewed such because they weren't interested in writing a tutorial. They had a very specific point to make and stuck to it, resulting in a very informative but concise article that reads well because of its highly disciplined authorship.
So there was considerable borrowing from PASCAL, CLU, MODULA(-2), CSP. It's possible that the elaborate system for specifying machine representations of numbers was truly novel, but I'm not sure how much of a success that was.
There are features common to Ada and Modula, but those have been taken by both languages from Xerox Mesa.
The first version of Modula was designed with the explicit goal of making a simple small language that provided a part of the features of Xerox Mesa (including modules), after Wirth had spent a sabbatical year at Xerox.
Nowadays Modula and its descendants are better known than Mesa, because Wirth and others have written some good books about it and because Modula-2 was briefly widely available for some microcomputers. Many decades ago, I had a pair of UVPROM memories (i.e. for a 16-bit data bus) that contained a Modula-2 compiler for Motorola MC68000 CPUs, so I could use a computer with such a CPU for programming in Modula-2 in the same manner as many early PCs could be used with their built-in BASIC interpreter. However, after switching to an IBM PC/AT compatible PC, I have not used the language again.
However, Xerox Mesa was a much superior language and its importance in the history of programming languages is much greater than that of Modula and its derivatives.
Ada has taken a few features from Pascal, but while those features were first implemented in Pascal, they had been proposed much earlier by others, e.g. the enumerated types of Pascal and Ada had been first proposed by Hoare in 1965.
When CLU is mentioned, usually Alphard must also be mentioned, as those were 2 quasi-simultaneous projects at different universities that had the purpose of developing programming languages with abstract data types. Many features have appeared first in one of those languages and then they have been introduced in the other after a short delay. Among the features of modern programming languages that come from CLU and Alphard are for-each loops and iterators.
It was by far the best language that I used in my entire working career, where I had to endure such languages as PL/1 (and PL/S), C, C++, Java, JavaScript and PHP. While Java as a language was not too bad, it still paled in features and usability compared to MESA, and it too was influenced by MESA.
But as was true at Xerox, it was the complete network that was revolutionary at the time in the early 80's. The fact that I could source-debug any machine remotely on the corporate worldwide network of over 5000 machines, and that the source code would be automatically downloaded to my machine (meaning I could easily debug from any nearby random machine), was just something I could never "easily" do elsewhere.
MESA was missing a few things (which CEDAR solved and which were used only within Xerox PARC, partially because at the time it really only ran on Dorado-class machines), such as garbage collection, and in the case of Star it would have been much better if the language had supported OOP. For Star we had a system called Traits to support objects, but it had some serious issues IMHO (which would be fodder for a separate post).
When talking about Mesa you also need to talk about Tajo, its development environment built on top of the OS Pilot (Star also used Pilot). Both systems supported a mouse and a large bitmapped monitor and had overlapping windows (although most of Star had automatic non-overlapping windows; that was a UI usability decision).
There is also more, because the network was very important. Print servers, file servers, mail servers, cloudless store for all of Star's user files/desktop. All this in the very early 80's was unheard of elsewhere. It's very similar to what Steve Jobs missed when he saw Smalltalk, where he only really saw a new UI and missed much more that was demoed.
It was a magic place to work at the time. I left in the very late 80s for Apple, and it was a huge step backwards at the time (they did amazing stuff with their limited tools, but it made working not fun).
https://bitsavers.org/pdf/xerox/mesa/
Some information may also exist in other subdirectories of "pdf/xerox".
There have been many references to Mesa in the research articles and the books published towards the end of the seventies and during the eighties, but those are hard to find today, as most of them may have not been digitized. Even if they were digitized, it is hard to search through them to find the relevant documents, because you would not know from the title whether Mesa is also discussed along with other programming languages.
In general, bitsavers.org is probably the most useful Internet resource about old computing hardware and software, because no secondary literature matches the original manuals of the computer vendors, which in the distant past had an excellent quality, unlike today.
Ada provides many features of Mesa, but not all of them and I regret that some Mesa features are missing from the languages that are popular today.
The loops with double exits of Python (i.e. with "else") have been inspired by Mesa, but they provide only a small subset of the features available in Mesa loops.
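For readers who haven't met it, Python's double-exit loop looks like this (a minimal sketch with a hypothetical function; the else branch runs only when the loop finishes without hitting break):

```python
def classify(items, target):
    for x in items:
        if x == target:
            print("found", x)
            break
    else:
        # reached only when the loop exhausted items without a break
        print("no match")

classify([1, 2, 3], 2)   # prints: found 2
classify([1, 2, 3], 9)   # prints: no match
```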
"JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers."
Then:
"in Ada, the implementation of a private type is not merely inaccessible, it is syntactically absent from the client's view of the world."
Am I missing something -- a JavaScript module is perfectly able to declare a private element by simply not exporting it, accomplishing what the author prescribes to Ada as "is not merely inaccessible, it is syntactically absent from the client's view of the world"? Same would go for some of the other language author somewhat carelessly lumps together with JavaScript.
I loved the article, and I have always had curiosity about Ada -- beyond some of the more modern languages in fact -- but I just don't see where Ada separates interface from implementation in a manner that's distinctly better or different from e.g. JavaScript modules.
In other words, if module X returns a value x of unexported type T to module Y, code in module Y is free to do x.foo = 42.
To preempt the obvious: yes, I know _everything_ (nearly) in JavaScript is an object, but a module exporting a `Function` can expect the caller to use the function, not enumerate it for methods. And the function can use a declaration in the module that wasn't exported, with the caller none the wiser about it.
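To make the objection concrete, here is a sketch (hypothetical names, with the two modules collapsed into one file for brevity): the class T is never exported, yet the representation of its values is an ordinary mutable, enumerable object for any importer:

```javascript
// "Module X": class T is deliberately not exported; only make() would be.
class T {
  constructor() { this.foo = 1; }
}
function make() { return new T(); }

// "Module Y" (the importer): the *name* T is hidden, but the
// representation of x is fully visible and writable.
const x = make();
x.foo = 42;                   // nothing prevents this
console.log(Object.keys(x));  // prints: [ 'foo' ]
console.log(x.foo);           // prints: 42
```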
I worked with JOVIAL as part of my first project as a programmer in 1981, even though we didn't even have a full JOVIAL compiler there yet (it existed elsewhere). I remember all the talk about the future being ADA but it was only an incomplete specification at the time.
JOVIAL had been derived from IAL (December 1958), the predecessor of ALGOL 60. However JOVIAL was defined before the final version of ALGOL 60 (May 1960), so it did not incorporate a part of the changes that had occurred between IAL and ALGOL 60.
The timeline of Ada development has been marked by increasingly specific documents elaborated by anonymous employees of the Department of Defense, containing requirements that had to be satisfied by the competing programming language designs:
1975-04: the STRAWMAN requirements
1975-08: the WOODENMAN requirements
1976-01: the TINMAN requirements
1977-01: the IRONMAN requirements
1977-07: the IRONMAN requirements (revised)
1978-06: the STEELMAN requirements
1979-06: "Preliminary Ada Reference Manual" (after winning the competition)
Already the STRAWMAN requirements from 1975 contained some features taken from JOVIAL, which the US Air Force used and liked, so they wanted the replacement language to continue to have them.
However, starting with the IRONMAN requirements, some features originally taken as such from JOVIAL have been replaced by greatly improved original features, e.g. the function parameters specified as in JOVIAL have been replaced by the requirement to specify the behavior of the parameters regardless of their implementation by the compiler, i.e. the programmer specifies behaviors like "in", "out" and "in/out" and the compiler chooses freely how to pass the parameters, e.g. by value or by reference, depending on which method is more efficient.
This is a huge improvement over how parameters are specified in languages like C or C++ and in all their descendants. The most important defects of C++, which have caused low performance for several decades and which are responsible for much of the current complexity of C++ have as their cause the inability of C++ to distinguish between "out" parameters and "in/out" parameters. This misfeature is the reason for the existence of a lot of unnecessary things in C++, like constructors as something different from normal functions, and which cannot signal errors otherwise than by exceptions, of copy constructors different from assignment, of the "move" semantics introduced in C++ 2011 to solve the performance problems that plagued C++ previously, etc.
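A minimal sketch of the parameter modes described above (a hypothetical procedure; the programmer states only the behavior, and the compiler chooses whether each parameter travels by value or by reference):

```ada
procedure Update
  (Counter : in out Integer;   --  read and written by the procedure
   Step    : in     Integer;   --  read-only inside the procedure
   Ok      :    out Boolean)   --  written only; initial value irrelevant
is
begin
   Counter := Counter + Step;
   Ok      := True;
end Update;
```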
https://xcancel.com/Iqiipi_Essays
There is no named public author. A truly amazing productivity for such a short time period and generously the author does not take any credit.
Sadly the review meetings focused on document structure instead of thoughtful software design and analysis. ADA didn't help; it was cumbersome to get working well, and ADA experience in the contracting agencies was low. The waterfall approach made the projects slow to implement.
https://en.wikipedia.org/wiki/DOD-STD-2167A?wprov=sfti1
"These are not positions. They are proposals — structures through which a subject might be examined rather than verdicts about it."
The entire site is AI written.
Obviously this article was highly pleasing to the hn audience as it’s currently sitting at #1. It’s still garbage, because it doesn’t have any interesting ideas behind it. Certainly not commensurate with its length.
I really enjoyed the essay, only checked afterwards when I started reading comments.
I hate that I'm starting to develop a media literacy immune system for blog posts of all things.
I've written almost 50 blog posts in the last 3 years. All in draft, never published, mostly because of a crippling imposter syndrome and fear of criticism. But every now and then I wake up full of confidence and think "this is it. today I'll click publish I don't give a fuck. All in". Never happens. Maybe this author was in the same boat until a month ago. I know there's a high chance that's just a bot, but I can understand, if it's not, how devastating it has to be to overcome the fear of showing your thoughts to the world and then being labeled a bot. If it's not already obvious, English is not my first language and I've used LLMs to check my grammar and improve the style. Maybe all my posts smell like ChatGPT now and this just adds to the fear of being dismissed as slop.
The main problem with this article is that it appears to have been basically written out of whole cloth by the LLM, there’s no novel insight here about Ada beyond what you could fit in a short prompt + the Wikipedia article.
Most longform readers will assume an author has deep expertise and spent a lot of time organizing their thoughts, which lends their ideas some legitimacy and trust. For a small blog, an 8,000 word essay is a passion project.
But if AI is detected in the phrasing and not disclosed, it begs a lot of questions. Did AI write the whole thing, or just light edits? Are the facts AI generated, too, and not from personal experience? What motivated someone to produce this content if they were going to automate parts of its creation; why would they value the output more than the process?
You define the interface, types, pre/post conditions you want in .ads file, then let the agent loose writing the .adb body file. The language’s focus on readability means your agent has no problem reading and cross referencing specs. The compiler and proof tools verify the body implements the spec.
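A minimal sketch of that split (a hypothetical package): the .ads spec is the contract the agent must satisfy, and the .adb body is the part it is free to rewrite:

```ada
--  counter.ads: the specification; this is the whole contract a client sees
package Counter is
   procedure Increment;
   function Value return Natural;
end Counter;

--  counter.adb: the body; the compiler rejects it unless it matches the spec
package body Counter is
   Count : Natural := 0;

   procedure Increment is
   begin
      Count := Count + 1;
   end Increment;

   function Value return Natural is (Count);
end Counter;
```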
IMO, this was the telling paragraph.
Ada is "verbose" in that it has fairly rigorous type specification. It was verbose in comparison to languages that had weak or primitive typing. A lot of the "bureaucracy" in the language is being very specific about types to catch bugs.
Ada 83 did have a problem in that it lacked [interfaces]. This could sometimes limit code reuse.
Ada was designed to be "readable" but so was Pascal and many other languages (and, more recently for instance, Python). "Readability" in those days mainly meant preferring keywords over operators and allowing for infix notation with proper order-of-operations.
https://www.adaic.org/resources/add_content/standards/05rat/...
The C people tried to blame the crash on Ada's use of exceptions. At the time, exceptions were controversial. The actual crash came after an exception was fired and the C folks insisted that C would have just ignored the error state and carried on. Except that the exception was actually the software manifestation of a hardware signal that would have crashed a C program as well.
Ada had a lot of haters, mainly because it was imposed top-down in a lot of organizations. But also because there was a lot of money behind C and other technologies. C++ was vaporware at the time and was able to promise to be a better version of everything Ada was (just you wait!).
Personally Ada was the coolest language I've ever learned and I still love playing with it.
While true, that doesn't mean that other language's sum types originated in Ada. As [1] states,
> NPL and Hope are notable for being the first languages with call-by-pattern evaluation and algebraic data types
and a modern language like Haskell has origins in Hope (from 1980) through Miranda.
[1] https://en.wikipedia.org/wiki/Hope_(programming_language)
John McCarthy, the creator of LISP, had also many major contributions to ALGOL 60 and to its successors (e.g. he introduced recursive functions in ALGOL 60, which was a major difference between ALGOL 60 and most existing languages at that time, requiring the use of a stack for the local variables, while most previous languages used only statically-allocated variables).
The "union" of McCarthy and of the languages derived from his proposal is not the "union" of the C language, which took the McCarthy keyword but gave it the behavior of FORTRAN's "EQUIVALENCE".
The concept of "union" as proposed by McCarthy was first implemented in the language ALGOL 68, then, as you mention, some functional languages, like Hope and Miranda, have used it extensively, with different syntactic variations.
If Rust didn't have (C-style) unions then its enum should be named union instead. But it does, so they needed a different name. As we work our way through the rough edges of Rust maybe this will stick up more and annoy me, but given Rust 1.95 just finally stabilized core::range::RangeInclusive, the fix for the wonky wheel that is core::ops::RangeInclusive we're not going to get there any time soon.
Yes, it's verbose. I like verbosity; it forces clarity. Once you adjust, the code becomes easier to read, not harder. You spend less time guessing intent and more time verifying it. Or verify it, ignore what you verified, then go back and remind yourself you're an idiot when you realize the code you ignored was right. That might just be me.
In small, purpose-built applications, it's been pleasant to code with. The type system is strict but doesn't yell at you a lot. The language encourages you to be explicit about what the program is actually doing, especially when you're working close to the hardware, which is a nice feature.
It has quirks, like anything else. But most of them feel like the cost of writing better, safer code.
Ada doesn't try to be clever. It tries to be clear, even if it is as clear as mud.
In the beginning Ada was criticized mainly for 2 reasons: it was claimed that it was too complex, and that it was too verbose.
Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada, in many cases because they have started as simpler languages to which extra features have been added later, and because the need for such features had not been anticipated during the initial language design, adding them later was difficult, increasing the complexity of the updated language.
The criticism about verbosity is correct, but it could easily be solved by preserving the abstract Ada syntax and just replacing many tokens with less verbose symbols. This can easily be done with a source preprocessor, but this is avoided in most places, because then the source programs have a non-standard appearance.
It would have been good if the Ada standard had been updated to specify a standardized abbreviated syntax besides the classic syntax. This would not have been unusual, because several old languages have specified abbreviated and non-abbreviated syntactic alternatives, including languages like IBM PL/I or ALGOL 68. Even the language C had a more verbose syntactic alternative (with trigraphs), which has almost never been used, but nonetheless all C compilers had to support both the standard syntax and its trigraph alternative.
However, the real defect of Ada has been neither complexity nor verbosity, but expensive compilers and software tools, which have ensured its replacement by the free C/C++.
The so-called complexity of Ada has always been mitigated by the fact that besides its reference specification document, Ada always had a design rationale document accompanying the language specification. The rationale explained the reasons for the choices made when designing the language.
Such a rationale document would have been extremely useful for many other programming languages, which frequently include some obscure features whose purpose is not obvious, or which look like mistakes, even if sometimes there are good reasons for their existence.
When Ada was introduced, it was marketed as a language similar to Pascal. The reason is that at that time Pascal had become the language most frequently used for teaching programming in universities.
Fortunately the resemblances between Ada and Pascal are only superficial. In reality the Ada syntax and semantics are much more similar to earlier languages like ALGOL 68 and Xerox Mesa, which were languages far superior to Pascal.
The parent article mentions that Ada includes in the language specification the handling of concurrent tasks, instead of delegating such things to a system library (task = term used by IBM since 1964 for what now is normally called "thread", a term first used in 1966 in some Multics documents and popularized much later by the Mach operating system).
However, I do not believe that this is a valuable feature of Ada. You can indeed build any concurrent applications around the Ada mechanism of task "rendez-vous", but I think that this concept is a little too high-level.
It incorporates 2 lower level actions, and for the highest efficiency in implementations sometimes it may be necessary to have access to the lowest level actions. This means that sometimes using a system library for implementing the communication between concurrent threads may provide higher performance than the built-in Ada concurrency primitives.
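For readers unfamiliar with the mechanism, a minimal sketch of a rendezvous (a hypothetical one-slot buffer task, shown out of its enclosing declarative region; each accept block is a synchronous meeting between the calling task and this one):

```ada
task Buffer is
   entry Put (X : in  Integer);
   entry Get (X : out Integer);
end Buffer;

task body Buffer is
   V : Integer;
begin
   loop
      accept Put (X : in Integer) do   --  caller blocks until accepted
         V := X;
      end Put;
      accept Get (X : out Integer) do  --  and vice versa
         X := V;
      end Get;
   end loop;
end Buffer;
```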
A reconciliation between these 2 camps appears impossible. Therefore I think that the ideal programming language should admit 2 equivalent representations, to satisfy both kinds of people.
The pro-verbose camp argues that they cannot remember many different symbols, so they prefer long texts using keywords resembling a natural language.
The anti-verbose camp, to which I belong, argues that they can remember mathematical symbols and other such symbols, and that for them it is much more important to see on a screen an amount of program as big as possible, to avoid the need of moving back and forth through the source text.
Both camps claim that what they support is the way to make the easiest to read source programs, and this must indeed be true for themselves.
So it seems that it is impossible to choose rules that can ensure the best readability for all program readers or maintainers.
My opinion is that source programs must not be stored and edited as text, but as abstract syntax trees. The program source editors and viewers should implement multiple kinds of views for the same source program, according to the taste of the user.
To be clear Ada specifically talks about all this in the Ada reference manual in the Introduction. It was specifically designed for readers as opposed to writers for very good reasons and it explains why. It's exactly one of the features other languages will eventually learn they need and will independently "discover" some number of years in the future.
This has never been solved, but it could have been solved if there had been a standard about the use of symbols in programming languages and all languages had followed it.
Nevertheless, for some symbols this problem does not arise, e.g. when traditional mathematical symbols are used, which are now available in Unicode.
Many such symbols have been used for centuries and I hate their replacements that had to be chosen due to the constraints of the ASCII character set.
Some of the APL symbols are straightforward extensions of the traditional mathematical notation, so their use also makes sense.
Besides the use of mathematical symbols in expressions, classic or Iverson, the part where I most intensely want symbols, not keywords, is for the various kind of statement brackets.
I consider the use of a single kind of statement brackets as being very wrong for program readability. This was introduced in ALGOL 58 (December 1958) as the pair "begin" and "end". Other languages have followed ALGOL 60. CPL replaced the statement brackets with paragraph symbols (August 1963), and then the language B (the predecessor of C), having transitioned to ASCII, replaced the CPL symbols with curly braces, sometime around 1970.
A better syntax was introduced by ALGOL 68, which is frequently referred to as "fully bracketed syntax".
In such a syntax different kinds of brackets are used for distinct kinds of program structures, e.g. for blocks, for loops and for conditional structures. This kind of syntax can avoid any ambiguities and it also leads to a total number of separators, parentheses, brackets and braces that is lower than in C and similar languages, despite being "fully bracketed". (For instance in C you must write "while (condition) {statements;}" with 6 syntactic tokens, while in a fully bracketed language you would write "while condition do statements done", with only 3 syntactic tokens)
If you use a fully bracketed syntax, the number of syntactic tokens is actually the smallest that ensures a non-ambiguous grammar, but if the tokens are keywords the language can still appear as too verbose.
The verbosity can be reduced a lot if you use different kinds of brackets provided by Unicode, instead of using bracket pairs like "if"/"end if", "loop"/"end loop" or the like.
For instance, one can use curly braces for blocks, angle brackets for conditional expressions or statements, double angle brackets for switch/case, bag delimiters for loops, and so on. One could choose to use different kinds of brackets for inner blocks and for function bodies, and also different kinds of brackets for type definitions.
In my opinion, the use of many different kinds of brackets is the main feature that can reduce program verbosity in comparison with something like Ada.
Moreover, the use of many kinds of brackets is pretty much self describing, like also in HTML or XML. When you see the opening bracket, you can usually recognize what kind of pattern starts, e.g. that it is a function body, a loop, a block, a conditional structure etc., and you also know how the corresponding closing bracket will look. Thus, when you see a closing bracket of the correct shape you can know what it ends, even when you had not known previously the assignment between different kinds of brackets and different kinds of program structures.
In languages like C, it is frequently annoying when you see many closing braces and you do not know what they terminate. Your editor will find the matching brace, but that wastes precious time. You can comment the closing braces, but that becomes much more verbose than even Ada.
So for me the better solution is to use graphically-distinct brackets. Unicode provides many suitable bracket pairs. There are programming fonts, like JetBrains Mono, which provide many Unicode mathematical symbols and bracket pairs.
When I program for myself, I use such symbols and I use a text preprocessor before passing the program to a compiler.
This was proposed, as a joke, some years ago: https://www.adacore.com/blog/a-modern-syntax-for-ada
This is an old but good article on the topic: https://www.embedded.com/expressive-vs-permissive-languages-... Note that SPARK has changed significantly since this was written.
Nevertheless, it is possible to define a complete one-to-one mapping of all Ada syntactic tokens to a different set of tokens.
The resulting language will have exactly the same abstract syntax as Ada, so it is the same language, only with a different appearance.
For a seasoned Ada programmer, changing the appearance of the language may be repugnant, but for a newbie there may be no difference between two alternative token sets. This is especially true for programmers who are not native English speakers: they feel no particular loyalty to words like "begin" and "loop", so they may see no advantage in using them instead of brackets that would replace them.
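As a toy illustration of what such a one-to-one token mapping could look like, here is a minimal preprocessor sketch. The bracket assignments and the `toAda` name are invented for this example, not taken from any real tool:

```javascript
// Toy sketch of a one-to-one token substitution: bracket-style source
// is rewritten into plain Ada keywords before compilation. The bracket
// assignments below are invented for illustration only.
const tokenMap = new Map([
  ['⟦', 'begin'], ['⟧', 'end;'],
  ['«', 'loop'],  ['»', 'end loop;'],
]);

function toAda(src) {
  // Each special bracket maps to exactly one Ada token, so the
  // transformation preserves the abstract syntax and is reversible.
  return src.replace(/[⟦⟧«»]/gu, ch => tokenMap.get(ch));
}

console.log(toAda('⟦ « Count := Count + 1; » ⟧'));
// begin loop Count := Count + 1; end loop; end;
```

A real preprocessor would need a tokenizer rather than character replacement (so that brackets inside string literals and comments are left alone), but the mapping itself stays one-to-one.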
I don’t think you really understand what you’re saying here. I have worked on an Ada compiler for the best part of a decade. It’s one of the most complex languages there is, up there with C++ and C#, and probably Rust.
As a matter of general interest, what features or elements of Ada make it particularly hard to compile, or compile well? (And are there parts which look like they might be difficult to manage but aren't?)
https://github.com/JeremyGrosser/softdev/tree/master/src
The subset required for hardware synthesis/design cannot be unified completely with a programming language, because it needs different semantics, though the syntax can be made somewhat similar: VHDL was derived from Ada, while Verilog was derived from C. However, the subset used for simulation/verification, outside the proper hardware blocks, can be pretty much identical to a programming language.
So in principle one could have a pair of harmonized languages, one a more or less typical programming language used for verification and a dedicated hardware description language used only for synthesis.
The current state is not too far from this, because many simulators have interfaces between HDLs and some programming languages, so you can do much verification work in something like C++, instead of SystemVerilog or VHDL. For instance, using C++ for all verification tasks is possible when using Verilator to simulate the hardware blocks.
I am not aware of any simulator that allows synthesis in VHDL coupled with test benches written in Ada, which would be a better fit for VHDL than C++ is, but it could be done.
And every time I fail.
https://ada-lang.io/
https://alire.ada.dev/
Nowadays, you should be able to use anything from Linux under WSL.
In the past, using Ada was more painful, because you had to use some old version of gcc, which could clash with the modern gcc used for C/C++/Fortran etc.
However, during the last few years these problems have disappeared. If you build any current gcc version, you just select Ada among the enabled languages and everything works smoothly.
IDE: https://github.com/AdaCore/gnatstudio
https://soft.vub.ac.be/amop/
CTM: https://en.wikipedia.org/wiki/Concepts,_Techniques,_and_Mode...
[0] https://en.wikipedia.org/wiki/Burroughs_Large_Systems
1: SPARK is a formally verifiable subset of Ada: https://en.wikipedia.org/wiki/SPARK_(programming_language)
2: https://arxiv.org/pdf/1805.05576
https://learn.adacore.com/courses/advanced-ada/parts/resourc...
The way Ada generally solves the same problem is by allowing much more in terms of what you can give a stack lifetime to, return from a function, and pass as parameters to functions.
It also has the usual "smart pointer" mechanisms that C++ and Rust have, likewise with relatively crappy ergonomics.
It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and start over, often making mistakes that could have been avoided if we’d taken the time, or had the curiosity and humility, to learn from others? This seems particularly prevalent in software, where failing to stand on the shoulders of giants is the default rather than the exception.
That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight and admiration for those involved in researching, designing and building the language. Resolved to find and read the referenced “steelman” and language design rationale papers.
Humanity moves from individual to society, not the reverse.
Some knowledge moves from the plural to the singular, top to bottom, but the regular existential mode is bottom-up, a point TFA makes in the context of programming languages.
Children and ideas grow from babe to adult. They do not spring full grown from the brow of Zeus other than in myth.
What?
#1 JavaScript doesn't have formal types. What is "representation" even supposed to mean there?
#2 You can just define a variable and not export it. You can't import a variable that isn't exported.
There are several little LLM hallucinations like this throughout the article. It's distracting and annoying.
Edit: Look, I know that complaining about downvotes is annoying, but I find this genuinely perplexing. Could someone just explain what the hell that paragraph was supposed to mean instead of downvoting me?
1. I would never work on missile tech or other "kills people" tech.
2. I would never work on (civilian) aircraft tech either, as I would probably burn out from the stress of messing something up and causing an airplane crash.
That said, I'm sure it's also used for things that don't kill people, or that don't carry that level of stress.