dBase: 1979-2026

(delphinightmares.substack.com)

108 points | by deeaceofbase 3 days ago

28 comments

  • susam 13 hours ago
    I remember dBASE IV from my childhood days when my father, who had no computer background, was required to take computer training by his workplace. My father and his colleagues were given free evening computer lessons by their company, taught by the same teachers who used to teach us, the kids, computers in our school.

    After their first class, he brought home a fat dBASE IV manual. Since I was very interested in computer books, I read a good portion of it even though I had never touched dBASE in my life. I would daydream of all the little forms, queries, reports and labels I could make with dBASE. But I never got to touch dBASE in my life. We kids used to get LOGO lessons instead in school.

    One day my father came back from his evening lesson mildly distressed about something he had learnt. He said they were being taught loops but in the loop there was an equation that seemed just plain wrong. It was:

      i = i + 1
    
    How could that be a valid equation? How could i ever equal i + 1? He mentioned that he had asked the teacher about it and from what I could gather, my teacher and my father were talking past each other. The teacher probably tried explaining that it was not an equation but an instruction instead, whereas my father continued to interpret i = i + 1 as an equation due to the algebra he was so familiar with. It sort of held up the class for a while.

    The teacher asked my father's name, perhaps so that he could talk to him separately later. But when he learnt my father's name, he realised that his son, me, went to the same school where he taught. So he told my father, 'When you get back home, ask your son about i = i + 1. He will explain it to you better than I am able to.'

    And indeed I was able to explain it to him pretty well. I was eight or nine years old back then. And that was probably the first thing I taught my father!

    • hh2222 7 hours ago
      Perhaps this is why BASIC originally used LET in the form LET X = X + 1, to imply a calculation rather than equality.
      • ASalazarMX 15 minutes ago
        As kids we had the same debate because we were taught algebra before BASIC, and naturally tried to interpret it mathematically. Fortunately, sometimes a kid can explain it better to other kids than an adult, and that was the case.
      • weinzierl 4 hours ago
        The = for assignment is FORTRAN’s fault. In the beginning there was no equality, just assignment, and FORTRAN (being just a FORmula TRANslator after all) made the somewhat dubious decision to use = for that (punch card space being scarce and symbols limited and all).

        When FORTRAN gained equality it went for .EQ. out of practicality and necessity. Many others followed suit but used the somewhat more pleasant == instead of .EQ..

        But it didn’t have to happen that way. ALGOL decided to stick close to mathematical tradition:

        = for equality

        := for assignment ("definition")

        While x := x + 1 is still not clean mathematical notation, I think it wouldn’t have riled up OP’s father as much. If he’d squinted enough, he might even have been able to see little indices below the x’s there.
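
        To make those indices concrete, a quick sketch (in Python rather than ALGOL, and my own illustration, not anything from the thread): the conventional mutating loop and an explicitly indexed sequence compute the same thing.

```python
# The indexed reading: build x_0, x_1, x_2, ... where x_{n+1} = x_n + 1.
# No value is ever "changed", so = stays close to mathematical equality.
xs = [0]                      # x_0
for _ in range(3):
    xs.append(xs[-1] + 1)     # x_{n+1} = x_n + 1

# The conventional mutating loop reaches the same final value:
x = 0
for _ in range(3):
    x = x + 1                 # an instruction (rebinding), not an equation

print(xs, x)                  # prints [0, 1, 2, 3] 3
```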

        • teddyh 3 hours ago
          A rumor I heard was that

            x := 4
          
          was chosen because it looks similar to

            x ⇐ 4
          • badsectoracula 3 hours ago
            Why not <= then? I'd expect both < and = to be available.
            • tandr 1 hour ago
              Probably because '<=' is very easily read as "less or equal"? (unless you are joking, of course)
      • cryo32 7 hours ago
        Think that was just to make the parser understand it was an assignment without having to do any lookahead. There was also potentially ambiguous stuff with equality and assignment, as they both used =.
    • hcs 13 hours ago
      Some of my earliest programming exposure was a dBASE IV book my dad had for work, though it was some time before I put any of it into action. At that time I was reading manuals like fiction, only slowly realizing that I could actually use some of it with our computer.
    • antonvs 11 hours ago
      It’s a pity your father’s perspective didn’t prevail. We’d all be using better programming languages now.
      • pjc50 9 hours ago
        It's a notational issue. IIRC Pascal used := for assignment and = for equality testing.

        Where this becomes extremely Rorschach is the spectrum between "notation is absolutely critical: there is only one correct representation of programs in people's heads and we have to match that exactly" vs. "all program text is ultimately syntactic sugar and programmers will just adapt to whatever". History tells us that the C choice of = for assignment and == for equality testing won, but of course that's not a choice in a vacuum and it's tied up with a thousand other choices.

        • gpderetta 8 hours ago
          I think parent was alluding to mutability.
          • kccqzy 1 hour ago
            Yes! In lazy but immutable languages like Haskell, it is totally fine to refer to the value itself during definition. This is really the same idea that a recursive function can refer to itself during definition. It’s common to define a variable for the infinite list of prime numbers, where the definition requires the list of prime numbers itself.

                primes = 2 : sieve primes [3..]
                sieve (p:ps) xs = let (h, t) = span (< p*p) xs in h ++ sieve ps (filter (\n -> rem n p > 0) t)
            
            Here `primes` is a variable that refers to itself in its definition (called corecursion), and `sieve` is a recursive function.
          • pjc50 7 hours ago
            While I'm a big fan of immutable design, it makes some algorithms much more expensive and ultimately DRAM is mutable. And the example we're talking about could be a loop counter!
    • piokoch 4 hours ago
      This says a lot about your dad - he must be an amazing person to be ready to learn something from his kid. Not everyone would be able to admit that a 9 y.o. can teach him something :)
  • vintagedave 10 hours ago
    I find this blog consistently negative (check other posts). Although this post is interesting, and I know little about dBase, and this is a sad story, I am simply not sure how accurate the blog as a whole is. My best suggestion: take its statements as someone's personal opinion, not necessarily as fact - i.e. with a pinch of salt.

    > It is believed that - alongside the BOLD source code (missing for more than 10+ years), the BDE and many original dBase source code was lost during the ill fated Borland + Corel merger (which was eventually called off).

    This is confusing because the article is supposedly about dBase, and I have no idea why Bold is relevant. It's an example of where I feel the general negativity of the blog veers into random discussions.

    To the best of my knowledge, the Bold source was not lost. In fact, Embarcadero open sourced it several years ago. This blog post has details: https://blogs.embarcadero.com/bold-for-delphi-is-open-source... I worked there at the time, though I did not drive its open sourcing; it was a positive move, and it clearly contradicts the blog's statement. The project appears actively maintained and updated these days. I would differentiate 'lost' from 'owned but not made available publicly'.

    • browningstreet 3 hours ago
      I struggled with the post too, there's something off about the writing.
    • projektfu 4 hours ago
      My reading is that the BOLD source was lost and found but that BDE and some parts of dBase were lost completely, evidenced by them including an unmodified version year after year. Perhaps the author thought people were more aware of the BOLD source being lost.
    • jjkaczor 5 hours ago
      ... Well - with a blog title of "Delphi Nightmares", I would expect uh critical or negative commentary... It's right there in the name.
  • coldcity_again 13 minutes ago
    dBase was one of my first exposures to databases. As a largely penniless computer-obsessed kid in the late 80s/early 90s I was big into sending off for any freebies offered in the ads of the pages of UK BYTE.

    Somehow I scored a (stripped down?) copy on multiple floppies that I couldn't even use, as an Amiga owner - but incredibly, this freebie came with a fantastic paper manual which I devoured.

  • c_prompt 4 hours ago
    Don't forget DataEase [1]. That's what I eventually moved to from dBase (although, IIRC, it was through back-and-forth evaluations of FoxPro, Clipper, and Paradox). DataEase was considered a "Fourth-Generation Language" (4GL) and it was wonderful to work with. As a teenage "systems analyst" working for a division of GE (my first paid tech job), I built a file room management system for their large file rooms (remember those?). Having to thoroughly test security, I put it through its paces and found a way to hack into any application built on DataEase. After I explained the procedure to DataEase's development team (which included one developer traveling to my fraternity house for a face-to-face meeting; so funny trying to be business-like in a place with sticky floors that smelled of stale beer), they fixed the hole. There weren't any bug bounties in those days but, as a reward, they gave me lifetime upgrades and let me attend all their training seminars for free. It was my 4GL experience that ultimately led to learning Cognos.

    Funny aside: I remember the first time my GE boss asked me for an invoice as it was the only way he could pay me. I had no idea what it should look like. So he sent me to the PM of one of the COBOL contractor teams who gave me a template that I copied. The PM eventually asked me to do some COBOL programming for them as well. Good times.

    [1] https://www.dataease.com/

  • michibertel 8 hours ago
    I still maintain a VFP9 project from time to time. Although AI has been extremely helpful in writing VFP9 code, I can't imagine migrating this enormous project, which has grown over the course of 30 years, to a more modern system by feeding the source code to AI.

    While one could debate which approach would be best for migrating such a project, an 'AI-led Big Bang Migration' would be insane.

    However, AI would certainly be helpful for migration.

    • jamal-kumar 6 hours ago
      If you're ever looking to migrate off of that a better starting point is finding a way to dump the DBF files into CSV (there's a perl script for this that works wonders called DBF2CSV but I hear LibreOffice can just open these files too)...

      After inheriting a project where the source code CDROM went missing I can definitely see a use case for at least trying with the latest frontier models to rescue the logic because it took me a while to reverse that thing manually to fix a bug with radare2
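
      For anyone curious, the dBase III layout is simple enough to dump by hand. A minimal Python sketch (it builds a tiny sample table in memory rather than assuming a real .dbf on disk; a proper converter like the DBF2CSV script above still handles far more: memo files, dates, codepages):

```python
import csv
import io
import struct

def read_dbf(data: bytes):
    """Parse a dBase III .dbf byte string into (field_names, rows)."""
    # Header: byte 0 version, bytes 1-3 date, 4-7 record count,
    # 8-9 header length, 10-11 record length (all little endian).
    nrec, hdr_len, rec_len = struct.unpack_from("<IHH", data, 4)
    fields, off = [], 32
    while data[off] != 0x0D:                    # 0x0D ends the field descriptors
        name = data[off:off + 11].split(b"\x00")[0].decode("ascii")
        fields.append((name, data[off + 16]))   # descriptor byte 16 = field length
        off += 32
    rows, pos = [], hdr_len
    for _ in range(nrec):
        rec, pos = data[pos:pos + rec_len], pos + rec_len
        if rec[:1] == b"*":                     # '*' flags a soft-deleted record
            continue
        row, fpos = [], 1                       # byte 0 is the deletion flag
        for _, flen in fields:
            row.append(rec[fpos:fpos + flen].decode("ascii").strip())
            fpos += flen
        rows.append(row)
    return [n for n, _ in fields], rows

def field_descriptor(name: bytes, ftype: bytes, length: int) -> bytes:
    """One 32-byte dBase III field descriptor."""
    return (name.ljust(11, b"\x00") + ftype + b"\x00" * 4
            + bytes([length, 0]) + b"\x00" * 14)

# A two-column sample table: NAME C(10), QTY N(4); one live and one deleted row.
sample = (
    struct.pack("<B3BIHH20x", 3, 24, 1, 1, 2, 32 + 2 * 32 + 1, 1 + 10 + 4)
    + field_descriptor(b"NAME", b"C", 10)
    + field_descriptor(b"QTY", b"N", 4)
    + b"\x0d"
    + b" " + b"Widget".ljust(10) + b"  12"
    + b"*" + b"Gone".ljust(10) + b"   0"
    + b"\x1a"
)

names, rows = read_dbf(sample)
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(names)
writer.writerows(rows)
csv_text = out.getvalue()    # "NAME,QTY" then "Widget,12"; deleted row skipped
```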

  • bux93 9 hours ago
    The article states

    >By feeding legacy PRG (circa 1985) and logics to models like Claude, ChatGPT, developers can now instruct the AI to translate decades-old dBase PRG directly into memory-safe Rust, highly concurrent Go, or modern Dart/Flutter cross-platform applications.

    And it alludes to this early on, but it doesn't show any examples.

    • badsectoracula 3 hours ago
      I don't know about converting it to "high concurrent Go" or anything like that, but after searching a bit, i found some old PRG code for dBase III Plus someone posted in a googlegroup, gave it to Devstral Small 2 (local model) and asked it to convert it to C# and the conversion looked fine to me, using .NET's database functionality, etc. It is too long to post it though (also TBH the code had some questionable fields).

      In general LLMs seem to be very good at translating between programming languages and something like PRG uses very straightforward syntax and concepts. The first attempt by the LLM did a mostly one-to-one conversion using the console but i asked it to convert it to Windows Forms and the code looked fine for that too, using appropriate controls for the fields like text, combobox or datetime pickers (though it used fixed coordinates for the controls so i'm not sure if that looked fine).

      FWIW, i didn't try to run the code (i'm on Linux and i do not even have anything related to C# on my PC nor a DB to work with :-P) and chances are there might be some subtle mistakes, but it looked like a decent starting point. IME, at least with local models, converting code between languages in a piecemeal fashion is trivial even with weird/less common languages (you may need to put some instructions to the LLM on a few edge cases though). And IMO that approach would be the right way to do it instead of dumping the entire codebase to it and hoping for the best :-P

    • e12e 8 hours ago
      As an alternative to leaving this to an LLM I came across:

      https://github.com/infused/dbf/

      I'm not sure what the article suggests - create a custom rust program that reads and writes to a given dbf file? Create a rust program that mirrors the PRG code, writing/reading data in a custom format?

      • stblack 4 hours ago
        The answer is going to be, both. With ease.
  • allenu 1 hour ago
    I have vivid memories of thick dBase manuals on shelves in offices wherever somebody had an IBM PC or compatible computer. As a kid, I had no idea what it was, but the thick grey books made me think they must be a very important thing indeed. Just seeing the name dBase immediately evoked the memory of those books. Eventually I did get into computers and programming languages in my early teens but never did figure out what this whole dBase thing was.
  • orionblastar 3 days ago
    Microsoft Access 2.0 had filters to import and export data from and to DBF files. We used this in WFW 3.11 to convert from DBase to MS-Access and later on SQL Server.

    There was some Turbo C and Turbo Pascal source code that read DBF files, but hardly anyone used it. Most data was stored in text files that could be read by any application.

    • boshomi 7 hours ago
      MS Access can use DBF files almost as if they were standard Access tables. This was particularly useful when working with ESRI Shapefiles, as it allowed the DBF files to be edited in Access and the changes to be viewed directly in ArcGIS. When editing maps, Access was often more convenient than the ESRI Editor.
    • pjmlp 5 hours ago
      I have one such Turbo Pascal library for dBase access, bought via ads in the Portuguese programmers' magazine of the time, Spooler.
  • mercurialuser 14 hours ago
    You may compile .prg files with Harbour, an open source Clipper clone compiler, on github.

    Strange it is not cited in the post.

  • cwmma 4 hours ago
    Shapefiles - the legacy multifile format for geospatial data, still an incredibly important interchange format since it has basically universal support - are built around DBF files.

    So despite being an incredibly old file format it's still used constantly in the GIS world, and it's probably not going to go away: while it's not a good format, it does basically everything to at least a mediocre level, which can't be said for any of the newer formats, which tend to do a few things great but other things terribly.

    GeoJSON, for example, is great for interchange, but you can't really do in-place edits or even in-place seeking from disk; a shapefile can.

    SQLite allows great editing and seeking, but you can't use it in a browser without doing something complicated like compiling the SQLite binary to JS or Wasm.

  • pjmlp 13 hours ago
    One of my favourite DB systems, started with dBase III+ where our teacher made us enter the high-school library records, followed up with Clipper Summer '87, and shortly thereafter Clipper 5.x with its OOP extensions.

    Great productivity tool, garbage collected, compiled, in the constrained environment of MS-DOS PCs.

    The migration to Windows 3.1 took too much time, giving FoxPro, Access, Visual Basic and Delphi time to establish themselves in the same programming communities.

    Similar to other HNers, Clipper was also how I made my first attempts at working for others during high school.

    • vasac 9 hours ago
      Ah, Clipper was a major force in the late ’80s in Yugoslavia, as economic reforms enabled the widespread establishment of private companies that needed accounting software, and PCs became cheap enough for one-person shops to develop custom accounting solutions. It was the Wild West for a few years, with a zillion different applications, until some bigger players emerged.

      IIRC, it needed one or two 360K floppies for a full install (a pirated copy; maybe the legal distribution was larger - at that time, all software was pirated). Compiling was fast (on a computer where you type dir and can read the filenames appearing on the screen faster than the computer can print them), but linking was slow, so everyone replaced MS Link with Borland’s TurboLink, which was an order of magnitude faster. It didn’t support overlays, but there were ways to work around that.

      There was also documentation available in some third-party TSR app.

      Later, another linker became popular: Blinker, which had a bunch of interesting features, such as loading overlays into EMS memory and providing various security functions to help protect your software. But by that time, the writing was already on the wall for DOS.

      Funnily enough, many customers actually preferred DOS, since navigating with the keyboard was far faster than using a mouse, and Windows apps generally weren’t designed with keyboard navigation in mind.

      • pjmlp 8 hours ago
        Ah, Blinker! Never used it, but I remember the ads in magazines.

        Same in the Iberian peninsula regarding software acquisition: even during university, the same copy centers that copied books also offered catalogs of software we might like to have, or there were even street bazaars. Only in the 2000s did the government (in Portugal) actually start hunting down those practices.

  • julianz 15 hours ago
    My very first paid gig, aged 12, was figuring out how to print mailing labels from a Bondwell CP/M laptop running dBase II. Didn't enjoy it.
    • thbb123 14 hours ago
      Very similar story, in 1982: got paid at 15 to create a prospect database for a small business. What now would be called a CRM.

      The enterprise had to declare me as an apprentice for 'trade jobs', as it was against the law to give a regular salary to someone under 16.

      I remember my first paycheck with deductions for retirement, which pissed me off quite a bit.

  • peterpanhead 2 hours ago
    I wonder if someone will drop a new relational database to take on Postgres with a similarly permissive license. Postgres isn't great for every case, and MySQL and MariaDB are no safe bet as they can disappear at any time.
  • pkphilip 8 hours ago
    The Borland/Inprise/Embarcadero mess impacted a lot of software.

    They could have still been the king of the hill now if it weren't for the suits who completely ruined it after Philippe Kahn left the scene.

  • mamcx 13 hours ago
    Resurrecting this kind of language is one of my goals (https://tablam.org), but of course with different takes.

    I think the main gist - you work not as an app developer but as a db developer - is something that is missing from partial attempts like Access and such.

    BTW: Wanna join me or help?

    • smackeyacky 11 hours ago
      I want to say no. As a way of working, those dbms systems were a dead end. Not every problem is database tables, and having had a job replacing a dBase III system, I never want to see it or its ilk again.
      • mamcx 3 hours ago
        I'm fully aware of the limitations of those tools (I worked professionally as a FoxPro developer); that is what I meant about resurrecting the spirit but not the way it was implemented.

        The idea is to make things "relational" (with improvements) instead of fully "physical database tables", which is what ties you to a binary format.

        With this, the binary format stops being opaque, and the data can even be represented (tables and such) textually or via "standard" outputs like MessagePack.

        So think of `data Customer` as decoupling the idioms and programmatic interface from the specific storage. In Rust terms, each `data` is `serde`-like, so you can change and move between physical representations.

        BTW, this is how the relational idea was meant to be used.
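
        To make that concrete with something runnable (a rough Python sketch of my reading of the idea, not TablaM itself): the same logical relation serialized as JSON or as CSV answers the same SQL once loaded, so the physical format is orthogonal to the query interface.

```python
import json
import sqlite3

# The same logical relation, stored in two different physical formats.
AS_JSON = '[{"name": "Ana", "total": 120}, {"name": "Bob", "total": 80}]'
AS_CSV = "name,total\nAna,120\nBob,80"

def rows_from_json(text: str):
    return [(r["name"], int(r["total"])) for r in json.loads(text)]

def rows_from_csv(text: str):
    body = text.splitlines()[1:]          # skip the header line
    return [(n, int(t)) for n, t in (line.split(",") for line in body)]

def total_sales(rows) -> int:
    """The 'free' query interface: plain SQL over whatever storage produced rows."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customer (name TEXT, total INTEGER)")
    db.executemany("INSERT INTO customer VALUES (?, ?)", rows)
    (result,) = db.execute("SELECT sum(total) FROM customer").fetchone()
    db.close()
    return result

# Either storage format answers the query identically: total_sales(...) == 200.
```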

      • actionfromafar 10 hours ago
        80% of everything is crap anyway, no matter which tech stack. But I think something was lost: not everything is a database, but ever since Microsoft started ignoring MS Access, nothing is a database. Or rather, Excel is used as a database. That can't be good either.
        • smackeyacky 9 hours ago
          Oh 100% agree on Excel - it's no substitute for those dBase/Clipper/Fox systems.

          Y'know what? It's probably true that niche needs filling again as long as it isn't the dBase file format. I had to deal with one system that blew the documented max file size for dBase III but for some bizarre reason, the original dBase III executable didn't care.

          However, you couldn't load it with any of the ODBC drivers; they would all fail. Except for one obscure Sybase-based driver I have forgotten the details of.

          Just couldn't deal with it again I don't think.

          • mamcx 3 hours ago
            Yeah, this is totally correct. What was great was the ability to do things like `SELECT name FROM NameOfForm` (in FoxPro, forms were stored in tables, so you could run SQL on them). My point is that it's the "free" query interface that is great; whether the actual thing is stored as JSON, CSV, SQLite or whatever is orthogonal.
  • zer 7 hours ago
    One of my first jobs had a legacy application that was based on Clipper. I never had to work with it directly (since I worked on the SQL successor) but reading the source code was still fascinating.

    Semi off-topic: The wikipedia article on Ed Esber is in dire need of a clean up https://en.wikipedia.org/wiki/Ed_Esber

  • Crontab 3 hours ago
    I worked at a business that managed its inventory with dBase IV For DOS. I used it from 1997 to 2012, when the business closed. I personally liked it.
  • tobad357 15 hours ago
    I feel the timeline is wrong re when dBase Inc took over. I remember working as a consultant on shipping new features for dBase back in 2000 or so.

    I implemented reflection for the dBase language and was also part of trying to convert it to Visual C++ instead of using the Borland compiler. I was very green back then but it was interesting, my only time dealing with interpreters / compilers

  • flr03 7 hours ago
    I remember my father built his own personal accounting tool using dBase - I think it was on MS-DOS at the time; I was a kid. Quite the achievement, I think, as he was not a software engineer, just a hobbyist.
  • yathartha 3 days ago
    Fascinating obituary for dBase; software history repeats through neglect, litigation, complacency.
  • cyri 13 hours ago
    In 1998 I wrote a financial summary for our ERP system (EUROnet) running on MS-DOS with a dBase db in the backend. I connected the dBase db to a PHP 3 web server with Apache 1 and then summarized the sales data. My boss loved it: he could see numbers that weren't available in the ERP reports.
  • jhbadger 14 hours ago
    My first gig at 18 was managing my university library's database (in dBase III; it was the 1980s) and writing the user interfaces for searching. This was a pre-SQL database for you youngins in case you have no idea what I'm talking about.
  • pabs3 12 hours ago
    The dbase.com domain now appears to be down too.
  • Jemm 6 hours ago
    I worked in dBase professionally for a long time. Loved the system but the database constantly needing repair was an issue.
    • whartung 1 hour ago
      Switching from a "raw" b-tree system to an SQL system with transactions was an eye opening experience.

      The typical troubleshooting path with the b-tree system almost inevitably, and very quickly, led to a "rebuild the indexes" process which no one enjoyed.

      The transactions on the SQL system pretty much eliminated that error path completely. Only actual on-disk file corruption would lead to trouble. We could always (and did) post wrong data to the DB, and the DB did what it was told (right or wrong), but having the indexes lose sync with the base records was never a problem.

      It's hard to describe how refreshing that was.

      I'm sure there were b-tree systems with built-in transaction support to keep the base rows in sync with the indexes, but they were pretty late to the game and the SQL DBs had started taking over.

  • rimliu 5 hours ago
    dBase was the first DB system I ever touched. It was on a Yamaha MSX; not sure how faithful that port was. Fun anyway.
  • TMWNN 12 hours ago
    What the article describes doesn't apply to those who migrated already to FoxPro (and/or Harbour), yes?
  • cyberax 14 hours ago
    One of my first sizeable projects was a COM-compatible compiled language with .dbf support primitives for data transformation. As a unique quirk, it could even work on Novell NetWare to interface with Btrieve.

    NetWare supported loading PE executables, but it lacked memory protection, so developing for it was... fun.

    The .dbf format was pretty straightforward, though.