
  • 7
    The DEC BLISS compilers got to be pretty good; somewhere around 1981, I ran a test once where I rewrote some of my own MACRO-32 in BLISS-32, and the code was only 1.7 times larger, which I thought was fair enough. I read an interview with Ritchie in which he said that if DEC had let him have the BLISS-11 compiler, they would not have needed to invent C. But DEC was pretty close-fisted with BLISS. Commented Sep 12, 2020 at 23:08
  • 3
    As a counterpoint, I know that at least one Modula-2 compiler from the 1980s (for the ARM) was excruciatingly bad. On a CPU which had an integer multiply instruction and a rich register set, it would emit 20 instructions followed by a subroutine call to do an integer multiply. This was a big factor in the selection of "Arthur" (written directly in assembler) rather than its more sophisticated competitor for the Acorn Archimedes; Arthur could do things in 256KB that the other one couldn't do in 4MB. Commented Sep 13, 2020 at 7:35
  • 2
    @Chromatix And it cost how much and wanted what resources? Those were the days when, if you were serious, you bought yourself a Logitech compiler hosted on a VAX... none of this PC 640K crap. Which leads to the extreme case of Stallman's observation on the Pastel compiler: it built (and presumably optimised) the entire parse tree in memory before generating code. In the middle is the interesting case of Tree Meta, where the syntax equations contained an explicit directive to specify the point at which the tree should be "unparsed" to machine code. Commented Sep 13, 2020 at 9:06
  • @MarkMorganLloyd Cost was a real factor here - a 4MB machine was simply unaffordable at that time (circa 1986) though it could be built, and the way it swapped incessantly under even a light load meant users would not be getting what they paid for. "Arthur" - short for "A RISC operating system by Thursday" - did more in practice with far fewer resources, and allowed useful, fast programs to be written in interpreted BASIC. And you can run an updated version of it on a Raspberry Pi. Commented Sep 13, 2020 at 10:25
  • 2
    @Chromatix Exactly. But I think my reply still stands: provided that you had the resources, you could have an efficient compiler in the mid-70s. I'd suggest that even with the improved techniques available today, if you built a machine with 64K you'd be hard pressed to improve on the efficiency of something like CP/M Turbo Pascal. Commented Sep 13, 2020 at 10:53