Lunar Programming Language

by David A. Moon
January 2017 - January 2018



All Programming Languages are Wrong

Most current-day programming languages seem to be based on the idea that computation is slow, so the user and the compiler must work hard to minimize the number of instructions executed. Thus in most programming languages certain data types, such as numbers, are special cases that do not fit well into the general type system of the language, and hardware details such as the number of bits supported by an integer add instruction show through in the language semantics. Compromises to minimize instructions extend so far as to make familiar-looking operators like + and < behave in unintuitive ways. If as a result a program does not work correctly in some cases, it is considered to be the programmer's fault. But it is really the language designer's fault.
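
To make the point concrete, here is a minimal C++ sketch (assuming the usual platform where int and unsigned int are 32 bits wide) in which both + and < follow the hardware rather than ordinary arithmetic:

    #include <cstdint>
    #include <iostream>

    int main() {
        // Unsigned 32-bit addition silently wraps around modulo 2^32,
        // so + is not mathematical addition.
        std::uint32_t big = 4000000000u;
        std::cout << big + big << "\n";   // prints 3705032704, not 8000000000

        // Comparing signed with unsigned converts -1 to a huge unsigned
        // value, so < gives a surprising answer.
        int i = -1;
        unsigned int u = 1;
        std::cout << (i < u) << "\n";     // prints 0: -1 is "not less than" 1
    }

Neither line is an error in C++; both behaviors are exactly as specified, which is the point: the language defines + and < in terms of the hardware, and the surprise is left to the programmer.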

This made sense when Fortran was designed in the 1950s and BCPL in the 1960s, or even when C++ was designed around 1980. But it makes no sense today.

It does not reflect the realities of modern hardware, where computation is almost free, memory size is almost unlimited (although programmers' ingenuity in creating bloated software apparently knows no bounds), and the principal limit to performance is the cost of communication. For example, one cache miss might take as much time as a hundred add instructions. If it does not noticeably increase the size of the data or program, quite a large amount of extra run-time computation can be added to most programs with no effect on their total running time. This computation can be invested to give the programming language a more rational semantics and to remove common sources of hard-to-find errors.

Today's languages contain far too many features that do almost the same thing but have slightly different performance characteristics, for example non-virtual and virtual member functions in C++. This just encourages programmers to waste time on micro-optimization when they could instead invest time understanding the large-scale behavior of their program and optimizing that. Furthermore, excess features make programs more complex and harder to understand.

Another problem can be succinctly represented as . (the dot). Most current-day programming languages got off on the wrong foot because they seem to be in love with the idea of the "object" or "abstract data type." They over-use objects to organize and modularize everything. Furthermore, the misleading theoretical simplicity of "encapsulated" objects or abstract data types implies that methods belong to classes and are implemented entirely within a single class, which leads to method selection based on just one argument, typically the one placed to the left of the dot. Because this does not work very well in practice, languages have to add complex kludges to get around the problems: templates in C++, the distinction between classes and interfaces in C# and Java (with gradual leakage of class features back into interfaces as those languages evolve), and "partial classes" in C#.
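
A sketch of the underlying difficulty, in C++ with made-up class names (not Lunar code): any genuinely binary operation, such as adding two kinds of numbers, really needs to select its implementation based on both argument types, but the language dispatches only on the operand to the left of the dot, and the right operand's concrete type has to be recovered by hand with casts or a visitor.

    #include <iostream>

    struct Number {
        virtual ~Number() = default;
        // Dispatch happens only on the object to the left of the dot;
        // 'other' arrives as a plain Number, its concrete type hidden.
        virtual void add(const Number& other) const = 0;
    };

    struct Integer : Number {
        void add(const Number& other) const override {
            // The right routine depends on *both* operand types, but only
            // this one was selected by the language; discovering what
            // 'other' is requires a cast or a double-dispatch kludge.
            std::cout << "Integer::add, right operand type unknown here\n";
        }
    };

    struct Rational : Number {
        void add(const Number& other) const override {
            std::cout << "Rational::add, right operand type unknown here\n";
        }
    };

    int main() {
        Integer i;
        Rational r;
        const Number& a = i;
        const Number& b = r;
        a.add(b);   // selects Integer::add from a alone; b plays no part
    }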

How did this happen? I blame it on Algol in the 1960s and then Simula and Smalltalk, which essentially added dynamic features to Algol. From that point on, programming language design was off on the wrong foot, and it remains confused 50 years later.





Lunar by David A. Moon is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Please inform me if you find this useful, or use any of the ideas embedded in it.
Comments and criticisms to dave underscore moon atsign alum dot mit dot edu.