Catastrophes Caused by Programming Language Defects
What's covered: documented cases where precision loss, underflow, overflow, division by zero, null pointer dereference/NPE, and stack overflow led to deaths, injuries, aviation accidents, loss of spacecraft, military equipment failures, power plant/grid disruptions, stock exchange crashes, and blockchain incidents.
What's NOT included: missing characters in commands (Mariner 1, Phobos 1), the gotofail SSL bug, race conditions without an arithmetic component, unit-conversion errors (Mars Climate Orbiter), and sensor failures without overflow. These cases are well known but belong to other defect classes and therefore don't fit the Awsum argument.
Purpose: justifying the value of Awsum, a language in which the listed defect classes are eliminated at the language level.
6.11. Taxonomy of the blockchain cases
Solidity prior to 0.8.0 (December 2020) did not perform overflow checks by default; programmers had to apply SafeMath manually. Since 0.8.0 the language reverts on overflow, but unchecked { ... } blocks, forks of legacy code, and division-heavy DeFi math (rounding and precision-loss bugs in share-token, exchange-rate, and fee calculations) continue to produce vulnerabilities of the same language-level arithmetic class.
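The defect class can be sketched in Rust terms. This is a hedged illustration, not code from any real contract: the function names are invented, and `wrapping_mul` stands in for Solidity's pre-0.8.0 silent modular wrap, while `checked_mul` plays the role of SafeMath.

```rust
// Sketch of the BatchOverflow class (CVE-2018-10299) in Rust terms.
// Names (`batch_total_*`) are illustrative, not from any real contract.

/// Unchecked total: `wrapping_mul` mirrors Solidity <0.8.0, where
/// `cnt * value` silently wraps modulo 2^256 (here, modulo 2^64).
fn batch_total_unchecked(receivers: u64, amount: u64) -> u64 {
    receivers.wrapping_mul(amount)
}

/// SafeMath-style total: `checked_mul` returns `None` on overflow,
/// forcing the caller to handle the error path explicitly.
fn batch_total_checked(receivers: u64, amount: u64) -> Option<u64> {
    receivers.checked_mul(amount)
}

fn main() {
    // 2 receivers * 2^63 tokens wraps to 0: the contract deducts
    // nothing from the sender while crediting 2^63 tokens to each
    // receiver. The checked version surfaces the overflow instead.
    let huge = 1u64 << 63;
    assert_eq!(batch_total_unchecked(2, huge), 0);
    assert_eq!(batch_total_checked(2, huge), None);
    assert_eq!(batch_total_checked(3, 4), Some(12));
    println!("overflow caught: {:?}", batch_total_checked(2, huge));
}
```

The point of the sketch is that the safe variant is opt-in: nothing stops a programmer from calling the unchecked function, which is exactly the gap SafeMath left open.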
A 2025 systematic review (Rezaei et al., "SoK: Root Cause of $1
Billion Loss in Smart Contract Real-World Attacks",
arXiv:2507.20175) catalogs arithmetic over/underflow and precision
loss among the 24 active
(still-exploitable) vulnerability classes in 50 major attacks
2022–2025. The paper does not break the $1.09B total down by class;
for the named arithmetic-class incidents in its catalogue (cases
6.5–6.9 above), publicly reported losses sum to ~$46M, a lower bound that excludes the pre-Solidity Bitcoin case (6.1),
the pre-0.8.0 Solidity cases (6.2–6.4), and the long tail of
unaudited contracts. Case 6.10 (1inch Fusion v1, ~$5M) sits adjacent
to this set: the arithmetic underflow there appeared as a step in a
wider buffer/calldata-corruption attack rather than as the primary
defect, so its loss is documented separately and not included in the
$46M arithmetic-primary sum.
In 2009, at the QCon conference, Sir Tony Hoare (creator of Quicksort,
1980 Turing Award laureate) publicly apologized for inventing the null
reference in ALGOL W in 1965:
"I call it my billion-dollar mistake. It was the invention of the null
reference in 1965. (…) I couldn't resist the temptation to put in a
null reference, simply because it was so easy to implement. This has
led to innumerable errors, vulnerabilities, and system crashes, which
have probably caused a billion dollars of pain and damage in the last
forty years."
Hoare himself did not treat "billion" as a precise figure: he put the actual cumulative damage from NPE/NullPointerException in Java, NullReferenceException in .NET, and segfaults in C/C++ over 60 years at "somewhere between $100 million and $10 billion." Modern estimates are an order of magnitude larger.
Awsum eliminates null as a value of a reference type:
there is no "empty" state for a reference type — absence is
represented by an explicit constructor (Maybe,
Either). This closes the classical NPE / segfault class.
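The guarantee can be sketched in Rust, whose Option plays the role of the Maybe described above; `find_user` is an invented example, not an Awsum or Rust library function.

```rust
// Sketch: absence is an explicit constructor, not a null hiding
// inside every reference type. `find_user` is an invented example.

fn find_user(id: u32) -> Option<&'static str> {
    match id {
        1 => Some("alice"),
        2 => Some("bob"),
        _ => None, // absence is a typed value, not a null reference
    }
}

fn main() {
    // The compiler rejects any use of the result that does not
    // account for `None`; there is no path to a null dereference.
    let greeting = match find_user(3) {
        Some(name) => format!("hello, {name}"),
        None => "no such user".to_string(),
    };
    println!("{greeting}");
}
```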
It is worth being precise about how other languages, often credited with the same property, fall short. Haskell still has undefined / error inhabiting every type; Rust has panic! / .unwrap(); Swift has the ! force-unwrap and implicitly-unwrapped optionals; F# inherits null via .NET interop on every reference type imported from the BCL or C#; Idris has believe_me and is partial by default unless totality is explicitly enforced. Each of these is the same class of bug Hoare was complaining about — "a value the type admits, but the program crashes on" — under a different name.
Awsum has no language construct for a bottom value: no
undefined, no error, no panic,
no force-unwrap. Combined with mandatory exhaustiveness in
case (a missing constructor is a compile error, not a
runtime exception), this means "a value of type T that
crashes the program" does not exist — not just for null, but for the
wider family.
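The exhaustiveness half of this guarantee exists in today's languages too; a Rust sketch over an invented closed sum type (`Signal` is illustrative, not from any library) shows what "a missing constructor is a compile error" means in practice:

```rust
// Sketch of mandatory exhaustiveness over a closed sum type.
// `Signal` is an invented example type.

enum Signal {
    Red,
    Yellow,
    Green,
}

fn advice(s: &Signal) -> &'static str {
    // No catch-all `_` arm: if a fourth variant is ever added,
    // this match stops compiling instead of failing at runtime.
    match s {
        Signal::Red => "stop",
        Signal::Yellow => "slow",
        Signal::Green => "go",
    }
}

fn main() {
    assert_eq!(advice(&Signal::Red), "stop");
    assert_eq!(advice(&Signal::Green), "go");
    println!("{}", advice(&Signal::Yellow));
}
```

The difference the text claims for Awsum is that this discipline is unconditional: there is no `unwrap`, `undefined`, or panic escape hatch alongside it.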
The incidents above demonstrate that runtime defects in numeric and reference handling continue to kill people and burn billions of dollars more than 60 years after these defect classes were first identified.
The standard industry answer is external tooling: SafeMath in Solidity, MISRA-C for embedded systems, static analyzers, formal verification. All of these are external to the language: they are optional, they require discipline, and every missed SafeMath call is a potential BatchOverflow.
Awsum closes this gap through:
- No null as a valid value of a reference type — Hoare's mistake closed at the type-system level. Optional values must be Maybe; the compiler refuses to compile code that doesn't handle the absence case, and exhaustiveness is checked without a catch-all _ escape. null, nil, undefined, and NaN are not values of any type in the language.
- Honest integer arithmetic via Either — closing the Ariane 5 / Boeing 787 / Patriot / Vancouver / Therac-25 / BatchOverflow class. Every integer operation that can leave its range — overflow, underflow, precision loss, division by zero — returns Either with a typed error variant. The programmer must handle the error case explicitly; ignoring it is a compile error. No silent wrap, no Infinity, no -0. The same byte-identical behaviour on JVM, CLR, JavaScript, WebAssembly, and LLVM regardless of the host platform's native integer semantics.
- Stack-safe recursion of any shape — closing the unbounded-recursion class. Self-recursion, mutual recursion, and non-tail recursion all compile to a bounded-stack loop on every backend, including the JVM and JS, where the host platform has no native cross-method tail calls. No tailRecM, no manual trampolines, no "stack-safe in our test suite, blows up in production on the JVM."
- Platform effects gated at compile time — closing the runtime-surprise class. Each effect is bound to the program types it can run under; using a browser-only effect in a CLI program (or vice versa) is a compile error, not a 3 AM page.
- No silent decisions — closing the broader "I thought this value had a different type" / "I thought I was modifying the outer x" class. No defaulting of integer literals (the compiler never picks a type for the user), no implicit numeric coercion, no shadowing of any visible name by an inner binder. Closed sum types make every "what can this value be?" answerable from the type signature alone.
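As a rough analogue of the Either-based arithmetic described above, here is a Rust sketch using Result; `MathError` and both functions are invented for illustration, not Awsum's actual API.

```rust
// Sketch of "honest arithmetic": every out-of-range outcome is a
// typed error value the caller must handle. `MathError` is invented.

#[derive(Debug, PartialEq)]
enum MathError {
    Overflow,
    DivisionByZero,
}

fn add(a: i32, b: i32) -> Result<i32, MathError> {
    a.checked_add(b).ok_or(MathError::Overflow)
}

fn div(a: i32, b: i32) -> Result<i32, MathError> {
    if b == 0 {
        return Err(MathError::DivisionByZero);
    }
    // i32::MIN / -1 overflows; `checked_div` reports it as None.
    a.checked_div(b).ok_or(MathError::Overflow)
}

fn main() {
    // Every error path surfaces as a value, never as a silent wrap.
    assert_eq!(add(i32::MAX, 1), Err(MathError::Overflow));
    assert_eq!(div(10, 0), Err(MathError::DivisionByZero));
    assert_eq!(div(i32::MIN, -1), Err(MathError::Overflow));
    assert_eq!(div(10, 2), Ok(5));
    println!("all arithmetic errors surfaced as values");
}
```

In Rust this style is a library convention the programmer can bypass with plain `+` or `/`; the text's claim for Awsum is that the Either-returning form is the only form the language offers.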
Each of these guarantees
moves the defect from runtime to compile-time — for
every line of application code, on every supported backend.
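The stack-safe recursion guarantee can be approximated by hand in today's languages. A minimal Rust trampoline, with invented names, sketches the bounded-stack loop that such recursion is said to compile to; the claim in the text is that Awsum performs this transformation automatically for every recursive function.

```rust
// Hand-written trampoline: the recursive step is returned as data,
// and a loop drives it, so the call stack never grows. All names
// here are illustrative.

enum Step {
    Continue { n: u64, acc: u64 },
    Done(u64),
}

/// One step of factorial, expressed as data instead of a recursive call.
fn factorial_step(n: u64, acc: u64) -> Step {
    if n == 0 {
        Step::Done(acc)
    } else {
        Step::Continue { n: n - 1, acc: acc * n }
    }
}

/// The trampoline: a bounded-stack loop, regardless of how deep the
/// logical recursion goes.
fn factorial(n: u64) -> u64 {
    let mut state = Step::Continue { n, acc: 1 };
    loop {
        match state {
            Step::Continue { n, acc } => state = factorial_step(n, acc),
            Step::Done(result) => return result,
        }
    }
}

fn main() {
    assert_eq!(factorial(0), 1);
    assert_eq!(factorial(10), 3_628_800);
    println!("{}", factorial(20));
}
```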