Two Catastrophic Failures Caused by “Obvious” Assumptions
Both incidents involve smart people doing reasonable things and systems behaving exactly as designed.
- Mars Climate Orbiter (1999): lost because one team's software reported thruster impulse in US customary units (pound-force seconds) while the other's navigation software expected metric (newton-seconds).
- Citibank $500M error (2020): a routine interest payment on a Revlon loan turned into a transfer of the full loan principal because of ambiguous labels in the payment software's UI; roughly $500M of the mistaken payout could not be recovered.
The problem wasn't complexity but meaning: conventions about units and field semantics that existed only in people's heads, never encoded in the systems themselves.
This is a breakdown of how unstated assumptions turn into catastrophic technical debt.
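Neither incident report prescribes a fix, but the Mars-style unit mismatch is exactly what unit-bearing types prevent: a bare float carries no unit, so a value in pound-force seconds silently mixes with newton-seconds. A minimal Python sketch of the idea (the `Impulse` class and its names are hypothetical, not from either report):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Impulse:
    """Thruster impulse, always stored internally in newton-seconds (SI)."""
    newton_seconds: float

    @classmethod
    def from_pound_force_seconds(cls, lbf_s: float) -> "Impulse":
        # 1 lbf·s = 4.4482216152605 N·s (exact, by definition of lbf)
        return cls(lbf_s * 4.4482216152605)

    def __add__(self, other: "Impulse") -> "Impulse":
        return Impulse(self.newton_seconds + other.newton_seconds)

# Every producer must declare its unit once, at the boundary; after that,
# arithmetic only works between Impulse values, never raw floats.
ground = Impulse.from_pound_force_seconds(1.0)
flight = Impulse(4.4482216152605)
assert abs(ground.newton_seconds - flight.newton_seconds) < 1e-9
```

The same pattern (wrap the raw number at the system boundary, make the unit part of the type) is what off-the-shelf libraries like `pint` automate; the point is that the convention moves out of people's heads and into code the interpreter can check.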
submitted by /u/Vast-Drawing-98