As people planned for this New Year's celebration, most did not stock up on canned goods, withdraw large amounts of cash from the bank or purchase a backup generator. But 20 years ago, excitement about the start of the year 2000 was mixed with fear that the date rollover might cause computer systems to fail globally, with potentially apocalyptic consequences.
Years had been stored as two digits ("99" rather than "1999"), and the rollover to "00" might cause catastrophic failures: Would the lights stay on? Would the banks fail? Would planes fall out of the sky? These questions, prompted by what became known as the Y2K bug, overshadowed the transition to the new millennium.
But as the Washington Post announced in a Jan. 1, 2000, headline: "Y2K bug has no bite so far." The millennium bug had been squashed.
To the extent that Y2K is remembered today, it is largely as something of a joke: a massive techno-panic stoked by the media that ultimately amounted to nothing. Yet avoiding catastrophe was the result of serious hard work. As John Koskinen, the chairman of President Bill Clinton's Council on Year 2000 Conversion, testified in the early weeks of 2000, "I don't know of a single person working on Y2K who thinks that they did not confront and avoid a major risk of systemic failure."
The danger was averted because a group of experts recognized the problem and brought it to the attention of those in power, and because those in power actually listened. Twenty years later, we can look at Y2K with derision not because it was a hoax, but because concerned people took the threat seriously and did something about it.
It's a lesson for addressing myriad problems today.
The origins of the Y2K problem seem almost quaint at a time when people routinely carry hundreds of gigabytes of memory in their pockets. But in the 1960s, computer memory was limited and expensive. Programmers therefore chose to represent dates using six characters rather than eight, meaning the date Oct. 22, 1965, was rendered as 102265. This format worked for decades, saving valuable memory and processing time. Insofar as it represented a potential risk, it was a problem for the future. And thus, even though computer scientist Robert Bemer tried to sound the alarm as early as 1971, it seemed like there was plenty of time to fix the two-digit date problem.
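To make that format concrete, here is a minimal sketch in Python; the function name and details are illustrative, not drawn from any actual 1960s system:

```python
# A minimal sketch of the six-character MMDDYY format described above.
# Illustrative only, not code from any real legacy system.

def encode_date(month: int, day: int, year: int) -> str:
    """Pack a date into six characters, keeping only the last two year digits."""
    return f"{month:02d}{day:02d}{year % 100:02d}"

print(encode_date(10, 22, 1965))  # "102265", as in the article
print(encode_date(10, 22, 2065))  # also "102265": the century is simply discarded
```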
Computers assumed that the first two digits of any year were "19," and they began running into problems once they encountered dates after Dec. 31, 1999. These problems ranged from the seemingly comical (a 104-year-old woman being told to report to kindergarten), to the frustrating (credit cards with expiration dates of "00" were declined as having expired in 1900), to the potentially horrifying (the risk that essential infrastructure could fail).
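The everyday failures above follow directly from that assumption. Again a hedged, illustrative sketch in Python rather than any real system's code:

```python
# Illustrative only: the "century is 19" assumption and two of its failures.

def decode_year(packed: str) -> int:
    """Recover a year from MMDDYY, assuming the century is "19"."""
    return 1900 + int(packed[4:6])

print(decode_year("010100"))  # Jan. 1, 2000 reads back as 1900

# The 104-year-old kindergartner: born 1896, stored as "96". In the year
# "00", naive two-digit arithmetic yields an age of 4.
print((0 - 96) % 100)  # 4

# The declined credit card: an expiry of "00" compares as less than the
# current year "99", so the card looks like it expired long ago.
print(0 < 99)  # True: treated as already expired
```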