A Glitch That Was Always in the Cards
By Rajiv Chandrasekaran
Washington Post Staff Writer
Sunday, August 2, 1998; Page A24

Decades ago, when computer operators wore white lab coats and wrote programs on punch-card machines, space was a very precious commodity. A card might have room for only a few dozen characters of information. So programmers consciously decided to save two characters by expressing the year with only the final two digits, presuming that the first two were 1 and 9.
Some programmers realized the shortcut wouldn't work in the new millennium (computers would read "00" as 1900, not 2000), but they figured their creations would be obsolete well before the century's end.
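The flaw the article describes can be seen in a few lines. The sketch below is illustrative, not from the article: a hypothetical `age_in_years` routine on two-digit years breaks at the century boundary, and `expand_year` shows "windowing," one common repair technique of the era, with an assumed pivot of 50.

```python
def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Naive subtraction on two-digit years, as in many legacy records."""
    return current_yy - birth_yy

# A person born in 1965, checked in 1998: the shortcut works.
print(age_in_years(65, 98))   # 33

# The same person checked in 2000, stored as "00": the age goes negative.
print(age_in_years(65, 0))    # -65

def expand_year(yy: int, pivot: int = 50) -> int:
    """Windowing fix: two-digit years below the pivot are read as 20xx,
    the rest as 19xx. The pivot of 50 is an assumption for illustration."""
    return 2000 + yy if yy < pivot else 1900 + yy

# With expanded years, the subtraction is correct again.
print(expand_year(0) - expand_year(65))   # 35
```

Windowing only postpones the problem (here, records before 1950 or after 2049 would still be misread), which is why many repair efforts instead widened the stored field to four digits.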
It didn't turn out that way. Software began to evolve, almost like life forms. Old programs often provided the foundation for new software, a practice that resulted in various early programming conventions, including the two-digit year-dating system, being passed down through the generations.
Even as cardboard "Hollerith cards" gave way to magnetic storage devices, memory remained prohibitively expensive through the 1970s and early 1980s. That led many programmers to deliberately continue the two-digit practice.
It wasn't until the late 1980s that the problem really began to loom. Yet many computer experts preferred to temporize, believing that a fix would be found or that systems would be replaced.
"Without spending much time considering it, the software writers continued to think, 'Well, we'll soon replace this, and if we fix the numbers, well, then we'll have to go back and fix it all over the place,' " Vice President Gore said in a speech earlier this month.
In hindsight, most experts believe that the money saved by programmers over the years with the two-digit convention doesn't come close to covering the cost of today's repairs. "It saved millions of dollars," Gore said, "but it also created one whale of a problem."
© Copyright 1998 The Washington Post Company