I'm not sure whether this has already been explained somewhere else, but I'll explain it again anyway. Basically, people thought Y2K was going to wreak massive havoc because most computers stored years in two-digit form. The fear was that all kinds of explosions would happen when the stored year 99 ticked over and "100" tried to occupy a two-digit space. I'm writing this article to give a calm look at what problems could actually have occurred.
First, and obviously, "100" is more than two digits. My guess is that most systems would have kept either the first two digits ("10") or the last two ("00"). That would have made date comparisons go a bit noodly, since the "new" year suddenly compares as earlier than the old one, but nothing world-ending. There might have been slightly more serious issues in business applications that tracked trends over continuous periods of time.
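To make that concrete, here is a minimal C sketch (my own illustration, not any real system's code) of what happens when a two-digit year wraps from 99 to 00: the arithmetic quietly goes negative rather than anything blowing up.

```c
#include <stdio.h>

/* Hypothetical record that stores only the last two digits of the year,
 * as many pre-2000 programs did to save space. */
struct record {
    int yy;   /* 0-99: 99 means 1999, 00 means 2000 after truncation */
};

/* Naive elapsed-years calculation that assumes both years share a century. */
int years_between(struct record start, struct record end)
{
    return end.yy - start.yy;
}

int main(void)
{
    struct record opened = { 95 };             /* account opened in 1995    */
    struct record today  = { (99 + 1) % 100 }; /* 1999 + 1 truncates to 00  */

    /* Prints -95 instead of 5: the comparison runs "backwards", but the
     * program keeps running -- the arithmetic is merely wrong. */
    printf("account age: %d years\n", years_between(opened, today));
    return 0;
}
```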
A rarer problem case involves a hypothetical application that uses an unsigned byte two-digit year as an array index. Frequently when using arrays, one must access the previous element (n - 1). Doing so for year 00 would attempt to access array index -1, an out-of-bounds access that could crash the program outright or quietly corrupt whatever happened to sit next to the array. Such an application would break hard when the century turned, possibly corrupting data but not causing all nuclear warheads to spontaneously launch.
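Here is a hedged C sketch of that case; the yearly-totals table and function names are made up, but they show how year 00 turns a routine "previous element" lookup into an out-of-bounds access.

```c
#include <stdio.h>

#define NUM_YEARS 100

/* Hypothetical yearly totals, indexed directly by the two-digit year. */
static long totals[NUM_YEARS];

/* Compare this year's total to last year's by looking at index year - 1. */
long change_from_last_year(unsigned char year)
{
    /* For year 00, year - 1 evaluates to -1 (the unsigned char is promoted
     * to int before the subtraction), so this reads totals[-1]: an
     * out-of-bounds access and undefined behavior in C. Depending on the
     * platform it crashes outright or silently reads or corrupts whatever
     * memory lies just before the array. */
    return totals[year] - totals[year - 1];
}

int main(void)
{
    totals[98] = 1200;
    totals[99] = 1300;

    printf("99 vs 98: %ld\n", change_from_last_year(99)); /* fine: 100     */
    printf("00 vs 99: %ld\n", change_from_last_year(0));  /* boom (maybe)  */
    return 0;
}
```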
The last situation I can think of is a punch card reader (maybe kind of like the ones that read your ACT bubbles) that has two columns of digits for the year. People (especially very old people) would have a bad time recording their date of birth: the machine would have to assume the writer meant 19nn, because the two digits alone give no way to tell which century the year belongs to. This was remedied easily enough by adding an adjacent true/false bubble for "is 21st century?"
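A quick C sketch of how that extra bubble resolves the ambiguity (the struct and field names are hypothetical; a real reader would decode the card columns first):

```c
#include <stdio.h>

/* Hypothetical decoded form fields: two year digits plus the extra
 * "is 21st century?" bubble described above. */
struct year_field {
    int yy;              /* 0-99, read from the two digit columns */
    int is_21st_century; /* 1 if the extra bubble is filled in    */
};

/* Resolve the ambiguous two-digit year into a full four-digit year. */
int full_year(struct year_field f)
{
    return (f.is_21st_century ? 2000 : 1900) + f.yy;
}

int main(void)
{
    struct year_field born_1923 = { 23, 0 };
    struct year_field born_2001 = {  1, 1 };

    printf("%d\n", full_year(born_1923)); /* 1923 */
    printf("%d\n", full_year(born_2001)); /* 2001 */
    return 0;
}
```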
So, there really was no danger of appalling disaster from computers misunderstanding the new century. I'm not sure how the rumor got started, but it was certainly retold melodramatically.
A question really. How is 100 more than three digits?
Anything is possible when the article's author is semi-coherent. Fixed. :)