12:01 a.m. Saturday morning meant a lot of things to me, but one of the immediately gratifying aspects of this New Year's Day was seeing the lights still on: Woo-hoo! We won!

None of the Y2K predictions of doom came true; the city stayed lighted, and the champagne continued to pour. I had the gleeful urge to pick up a still-functioning phone and tell the nearest bunker-dwelling, Spam-hoarding Y2Kultist where to go stuff his millennium-bug paranoia.

Now it can be told: I am profoundly happy to be rid of Y2K. I was sick of it 18 months ago; I'll be thrilled to see that abbreviation, which seemed too clever at the start, retreat into encyclopedias and quiz shows forevermore. True, I'm glad that things seem to have worked out--that is to say, computers are functioning at their usual levels of bugginess. But I'm also just glad to see the prophets of doom proved so utterly wrong in their predictions. Y2K needed to be solved, but it didn't need to get inflamed into the political, quasi-religious subject it became in certain quarters.

Instead of remaining a computing problem and a technology issue, it became some sort of metaphor about what kind of people we want to be, what sort of society we live in, and--a lot of the time--how we all somehow deserved to be punished. For relying too much on technology we don't understand, for spending too much time in cities, for not believing in God enough--I couldn't even keep straight what it was we were supposed to be doing wrong. When people needed to hear what had to be fixed or worked around, they got lectures instead about how they were bad people for not exhibiting enough angst.

And to what end? I was never sure what people expected the average person to do about all this in the first place. If the problem was so complex that it would require the sustained attention of millions of programmers to fix, then what the heck were laypeople supposed to do about it? I mean, if the computer programmers couldn't persuade their bosses to give this project the support it deserved, I couldn't see how the powers that be would listen to a bunch of computer users.

The Y2K doomsayers did, however, scare the bejeezus out of a lot of people. Part of my job as an alleged expert became reassuring people: "No, I don't think the world is going to end. We may have some annoying glitches and bugs, but I really don't think we're heading into a new Dark Ages or a new Great Depression."

The funny thing is, most of the predictions of civilization's collapse, however absurd, began with some fairly straightforward logic. Discussions about the millennium bug always seemed to turn on two premises. One, there are X many computers out there to fix (a point emphasized with the ritual invocation of the phrase "embedded systems," referring to the millions of chips running hard-coded programs squirreled away inside buildings and machines). Two, it would be mathematically impossible to fix and debug that many systems before the 1/1/2000 deadline. Ergo: We're all gonna die!
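For readers who never saw the bug up close, here's a minimal sketch of the arithmetic at its heart. The names and values are hypothetical, but the failure mode is the classic one: programs that stored only the last two digits of the year, and the nonsense that falls out when "99" rolls over to "00."

```c
#include <stdio.h>

int main(void) {
    /* Many legacy systems stored years as two digits to save memory. */
    int birth_year   = 65;  /* meaning 1965 */
    int current_year = 99;  /* meaning 1999 */

    /* In 1999 the subtraction happens to give the right answer. */
    printf("Age in 1999: %d\n", current_year - birth_year);  /* 34 */

    /* On rollover, "2000" is stored as 00 and the math goes haywire. */
    current_year = 0;
    printf("Age in 2000: %d\n", current_year - birth_year);  /* -65 */

    return 0;
}
```

Multiply that one bad subtraction across millions of programs and chips, and you have the raw material for the doomsday math.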

This argument was backed up with some real-world evidence. Consultants and programmers pointed to numerous tests that caused some unexpected, unsettling failure, such as a power plant shutting itself down when a critical controller rolled over to 1/1/00. And the history of high-priority, high-pressure computing projects led by governments or big corporations is pockmarked with dismal failures, from the IRS on down to the Montgomery County school system. In certain cases, throwing more money and manpower at a project has only slowed it down further.

So a lot of people found this argument convincing--apparently convincing enough that their warnings began to take on the certainty and conviction of the newly converted.

"Does it make any sense to plan a three day millennium celebration on the Mall, culminating in a News Year's Eve fireworks display?" asked one local activist in July. "What can we do to have this nonsense canceled?" Pondering worst-case scenarios became a sick sort of parlor game for the more imaginative pundits and forecasters. Where would things wind up? Runs on banks? A stock-market crash? An economic depression? Food riots? Martial law?

Don't get me wrong: I don't think this was some hoax cooked up by the systems departments of the world to get everybody else to buy them some new toys. A lot of work had to be done to fix, patch or detour around these bugs; this effort forestalled some colossal unpleasantness.

But I couldn't accept the worst-case scenario, for two reasons:

1. How can you mathematically prove that something is "too difficult" or "too complex" to fix? I suspect that, using the same arguments as the true Y2K believers, you could have crafted a convincing proof of the impossibility of sending two people to the moon and back by 1970. Or of increasing the memory of computers more than 500-fold from 1980 to 2000 while cutting their cost.

2. Even if the machines did malfunction, why wouldn't people be able to cope in one way or another? Computers may not be fault-tolerant, but people are: Years of dealing with marketing-driven, commercially developed software have given a significant chunk of the world's population lots of practice at working around the limitations of computerized devices. The fixes people come up with on deadline may not be pretty or elegant (setting a computer's clock back, say, or the date "windowing" sketched below), but they usually work. Meanwhile, the rest of the world may not excel at debugging Windows, but it does have some acquaintance with power blackouts and telephone outages. After all, the population of New England was not reduced to cannibalism during the ice storms of previous winters.
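To give one concrete example of an inelegant fix that worked: much of the actual remediation relied on "windowing," interpreting a two-digit year relative to a pivot rather than rewriting every record. Here is a minimal sketch; the pivot value of 70 is an illustrative choice, not any particular system's setting.

```c
#include <stdio.h>

/* Windowing: expand a two-digit year by picking the century
 * based on a pivot. Crude, but it shipped on deadline. */
#define PIVOT 70

int expand_year(int yy) {
    /* Years at or above the pivot read as 19xx; below it, 20xx. */
    return (yy >= PIVOT) ? 1900 + yy : 2000 + yy;
}

int main(void) {
    printf("99 -> %d\n", expand_year(99));  /* 1999 */
    printf("00 -> %d\n", expand_year(0));   /* 2000 */
    printf("65 -> %d\n", expand_year(65));  /* 2065: the bug isn't fixed,
                                               just postponed */
    return 0;
}
```

Note what the last line shows: the workaround doesn't eliminate the ambiguity, it just shoves it decades down the road--exactly the kind of good-enough patch that kept the lights on.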

The Y2K hypothesis treated people as yet another system to be analyzed, traced and debugged, with actions that could be graphed like any other formula. We're not.

These are things worth pondering when the next computer-induced crisis comes around. The machines we make can do some very smart things. But we should not forget that we can still be smarter than they are--nor should we forget how to act that way. The Y2K bug seems to be history, but all the other ones are here to stay.