My colleague Iain Murray informs me that, in the name of eliminating time-wasting government paperwork, the Office of Management and Budget will no longer require federal agencies to report on their vulnerability to the Y2K computer bug. There is perhaps no better example of zombie government than expecting regular reports on a software glitch that, by definition, ceased to be a real concern over 17 years ago. Better late than never.
The Y2K scare, and the predicted millennial catastrophe that never happened, is also the topic of the most recent installment of Freethink Media's "Wrong!" video series on high-profile expert predictions that never quite panned out:
In the months and days leading up to the year 2000, many grew alarmed that a computer bug would collapse networks and bring down economies and global stability in its wake. Did we narrowly avoid the apocalypse because of some world-saving last minute de-bugging? Or was the worldwide panic just way off base?
For a clearer-eyed view of the technology challenges we faced around the turn of the century, look through the CEI High-Tech Briefing Book (2001). As CEI founder and then-president Fred L. Smith, Jr. wrote in the introduction:
Regulating new technologies is especially risky. The high-tech sector is an evolving frontier. Frontiers are always a bit chaotic, and it is inherent in their nature that mistakes, abuses, confusion, and irritation will occur. These trigger emotional cries of “There oughta be a law,” but quick-triggered reactions to novel situations rarely create good law or policy. It is far better to let the innovation and creativity of the marketplace work out the problems, and to resort to regulation only after experience shows that a problem exists for which regulation is the appropriate answer, and the nature of the necessary regulation is clear.
Still good advice today.