Manhattan Institute


The Bug That Didn't Bark

January 4, 2000
Energy & Environment | Technology / Infrastructure

Call it the anchorman law of technological catastrophe: If Peter Jennings can see it coming, then it isn’t. Millions of computers proved as much when their date registers rolled uneventfully into 2000, but this is a very general law, one that applies to nuclear reactors, cruise missiles, hydroelectric dams, jumbo jets, heart valves and genetically altered strawberries.

As dire techno-predictions go, Y2K was notable because it was precise enough to be proved altogether wrong, on a date certain. The anchorman told us just what was going to happen to our lights and phones, and when. And, right on schedule, it didn’t. But a conscientious anchorman can perform as well with weaker stories on looser deadlines, and he routinely does. I pick on anchormen, but many of us do the same thing at the office water cooler or a cocktail party: make short, sonorous pronouncements on technical subjects we little understand.

Why do such prognostications so reliably miss the mark? The sound bite, to begin with. The anchorman (or his off-the-air mimic) has to explain a snafu’s cause and likely effect in two minutes flat. But anything that’s real and explainable in so few words is fixable by 10,000 biochemists, engineers or programmers at Merck, Monsanto, or Microsoft. If the anchorman can not only grasp the essence of a technical problem but convey it to millions of people who confuse a CD-ROM tray with a cup holder, then Bill Gates or Boeing can get a grip on it too.

High self-esteem helps as well. The anchorman is pleased to suppose that he discerns important things that smug techies have missed. Sure, they can etch a million gates on a sliver of silicon, or light up a city with a cupful of uranium, but the anchorman grasps big things, like the calendar and the weather and how the fabric of industrial society might tear apart as early as next week. Calendar and weather can get tedious, though, and ordinary folk seem to hold smug techies in high regard these days. How pleasant for the anchorman to be able to report that they’ve botched something so simple, which is going to cause ordinary folk no end of trouble.

Pleasant, but quite reliably wrong. Engineers smart enough to build real, useful systems tend to be disciplined thinkers and comprehensive planners. And in the large teams in which they typically work, they’re far better at peering over the horizon than a well-coiffed man with a TelePrompTer. By the time the anchorman discerns technological collapse out ahead, it’s safe to assume that the fix is already in, if one is needed at all.

Market forces overwhelmingly favor the fixers. People with money are forever scheming to postpone catastrophe until after their stock options vest. A terrible dread must have gripped all anchormen to the west when the lights stayed ablaze at midnight Friday in Moscow. Granted, the old Soviets had stolen all their computer technology from the U.S., but still, if the Y2K bug couldn’t make it there, in the rubble of a centrally planned economy, it certainly wasn’t going to make it in New York. That would have cost Wall Street real money. No avaricious capitalist was going to lose all his capital over some pesky defect in code.

To be sure, accidents still happen. Unpredictable ones. Grids black out; shuttles explode; planes crash; tankers founder. But if the anchorman correctly describes such disasters, he does so after the O-ring fails. When he makes the call ahead of time, he either makes it dead wrong or he makes it so vague that it covers all O-rings in every time and place, along with the cylinder gasket on cousin Vinny’s Buick.

Anchormen didn’t invent the art of appearing to make concrete calls about high-tech failure without really doing so. For this innovation credit doleful sociologists, talk-show catastrophists and op-ed worrywarts. They keep their timing vague, their event horizons distant and their details as solid as pudding. Their pronouncements bring to mind the irascible physicist Wolfgang Pauli, who once remarked of a manuscript he’d been sent to review: “This paper isn’t even good enough to be wrong.”

In his own defense, the anchorman may insist that he just reports what others foretell, but he’s being too modest. He picks from a vast and varied catalogue of predicted disaster, and makes clear what most professional oracles take great pains to muddy. He’s the one who transforms rope-a-dope punditry into knockout news. When it comes to the concise, lucid and momentously wrong prophecy of high-tech doom, none can match him.

Give the Y2K crowd its due: Its story was specific enough to be proved wrong. It concerned a problem that wasn’t altogether imaginary. Old code, with two-digit dates, really did have to be fixed. It was. In free countries with free markets, technical problems usually do get fixed, well before the famine arrives, the fuel runs out, the government defaults, the Web crashes, the pandemic is unleashed or the ice cap melts. And that’s the way it was, Saturday, Jan. 1, 1900.
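The two-digit-date defect the column refers to is simple enough to sketch. The function names and the "windowing" repair below are illustrative assumptions, not anything drawn from the column itself; they show only the general shape of the bug and one common class of fix:

```python
def is_later(yy_a, yy_b):
    """Naive comparison of two-digit years, as much pre-2000 code did it.
    Works fine within a single century; breaks at the rollover."""
    return yy_a > yy_b

def is_later_windowed(yy_a, yy_b, pivot=50):
    """One common remediation ('windowing'): two-digit years below the
    pivot are read as 20xx, the rest as 19xx. The pivot value here is
    an arbitrary illustrative choice."""
    def expand(yy):
        return 2000 + yy if yy < pivot else 1900 + yy
    return expand(yy_a) > expand(yy_b)

# 1999 vs. 2000, stored as 99 and 00:
print(is_later(99, 0))           # naive code says 1999 comes after 2000
print(is_later_windowed(99, 0))  # windowed code gets the ordering right
```

Windowing was only a stopgap, of course; the durable fix was widening date fields to four digits, which is the tedious, unglamorous work the article says got done on schedule.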