Commentary by Mark P. Mills

The Policymaker's Dilemma: Believe Economists or Innovators?


The economy is picking up a little steam. But it’s still a far cry from the go-go growth fondly remembered from the Reagan and Clinton eras. And when it comes to forging policies intended to accelerate the economy, we have a clear division between two camps. On one side are the optimists, who believe much faster growth is possible. On the other are pessimists, who assert that lower economic growth is the “new normal.” 


This divide has become increasingly political, with the Trump administration and most Republicans pitted against Democrats and some Republicans. But the nature of this debate predates Trump and cuts across conventional political divides. In fact, it has engaged economists and policymakers throughout the nearly two decades of slow growth since the turn of the century.

This divergence has important implications because what we believe about the future directly impacts planning and policy decisions made today. For instance, the Federal Reserve recently projected that the economy will likely sputter along in the coming years at barely half the GDP growth rate the nation averaged during the last part of the 20th century. If those economists are right and this “new normal” paradigm is inevitable, then policy should be aimed at little more than palliative care for an ostensibly mature late-stage society. 

But what if they’re wrong?

In dispute is not whether we’re going to have any more innovation at all, but whether it will be truly significant — enough to reanimate the kind of economic growth we’ve enjoyed in the past. The closest economists get to having a law of physics is that increasing productivity is the primary driver of economic growth. And a key driver of productivity is technology, as Robert Solow showed six decades ago (work for which he won the Nobel Prize in 1987). But, as the “new normalists” like to point out, productivity growth has been low and stagnant for some 15 years. Absent foundational innovations, there is no prospect for a return to higher productivity growth. And without that, we do face a dismal economic future indeed.
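For readers who want the mechanics behind that claim, Solow’s result is usually expressed as a standard growth-accounting identity from the textbooks: output Y is produced from capital K and labor L, scaled by a technology term A (total factor productivity),

\[
Y = A\,K^{\alpha}L^{1-\alpha}
\quad\Longrightarrow\quad
\frac{\dot{Y}}{Y} \;=\; \frac{\dot{A}}{A} \;+\; \alpha\,\frac{\dot{K}}{K} \;+\; (1-\alpha)\,\frac{\dot{L}}{L},
\]

where α is capital’s share of income. The leftover term, the famous “Solow residual” \(\dot{A}/A\), is the part of growth not explained by adding more capital or labor; it is the formal stand-in for technology-driven productivity, and in Solow’s original estimates it accounted for the great majority of American productivity growth in the first half of the 20th century.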

The core problem with this thesis, though, is that it misreads the recent record of slow productivity growth and its implications. The fact is we have been living through an interregnum between great technological cycles. Radical changes in technology don’t arrive in convenient, continuous steps but, as history makes clear, burst forth episodically.

Thus the challenge policymakers face when making plans that depend on assumptions about productivity is that those assumptions are themselves always based on technology forecasts. And when it comes to forecasting technology, the track record of most pundits, politicians, and (especially) economists is pretty dismal.

But while the quasi-profession of forecasting may be a dubious science, it is a serious business nonetheless. The key to forecasting — which so many economists miss — is that, as Nobel Prize-winning physicist Dennis Gabor put it in 1963, “the future cannot be predicted, but futures can be invented.”

Consider what has recently happened in just one domain: energy. Here we can discern a broader pattern of technological myopia that is indicative of what is wrong with today’s forecasts about other sectors of the economy, such as manufacturing and health care, which together constitute nearly one-third of the economy.

For decades the accepted wisdom was that there was no prospect for technologies that could affordably produce hydrocarbons at the scale society would need in the future. In economic terminology: Hydrocarbon technology productivity had stalled out. We now know that dismal forecast was wrong.

A new technological approach, unlocking hydrocarbons in America’s vast shale fields, turned out to be astonishingly productive. Those on the front lines of that revolution were rarely, if ever, visible in the public discourse. Instead, policymakers were seduced by aspirational forecasts for alternative energy technologies.

Meanwhile, shale technology delivered the biggest addition to world energy supply in the shortest period of time in history. In the past decade, the combined growth of oil and gas production from shale tech added 1,500 percent more (15 times as much) to U.S. energy supply as the growth of solar and wind combined. Counting biofuels along with solar and wind, those sources collectively supply barely 5 percent of U.S. energy demand today; oil and gas meet two-thirds.

Those few forecasters who did anticipate what would actually happen were generally ignored or viewed as engaged in “old think” or in the “pockets” of entrenched industries. Meanwhile, their critics’ erroneous energy technology forecasts led to a cumulative $500 billion in government spending over four decades in the pursuit of replacements for hydrocarbons. 

The lessons one should derive from the history of energy technology are two-fold. The first is that noisy public debate and aspirational forecasts can obscure the real underlying trends. The second is that what appears to be an end to innovation is often a pause between eras as engineers and industries perfect and begin to adopt new foundational technologies.

The question now is: what predictive technological “signals” are we missing today? The energy sector is an important part of the economy, but manufacturing and health care are, respectively, 150 percent and 200 percent bigger (that is, 2.5 and 3 times its size).

With regard to manufacturing, the current narrative is that productivity gains are nearly maxed out; enhanced automation will merely add efficiency, displacing more workers in a declining domain. In health care, we see a different kind of technological pessimism: consumer demand for health care is forecast to grow far faster than the productivity of health-care services, thus driving costs into the stratosphere.

In both cases, today’s pessimists are once again mistaking a hiatus between technological eras for evidence of secular stagnation.

Consider manufacturing. The idea that there’s an inevitable collapse in manufacturing as a share of modern nations’ GDP and employment is negated by the examples of Germany and Japan. Neither has experienced the sharp declines seen in America over the past decade. Evidence points rather to the dual insults of increased regulations and high taxes as the major causes of the recent decline in U.S. manufacturing. 

It has been fashionable, however, to blame robots for taking factory jobs. The problem with this narrative is that the data show that the manufacturing sector has been under-investing in information technology and automation. IT spending as a share of revenue in manufacturing is about one-fourth that seen in the information-centric sectors of media, banking, education, and insurance. This is probably why productivity growth has been sluggish in the manufacturing sector — the opposite of what you’d expect if robots were putting people out of jobs. 

Still, information technologies — sensors, computers, and communications — have quietly advanced far enough to finally meet the much more demanding requirements of industrial applications. The manufacturing sector is in the early stages of a transition to making everything “smarter” and more efficient. And there is a contemporaneous “hidden” revolution under way in two core aspects of manufacturing: fabrication machinery and fabrication materials.

For example, the commercialization of 3D printers will enable a kind of “mass customization,” different in kind from the mass production that has characterized industry for the last century. 3D printers allow the fabrication of components impossible with conventional machines. Also making their way into the mainstream are “cobots,” robots that work safely alongside humans, which will boost productivity and also help fill the skills gap.

The materials revolution that is brewing has features similar to the dawn of the age of chemistry a century ago. And, like that previous transformation, it will enable entirely novel kinds of products. Today, supercomputing combined with the so-called “materials genome” is ushering in an era of computationally designed materials making possible new classes of ultra-high-strength and lightweight materials, and even metamaterials that exhibit properties that don’t exist in nature.

These transformations will, among other things, shift manufacturing away from low-cost labor markets toward high-skill, high-value ones, improving the competitiveness of American manufacturing. This could not come at a better time. Global demand for manufactured goods is about to undergo the greatest expansion in history. The world’s GDP is forecast to expand by nearly twice as much over the next 20 years as it did in the past 20. That means at least twice the growth in demand for everything from cars, aircraft, tractors, and chemicals to clothes and computers.

The underlying technological patterns are very similar in health care. Start with the fact that health-care productivity has been flat for 15 years, measured in terms of value added per labor-hour. The absence of progress in labor productivity is precisely why costs are rising as demand increases. Other than rationing, technological innovation is the only path to lower-cost, high-quality health care.
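To make that measurement concrete, the standard definition of the metric is:

\[
\text{labor productivity} \;=\; \frac{\text{real (inflation-adjusted) value added}}{\text{total labor hours worked}}.
\]

“Flat for 15 years” means the numerator and the denominator have grown at roughly the same rate: the system delivers more care only by adding proportionally more labor, which is why costs track demand so closely.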

Conventional IT solutions — the health-care equivalents of Airbnb, Uber, or Amazon — won’t be enough. Information systems can add valuable efficiency to the administration of health-care services and the management of records and insurance. But what is really required is foundational progress in the efficacy — i.e., the productivity — of diagnostics and therapeutics.

Just as with manufacturing, the health-care sector is now seeing the emergence of new kinds of materials, sensors, and machines against a backdrop of more powerful computing. This will lead to radical advances in efficiency as well as the democratization of diagnostic tools. Smartphones, for example, already have features that make them, implicitly if not yet explicitly, medical diagnostic devices. Imagine a world in which nearly every citizen possesses a useful and reliable diagnostic device.

Diagnosis starts, of course, with obtaining critical biological information. The rapidly emerging field of bioelectronics will enable body-compatible, implantable, and even digestible sensors. Once widely deployed, bioelectronics will rival in scale the traditional silicon electronics industry and offer data heretofore unavailable, much of it in real time. 


It will take time for the FDA to approve new technologies that are invasive. But in the meantime, we are likely to see growth in easier-to-deploy external biocompatible wearables, e.g., Band-Aids or polymer “tattoos” that act as sensors. Such products are already on the way to becoming a multi-billion-dollar industry with far-reaching potential for health-care productivity.

The now-infamous flame-out of the diagnostic company Theranos may have involved outright fraud, and the venture was certainly over-hyped. But the initial excitement was rooted in a reasonable belief that profoundly better diagnostic machines are possible. They are.

Though it has received far less media attention, a recent XPrize shows that the stuff of science fiction is now becoming a reality for diagnostics. In 2012 Qualcomm sponsored a $10 million XPrize to make a real “Star Trek” tricorder. For non-Trekkies, that’s the handheld device the starship’s doctor would wave over a patient to obtain an immediate diagnosis. The prize for “develop[ing] a mobile device able to diagnose 13 health conditions while continuously monitoring five vital signs” was awarded a year ago to a Pennsylvania startup.

Then there is the promise of genetic engineering, a domain that is inherently information-centric. It is no longer science fiction, even though it’s early days, to think about algorithms that could develop hyper-personalized drugs or simulate clinical trials. And, as with manufacturing, practical cobots are about to change the landscape, enabling not only hyper-precise and minimally invasive surgery but also profound improvements in rehabilitation and eldercare. The FDA recently approved one such cobot, a wearable exoskeleton, for more effective ambulatory rehabilitation.

We are on the cusp of technology-driven productivity gains in both health care and manufacturing that are unprecedented. The central lesson for policymakers? Don’t count on economists to see the future. Instead, focus on policies that ensure innovators are free to create it. 

This piece originally appeared on RealClearPolicy

______________________

Mark P. Mills is a senior fellow at the Manhattan Institute and a faculty fellow at Northwestern University’s McCormick School of Engineering. In 2016, he was named “Energy Writer of the Year” by the American Energy Society.
