April 25, 2018 | Public Filings | by Mark P. Mills

Testimony before the U.S. Joint Economic Committee on Innovation

Mark Mills testifies before the Joint Economic Committee of the U.S. Congress in a hearing entitled “How the Innovation Economy Leads to Growth.”


______________________

"Thank you Chairman Paulsen and members of the Committee for the opportunity to testify on this important issue.

Over my career, I have had the good fortune of working in each of the four corners of the innovation economy’s ecosystem. Early in my career I was a practicing innovator, earning several patents as a semiconductor engineer and later as a scientist in optical communications. I was introduced to the interstices of innovation policy as a young staffer in President Reagan’s White House Science Office. And today I am engaged in the other two aspects: in finance with a tech venture fund, and as an analyst.

I mention these four parts of the innovation economy to note that there is, or at least there used to be, a common thread running through all of them: the conviction that far more innovation lies in our future. That conviction is no longer shared by some analysts and academics. This divergence has important implications, because what we believe about the future directly shapes the planning and policy decisions being made today.

In dispute is not whether more innovation in general is coming, but whether the innovation on the horizon is significant enough to reanimate the kind of economic growth we have experienced in the past. If one accepts the proposition that innovation now yields merely incremental advances over current practices and products, or that it is dominated mainly by such things as better apps and entertainment, then one logically reaches the pernicious conclusion that we are in a mature economy that must accept a so-called “new normal” of far slower economic growth.

Policymaking under the new normalist paradigm logically becomes a kind of de facto palliative care for an ostensibly mature late-stage society.

The new normalists don’t propose that technology will stop causing disruptions similar to those we’re witnessing now around, for example, social media and the Internet’s impact on politics and culture. They suggest instead that the disruptive features of technology are a kind of froth on top of a new paradigm of permanently slower overall economic growth. As evidence, the new normalists point out that over the past decade or so, GDP growth has been anemic and, more important, that a critical underlying driver of the economy, U.S. productivity growth, has been low and stagnant for nearly 15 years.

The closest economists get to having a law of physics is in the truism that increasing productivity is the primary force driving economic growth. An enormous body of scholarship has been devoted to studying productivity: Providing a coherent theory around productivity, technology and growth earned Robert Solow a 1987 Nobel Prize. Absent foundational innovations, there is no prospect for a return to higher productivity growth. And without that, America does face a dismal economic future.

The problem, however, with the thesis that America is facing a new age of secular stagnation is that its adherents misread the implications of the recent record of slow productivity growth. Set aside important co-factors that can suppress innovation (especially unfriendly tax and regulatory policies). The primary reason for recent lagging productivity growth is that we have been living through an interregnum between great technological cycles. Radical changes in technology don’t emerge in convenient, continual steps; they burst forth episodically.

History offers many examples of the episodic character of innovation at the scales that move economies. The underlying technological driving forces always seem obvious in hindsight, but are rarely anticipated in advance by economists and forecasters.

To illustrate history’s episodic pattern of foundational innovation, consider a recent example in the energy domain. Then we can look for a similar underlying pattern in two other domains where revolutions currently seem absent: manufacturing and healthcare. Together, these latter two sectors constitute about 30% of the American economy.

The essence of the policymaker’s dilemma, when it comes to making plans that depend on assumptions about productivity, is that those assumptions are necessarily based on forecasts about technology. I will resist the temptation to dwell on the fact that, when it comes to forecasting, the track record of most pundits, and especially of economists, is dismal at best.

The quasi-profession of forecasting may be a dubious science, but it is a serious business nonetheless. A particularly relevant aphorism about forecasting originated with physics Nobelist Dennis Gabor who wrote in 1963: “The future cannot be predicted, but can be invented.”

But back to energy: as analysts and policymakers around the world now know, we have recently experienced radical technological progress in energy.

Last year marked the 40th anniversary of legislation establishing the Department of Energy. Its core mission was to find technologies to replace oil and natural gas, and to reduce the use of both those fuels. For decades the accepted wisdom was that there was no prospect for technologies that could affordably produce hydrocarbons at the scale society would need in the future. In other words, in economic terminology: hydrocarbon technology productivity had stalled out.

We now know that dismal forecast was wrong. A new technological approach, unlocking hydrocarbons in America’s vast shale fields, turned out to be astonishingly productive. Those on the front lines of that revolution were rarely, if ever, visible in the public and policy discourse. All eyes were on forecasts of the technology alternatives favored by the DOE and others. Meanwhile, over the past decade alone, U.S. shale technology has delivered the fastest and biggest addition to world energy supplies in history, anywhere, from any energy source. Shale oil and gas added 2,000% more to U.S. energy supply over the past decade than have solar and wind combined.

But the energy technology forecasts of yesteryear led to a cumulative $500 billion in government spending over four decades in pursuit of technologies to replace hydrocarbons. Biofuels production did grow, rising from 0.1% of America’s energy supply in 1977 to about 2% now. Similarly, combined energy production from solar and wind rose from near zero in 1977 to about 3% of today’s total U.S. energy supply. Meanwhile, oil and gas meet nearly 70% of U.S. energy demand.

I note that the few forecasters who anticipated what would actually happen were, at the time, generally ignored or viewed as engaged in “old think” or as being in the “pockets” of entrenched industries. (As some on this Committee know, my written record shows that I was among that minority.)

In getting technology forecasts wrong circa 1977, the economists and energy pundits of the day were in good company. Back in the 1970s, economists were also puzzled by an overall productivity collapse similar to the one we have recently experienced. Many forecasters back then were deeply worried about economic stagnation, and even the dreaded “stagflation” of inflationary pressures occurring alongside stagnant growth. It is instructive to note, however, that the 1976 economic report to Congress by the Council of Economic Advisers, chaired by Alan Greenspan, did not contain the word “computer.” Missing the computer revolution in economic forecasts at that time was understandable, but it was no small error.

The lessons one should derive from the history of energy technology are two-fold. The first is that noisy public debate and aspirational forecasts can hide the real underlying trends.  The second is that what appears to be an end to innovation is often a pause between eras as engineers and industries perfect and begin to adopt new foundational technologies.

The question now is: what predictive technological “signals” are we missing today, hidden in the media “noise” about the demise of manufacturing and the inevitability of cripplingly higher healthcare costs? The energy sector is important, but manufacturing and healthcare are, respectively, 1.5-fold and 2-fold bigger parts of the economy.

With regard to manufacturing, the current narrative is that productivity gains are nearly maxed out and that more automation will merely add efficiency that displaces more workers in a declining domain. With regard to healthcare, a different manifestation of technological pessimism is inherent in the forecast that consumer demand for healthcare will grow far faster than the efficacy or, again in economic terms, the productivity of healthcare services.

In both cases, today’s pessimists are again mistaking an interregnum between technological eras for evidence of stagnation in foundational innovation.

Start with manufacturing. The idea that a modern nation’s share of GDP and employment in manufacturing must necessarily decline is negated by the examples of Germany and Japan, which have not experienced the sharp declines seen in America. The evidence points to the decline in U.S. manufacturing over the past decade coming in large measure from the dual insult of high taxes and a huge expansion of the regulatory state.

At the same time, it has been fashionable to blame automation for the decline in manufacturing employment. But here it’s important to note that the data show manufacturers’ overall spending on information technology has actually been flat, or has even decreased, over the past decade. IT spending as a share of revenue in manufacturing is only one-fourth that seen in the information-centric sectors: media, banking, education and insurance. The real challenge for manufacturing is that it remains under-invested in IT and has yet to sufficiently adopt the new productivity-driving technologies.

But sensors, computers and communications have finally improved enough to meet the far more demanding metrics of the industrial world, as compared to the information-centric domains -- social media, news, entertainment, finance, etc. -- where info-tech has made its greatest gains so far. Excitement is finally starting to build in some corners of Silicon Valley about bringing information tools into the manufacturing sector to make everything “smarter” and more efficient.  That will happen, but arguably even more important are the contemporaneous ‘hidden’ revolutions in new kinds of manufacturing machines, and radically new kinds of materials.

A materials revolution is emerging, akin to the dawn of the age of chemistry a century ago. The use of high-performance computing combined with the so-called “materials genome” is ushering in an era of computationally designed materials. Not only will such things as new classes of ultra-high-strength, lightweight materials emerge, but so will entirely new materials that enable biocompatible (even consumable) sensors and computers, as well as the commercialization of so-called metamaterials. The latter exhibit properties that don’t exist naturally and unlock the ability to create entirely novel kinds of products.

Along with the materials and industrial-information revolutions, we are seeing the maturation of radically new kinds of manufacturing machines. For example, the commercialization of 3D printers will enable a kind of manufacturing that could best be termed “mass customization” rather than just mass production.  3D printers also allow the fabrication of components and devices impossible with conventional machines.

At the same time, we are also seeing the emergence of industrial robots that can finally take on truly complex or highly variable tasks. Until now, robots have been deployed primarily in a few industrial sectors, predominantly automotive, where the tasks are relatively simple and repetitive. Other industrial sectors will soon gain robot-driven productivity benefits as robots capable of more complex and varied tasks, especially the so-called “cobots” that work safely and intuitively alongside humans, begin to emerge.

These technological trends will accelerate the shift of manufacturing away from low-cost labor toward high-skilled labor and high-value markets. Improving American manufacturing competitiveness could not come at a better time. The conventional wisdom that automation will deliver economic growth but reduce industrial employment is offset by the magnitude of the demand for manufactured goods that is about to emerge.

The fact is that global demand for manufactured goods is on the cusp of the greatest expansion in history. The world’s GDP is forecast to expand by nearly twice as much over the next 20 years as it did over the past 20. This means at least twice the growth in demand for everything from cars and aircraft, to tractors and chemicals, to clothes and computers. Rising productivity means, by definition, greater competitiveness; and the countries that make these leaps will enjoy precisely the same benefits that productivity gains have yielded throughout history: more economic growth and more jobs.

Turning now to healthcare, the underlying technological patterns are similar to those in manufacturing.

Start with the fact that healthcare productivity, again measured in economists’ terms, i.e., value added per labor-hour, has been flat for 15 years. The absence of progress in labor productivity is precisely why costs are rising as demand increases. Other than rationing, technology innovation is the only path to lower-cost and higher-quality healthcare.

Information systems can add valuable efficiency to the administration of healthcare services and the management of records and insurance. But what is really required is foundational progress in the efficacy, i.e., the productivity, of diagnostics and therapeutics.

In healthcare we are also at an interregnum since the key enabling technologies are relatively new and take time to mature and be fully absorbed within the ecosystem. Qualification takes time when it comes to hardware and humans. And, as with manufacturing, healthcare domains are just now seeing the practical emergence of new kinds of materials and new kinds of machines against a backdrop of profoundly more powerful computing.

It is well known that accurate and quick diagnosis is one of the critical aspects of healthcare. Here we see the prospect both for radical advances in efficacy and for the democratization of diagnostic tools, arising from new materials, new communications and high-performance computing. Diagnosis starts, of course, with obtaining critical data.

We are about to see explosive growth in access to biological information because of the new and rapidly evolving field of bioelectronics. We are well along the path to commercial bioelectronics that are body-compatible, implantable and even digestible. Once widely deployed, bioelectronics will rival the traditional silicon electronics industry in scale and offer a tsunami of heretofore unavailable data. The FDA has already approved a number of the key components.

Rather than inserting instruments, or measuring various biological states indirectly or episodically, wireless bioelectronics can monitor conditions directly and continuously. For post-operative monitoring, for example, these new materials allow embedded infection-monitoring sensors that eventually dissolve, just as stitches do, enabling a kind of monitoring heretofore impossible and thereby reducing both the risk of later complications and the direct and indirect costs of care.

While it takes time for the FDA to approve intrusive technologies of any kind, consider in the meantime the easier-to-deploy sub-class of external, bio-compatible wearables (e.g., band-aid-like sensors), which are already on the way to becoming a multi-billion-dollar industry with far-reaching potential for healthcare ‘productivity.’ Apple, for example, is well aware that features inherent in, or that can be added to, an iPhone could earn it implicit, if not explicit, classification as a medical device. There are profound implications to the prospect of nearly every citizen possessing a useful diagnostic device.

But coming faster are advances in professional diagnostic tools, both those in the laboratory and those on the front lines of healthcare. These new devices are made possible by precisely the same suite of sensors, CPUs, communications, and materials technologies that are spreading throughout industrial ecosystems. The recent XPrize award for a portable diagnostic tool provides a dramatic example of the emergence of diagnostic devices that were recently only the stuff of science fiction.

In 2012, Qualcomm, a company better known in IT than in medical circles, partnered with the XPrize Foundation to offer $5 million to a team able to emulate the “Star Trek” tricorder. For those who are not science-fiction cognoscenti, the starship’s doctor would wave a handheld tricorder over a patient to obtain an immediate diagnosis. That notional prize, to develop “a mobile device able to diagnose 13 health conditions while continuously monitoring five vital signs,” was awarded a year ago to a Pennsylvania startup. That XPrize and the proliferation of smartphone health apps and tools are emblematic of a deep secular shift emerging in medical diagnostic technologies.

Then there is the promise of genetic engineering. This domain, too, is fundamentally information-centric, using rapidly advancing classes of gene-mapping machines and high-performance computing that are becoming ever less expensive. It is no longer science fiction, even though these are early days, to imagine algorithms that develop new drugs or simulate preliminary field trials, even hyper-personalized clinical trials.

Similarly, a new discipline is emerging around the potential to create a “digital twin” for an individual, emulating a trend that started in industrial domains, where a digital twin is a computational replica of a machine or process. As it becomes easier and cheaper to obtain real-time information about an individual’s health and biological condition, that information can feed a computer model of that individual to assess, and even diagnose, health conditions in real time. While that possibility is still in the future, the diagnostic and information tools that will ultimately lead there are already becoming practical.

And all of this says nothing about other revolutionary healthcare technologies ‘hidden’ in the technical literature today. Of particular interest is the emerging class of practical robots, in particular the cobots I mentioned earlier, that work collaboratively with people. Surgical cobots such as the da Vinci system have been around for a number of years, but many more are coming, unlocking far more potential for hyper-precise and minimally invasive surgery. Cobots will be particularly helpful in eldercare and rehabilitation. The FDA recently approved, for example, a cobot in the form of a wearable exoskeleton for more effective ambulatory rehabilitation.

We are, in short, on the cusp of technology-driven “productivity” gains in healthcare that are unprecedented in history. These gains will come from tools and techniques that we know are improving rapidly and whose costs are declining. They epitomize precisely what productivity means: more output at lower cost.

Using technology to reduce the need for, or to amplify the power of, human labor has been a central pursuit of humanity for all of recorded history. Productivity is central to economic progress. As economic historian Joel Mokyr has pointed out, technological innovation gives society the closest thing there is to a “free lunch.” From the dawn of the industrial revolution, it has enabled near-magical increases in the availability of food, fuel and many products.

Today we stand at the beginning of epoch-changing shifts in technologies relating to both manufacturing and healthcare. As history shows, such advances have never been predicted by economists. Instead they’ve been invented and propelled by innovators. We should look for evidence of the next great cycle of foundational innovation in the ‘hidden’ domains where innovators work, not where pundits and the media prognosticate.

______________________

Mark P. Mills is a senior fellow at the Manhattan Institute and a faculty fellow at Northwestern University’s McCormick School of Engineering. In 2016, he was named “Energy Writer of the Year” by the American Energy Society.
