We Need To Build An Innovation Economy For The 21st Century

Image: pxhere

In a recent New York Times column, Bret Stephens argued that the political opposition needs to “celebrate prosperity, not deny or trivialize it.” He has a point. We are in the midst of a historic expansion, with record low unemployment, rising wages and low inflation. That is something to celebrate.

However, what Stephens failed to address is that, in the midst of all this prosperity, there is considerable anxiety about the economy. Some of this anxiety can be quantified in statistics like rising income inequality, national debt, credit card debt and the fact that 40% of Americans can’t cover a $400 emergency expense.

Yet the problems go deeper than that. As I explained in Barron's, our economy can be thought of as having two key activities: building capacity and maximizing that capacity to produce output. While we are clearly producing a lot of output, there are worrying signs that we are not building capacity the way we should. We need to right the ship before it's too late.

By 1941, it had become clear that the war raging in Europe would soon envelop the US. With this in mind, Vannevar Bush went to President Roosevelt with a visionary idea — to mobilize the nation’s newfound scientific prowess for the war effort. Roosevelt agreed and signed an executive order that would create the Office of Scientific Research and Development (OSRD).

With little time to go around the country hiring scientists and building labs, Bush conceived an innovative model for the OSRD: it would offer grants to private institutions and research universities, which cut down on bureaucracy and allowed projects to start right away. The OSRD proved to be enormously successful. With inventions that included the atomic bomb, the proximity fuze and radar, it greatly influenced the outcome of the war.

As the war was winding down, Roosevelt asked Bush to write a report about how America could continue the success of the OSRD in peacetime. That report, titled Science, the Endless Frontier, would set the stage for America's technological dominance during the second half of the century.

The crux of his argument was that such efforts build capacity by creating what he called "scientific capital." As he pointed out, "New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science."

Bush’s report led to the development of America’s scientific infrastructure, including agencies such as The National Science Foundation (NSF), National Institutes of Health (NIH) and DARPA, that powered our commercial dominance throughout the 20th century. Look inside an iPhone or any other major technology product today and you will find that most, if not all, of the crucial components started out in some government lab.

Bush’s vision is, if anything, even more important today than it was when he conceived it in 1945. Unlike in the aftermath of World War II, today we have stiff global competition, most notably from China. In order to power prosperity for the next century, we need new discoveries that private enterprise can turn into new products and new business models.

To take just one example, it has been estimated that the $3.8 billion invested in the Human Genome Project resulted in nearly $800 billion in economic impact and created over 300,000 jobs in just its first decade. Today, that initial investment is unlocking a true genomics revolution in which we’re shifting from just reading genomes to writing them.

What’s crucial to take into account is that it usually takes about 30 years to go from an initial discovery in the lab to a significant economic impact in the real world. That means that, although the scientific capital we build today does not show up in today’s economic statistics, it will have a significant impact on our ability to compete for generations to come, much as the wise investments made in the post-war years helped set the stage for future prosperity.

So it is worrying, to say the least, that a recent report by the Union of Concerned Scientists details how science is currently being undermined. Decisions about funding are being politicized, important studies are being suppressed and scientific experts are being sidelined in favor of political appointees.

While the public funding of basic science is what Bush's report is best known for, he also argued vigorously for building human capital. "If ability, and not the circumstance of family fortune, is made to determine who shall receive higher education in science, then we shall be assured of constantly improving quality at every level of scientific activity," he wrote.

This commitment to education helped make America the most educated country in the world. More recently, however, we've fallen to fifth among OECD countries for post-secondary education. That's not because fewer Americans are going to college; in fact, more people go to college today than in 2000. It's just that the rest of the world is moving faster than we are.

A big factor in our decline has been tuition, which has risen from an average of $15,160 in 1988 to $34,740 in 2018. Not surprisingly, student debt is exploding; it has nearly tripled in the last decade. In fact, student debt has become so onerous that it now takes about 20 years to pay off four years of college, and even more to pursue a graduate degree.

Yet it’s not just tuition and loans that are undermining our human capital. A recent study found that about half of all college students struggle with food insecurity and a group of community college administrators told me that food insecurity was the number one reason why their students drop out.

How can we “be assured of constantly improving quality at every level of scientific activity,” if our most ambitious young people are impoverished, hungry and then subjected to decades of what is essentially indentured servitude?

The 20th century was, to a large extent, a battle between state socialism and free market capitalism. Socialism lost and it’s not hard to see why. When you take away the market mechanism, you lose important signals that drive productive activity. I spent 15 years of my adult life in post-Soviet space and I can attest to how poorly that turned out.

However, while socialism lost the Cold War, it has clearly won the internet. Everywhere you look, open source communities are powering modern society, from basic technologies like Apache and MySQL to highly specialized ones like Spark and Cloud Foundry. Tesla, quite famously, has open sourced its patents for electric cars. It's not a stretch to say that proprietary technology these days is really the stuff we build on top of open source.

The reason open source technologies have been so successful is that while the profit motive is an important signal, it is not the only signal. Markets have externalities, both positive and negative, that need to be addressed. In the case of open source, it is the need for common technology standards upon which to build and grow businesses that makes the logic compelling.

That's an important point that Stephens' column fails to address. Just as the technology market can't be reduced to simple metrics for profit and loss, the well-being of an entire society can't be summed up with a few economic statistics. While high output and low unemployment are clearly good things, they are a Pyrrhic victory at best if we mortgage our future to attain them.

Bestselling Author of Cascades and Mapping Innovation, @HBR Contributor, - Learn more at www.GregSatell.com — note: I use Amazon Affiliate links for books.
