Crossposted from the Breakthrough Institute
The good news: an elite consensus is crystallizing around the need for massive economic stimulus funded by deficit spending. Hundreds of economists are calling for stimulus on the scale of 2-3 percent of GDP -- or $300-500 billion per year, equivalent to the expected decline in U.S. consumption as a result of the housing market collapse -- to confront the recession head-on.
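As a back-of-the-envelope check on those figures -- assuming 2008 U.S. GDP of roughly $14 trillion, a ballpark number not given in the sources above -- the arithmetic works out to:

\[
0.02 \times \$14\ \text{trillion} = \$280\ \text{billion}, \qquad 0.03 \times \$14\ \text{trillion} = \$420\ \text{billion}
\]

per year, which is broadly consistent with the $300-500 billion range.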
The bad news: this growing consensus may only support short-term stimulus measures -- such as aid to state and local governments, extended unemployment benefits, and rebate checks -- without any long-term economic strategy. Infrastructure spending is gaining support, but mostly for projects that have already been planned and scheduled. Given the increasingly dim prospects for long-term U.S. competitiveness, it's critical that we think smart and act quickly to secure our economic future.
As Harvard Business School guru Michael Porter put it in last week's BusinessWeek cover story:
The stark truth is that the U.S. has no long-term economic strategy--no coherent set of policies to ensure competitiveness over the long haul. Strategy embodies clear priorities, based on understanding the strengths we need to preserve and the weaknesses that threaten our prosperity the most. Strategy addresses what to do, but also what not to do. In dealing with a crisis, experience teaches us that steps to address the immediate problem must support a long-term strategy. Yet it is far from clear that we are taking the steps most important to America's long-term economic prosperity.
So what are some guidelines for a long-term growth strategy? A good place to start is with a basic recognition of how economic growth occurs: growth is driven primarily by increases in productivity. Why? Because greater productivity allows us to produce more output with the same -- or fewer -- resources. It's that simple.
Productivity growth reflects multiple factors -- technology, management, human capital -- but the biggest factor is technology. Indeed, the dominant role of technological development in economic growth has been well-documented ever since Nobel Laureate Robert Solow published his seminal 1957 paper, "Technical Change and the Aggregate Production Function," which demonstrated that technological progress drove at least 80% of the growth in U.S. output per worker-hour between 1909 and 1949. As another recent BusinessWeek cover story concluded, "Historically, technological change has been the biggest force for productivity growth in the US." It continued:
The latest figures show that "multifactor productivity"--a category that includes technological change and other improvements in business processes--accounted for 45% of productivity gains between 1987 and 2007. "Ninety-five percent of economists agree that innovation is the most important thing for long-run growth," says [Daron] Acemoglu of MIT.
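To make the accounting behind these findings explicit: in the standard growth-accounting framework Solow introduced (a textbook formulation, not drawn from the articles quoted above), output growth decomposes as

\[
g_Y = \alpha\, g_K + (1 - \alpha)\, g_L + g_A
\]

where \(g_Y\), \(g_K\), and \(g_L\) are the growth rates of output, capital, and labor, \(\alpha\) is capital's share of national income, and \(g_A\) is the residual -- the "multifactor productivity" of the figures above -- capturing technological change and other improvements not explained by simply using more capital and labor. Solow's striking finding was that this residual, not capital accumulation, accounted for the great majority of the growth in output per worker-hour.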
Indeed, contrary to conventional wisdom, the overwhelming majority of economists attribute the economic boom of the 1990s to productivity gains from information technology -- not to a balanced budget. The Bureau of Labor Statistics has shown that productivity growth between 1995 and 2004 was more than twice the average of the previous two decades. And as Robert Samuelson put it in last week's Newsweek cover story:
Arithmetically, economic growth reflects the increases in workers' hours and their productivity--a.k.a. efficiency. From 1960 to 2005, annual U.S. economic growth averaged 3.4 percent, split almost evenly between labor-force growth (1.5 percent) and productivity gains (1.9 percent).
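Samuelson's decomposition can be checked directly:

\[
1.5\% \;(\text{labor force}) + 1.9\% \;(\text{productivity}) = 3.4\%, \qquad \frac{1.9}{3.4} \approx 0.56
\]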
In other words, at least 55% of U.S. economic growth over the last half-century was due to productivity gains. And we can expect productivity to matter even more in the years ahead, because U.S. labor force growth will slow as our population ages. Not only are labor force growth and quality declining, but as the baby boomers retire, the elderly will require ever-larger federal spending on Social Security, Medicare, and Medicaid.
The implication is this: the single most important factor for long-term U.S. economic vitality is our productivity growth. The questions we should be asking, then, are: What are the best policy strategies to promote productivity? And how can productivity best be promoted in areas of national strategic importance, such as energy, infrastructure, health care, education, and national security?
Some answers are obvious, but it's only in recent years that these questions have begun to be explored in depth. The role of productivity and innovation has been recognized for years -- economists like Paul Romer of Stanford pioneered the field with New Growth Theory (a.k.a. endogenous growth theory) -- but it has not been central to national economic policy debates. This story is told well by the Information Technology & Innovation Foundation:
Fortunately within the last decade a new theory and narrative of economic growth grounded in innovation has emerged. Known by a range of terms -- "new institutional economics," "new growth economics," "evolutionary economics," "neo-Schumpeterian economics," or just plain "innovation economics" -- collectively, this new economics reformulates the traditional economic growth model so that knowledge, technology, entrepreneurship, and innovation are now positioned at the center, rather than seen as forces that operate independently.
But up to now, innovation economics and innovation policy have not been fully appreciated by policymakers, in large part because the dominant economic policy models advocated by most economic advisors and implicitly held by most policymakers largely ignore innovation and technology-led growth in favor of macroeconomic issues -- tax cuts for individuals, budget surpluses, or social spending -- which at the end of the day pale in significance to innovation in driving economic growth.
The dominant economic models have been macroeconomic ones that account for neither the role of innovation and productivity in economic growth nor the government's role in supporting them. This is despite the fact that public investment in microchips and the internet -- and the education policies, like the National Defense Education Act, that developed the human capital behind them -- was the greatest contributing factor to the productivity revolution of the global information age. We documented this history in "Fast, Clean, and Cheap," published in the Harvard Law & Policy Review:
Large public investments in technology innovation and infrastructure are not new. Most of America's largest industries have benefited from strategic public investment in their development: agriculture, aerospace, transport, biotechnology, and energy. Farmland was granted to early American frontier farmers, and agriculture has been publicly subsidized since the early twentieth century. Before the Civil War, Abraham Lincoln was best known for his aggressive advocacy of publicly funded transit projects intended to modernize industry: canals, roads, and later, famously, railroads. The U.S. government created computer science, aerospace, and the modern highway system through investments that were designed to compete with the Soviets and were justified by national security concerns. And today's highly mature energy markets are the result of decades of subsidies for coal mining and oil drilling.
Many of these public investments in technology and infrastructure served not only to drive productivity and economic growth, but also to increase our national security. In fact, a majority of these investments were justified on national security grounds. One of the greatest advantages of public investment, as opposed to other forms of public policy, is that spillover benefits are almost always guaranteed. In the case of public investment in R&D, most studies have found rates of return somewhere in the range of 30 to 100 percent. When it comes to promoting growth in specific strategic sectors, investment is sometimes the only viable public policy tool -- after all, it would have been impossible to invent the internet if the government had tried to tax or regulate typewriters.
With economic models that don't account for this history, however, it's little wonder that deficit hawks have had so much influence over economic policy; that everyone from neoclassical economists to climate policy advocates has put far too little emphasis on technology and innovation policy; that carbon pricing and regulation have attracted more support than a New Apollo Project in clean energy; that legislation like the America COMPETES Act, despite Congressional authorization, cannot even gain enough support for appropriations; and that the total federal budget for energy R&D is less than the R&D budget of a single pharmaceutical company.
Fortunately, these debates are shifting. Major deficit spending for economic stimulus is almost a given at this point, and deficit hawks seem to be in the minority; President-elect Obama has said that, beyond stemming the economic crisis, an Apollo-like investment in clean energy will be his top priority. And Al Gore, the most prominent climate spokesperson, has shifted from his longstanding focus on regulating carbon pollution to direct public investment in clean energy as the best way to deal with climate change.
Now is the moment to complete this paradigm shift. As we enter a new economic and political era, we face an extraordinary opportunity to advance long-term investments in our economic future and to build a new economic governance model that drives American growth, competitiveness, and leadership in the 21st century. But it won't be easy -- antipathy to deficit spending still abounds, market fundamentalists continue to oppose any deficits beyond direct stimulus, and many of President-elect Obama's advisors will encourage him to be cautious.
Over the next several weeks and months, Breakthrough will be taking a deeper dive into these questions and working to shift the national economic paradigm toward the investments we need to secure America's future. Stay tuned.