In the spring of 2016, then-candidate Donald Trump promised Americans he would eliminate the entire $19 trillion national debt of the United States “over a period of eight years.” His pledge was ludicrous on its face. With the CBO already forecasting $6.7 trillion in new debt and Trump proposing a tax plan that would drain up to $8 trillion more from the U.S. Treasury over the same period, the Donald would have had to slash total federal spending by roughly 90 percent over two terms in office. By statute, it can’t be done (most federal spending is mandatory), and it would be cataclysmic if it could. It’s no wonder Trump’s own budget chief Mick Mulvaney called his boss’ debt elimination guarantee “hyperbole.”
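The back-of-the-envelope math behind that 90 percent figure runs roughly as follows (the $38 trillion in cumulative outlays over the eight-year window is my rough assumption, broadly in line with CBO’s 2016 baseline; the other figures appear above):

$$\underbrace{\$19\,\mathrm{T}}_{\text{existing debt}} + \underbrace{\$6.7\,\mathrm{T}}_{\text{projected deficits}} + \underbrace{\$8\,\mathrm{T}}_{\text{tax-cut revenue loss}} \approx \$33.7\,\mathrm{T\ in\ required\ savings}$$

$$\frac{\$33.7\,\mathrm{T}}{\approx\$38\,\mathrm{T}\ \text{in cumulative outlays}} \approx 0.9$$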
This week, the nonpartisan Congressional Budget Office (CBO) confirmed that reality does indeed have a liberal bias. After annual budget deficits shrank by two-thirds during President Obama’s tenure, CBO warned that the Trump administration will return to trillion-dollar deficits by 2020. In the latest CBO rejection of the GOP myth that “tax cuts pay for themselves,” Capitol Hill’s budget scorekeeper forecast that the Republicans’ “Tax Cuts and Jobs Act” will produce $2.3 trillion in new red ink over the next decade. All told, by 2028 Uncle Sam will pile on another $13.2 trillion in deficit spending, with the national debt as a share of the American economy hitting levels (96 percent) not seen since World War II.
In a statement accompanying “The 2018 Long-Term Budget Outlook,” released on June 26, the GOP’s hand-picked CBO Director Keith Hall explained that there is no mystery to the accelerating national debt. “Compared with last year’s projections, CBO’s current projections of debt as a share of GDP are higher through 2041.” Why?
The increase in debt through 2041 stems primarily from tax and spending legislation enacted since then that boosted projected deficits through 2025—especially the 2017 tax act.
But even that grim picture understates the problem:
If lawmakers changed current law to maintain certain policies now in place—preventing a significant increase in individual income taxes in 2026, for example—the result would be even larger increases in debt. The prospect of large and growing debt poses substantial risks for the nation and presents policymakers with significant challenges.
Significant challenges for policymakers, indeed. So, here’s a one-sentence cheat sheet to help simplify the problem for our elected officials in Washington: It’s the growing cost of health care, stupid! The solution can be stated pretty simply as well. At long last, the United States must join the vast majority of its economic competitors by having the government set the prices for drugs, tests, doctor’s visits, hospitalization, surgical procedures, and just about every aspect of health care.
As the chart above suggests, it is the exploding cost of health care (and the concomitant growth in interest payments on the national debt) that will unleash America’s future hemorrhage of red ink. Alongside raising tax revenue, dramatically slowing the rate of growth of medical spending is the single most important—and most difficult—policy objective the President and Congress must achieve. As with Social Security, our aging population only explains part of the problem.
Before digging into why, it’s worth pointing out that it’s long past time for Americans to pay for the government they’ve said they want. For the past 50 years, federal spending as a percentage of gross domestic product (GDP) has averaged around 21 percent. Tax revenues, in contrast, have averaged only 17 percent. That 4-point gap (roughly $800 billion a year on today’s $20 trillion economy) is the structural deficit in a nutshell. Since 1960, the only times Uncle Sam has produced balanced budgets or surpluses have been when revenues hit 20 percent of GDP. In recent years, polling has consistently shown that the only area where a majority of Americans wants to slash spending is foreign aid, which happens to be only about 1 percent of all federal spending. Going forward, the U.S. Treasury will remain comparatively starved of tax revenue even as spending mushrooms to 25 percent of GDP between 2028 and 2038, and to 27.9 percent in the decade after.
To be sure, the budget squeeze could be eased if the economy grew much faster, filling Uncle Sam’s coffers with extra revenue from the explosion of new business. That’s why Donald Trump has repeatedly promised GDP growth as high as 6 percent. But the U.S. hasn’t averaged 4 percent growth since the post-World War II period of 1950 to 1973. From 2002 to 2007, the growth rate dropped to 2.4 percent; in the aftermath of the Bush recession that began in late 2007, it plummeted to just 0.9 percent over the ensuing decade. That’s why CBO is forecasting meager growth of 1.9 percent going forward, even as Trump water carriers like Ohio Senator Rob Portman insist the economy will magically grow faster. (The numbers on the dismal growth of the labor force, by the way, make a powerful case for expanding rather than constricting immigration.)
This week’s CBO assessment sums up the predicament this way:
The federal government’s net interest costs are projected to climb sharply as interest rates rise from their currently low levels and as debt accumulates. Such spending would about equal spending for Social Security, currently the largest federal program, by the end of the projection period.
Noninterest spending is projected to rise from 19 percent of GDP in 2018 to 23 percent in 2048, mainly because of increases in spending for Social Security and the major health care programs (primarily Medicare). Much of the spending growth for Social Security and Medicare results from the aging of the population. Growth in spending for Medicare and the other major health care programs is also driven by rising health care costs per person. [Emphasis in original.]
As the chart at the top suggests, to keep the federal government’s future budget deficits to manageable levels, policymakers must focus on the health care spending that will nearly double as a share of the economy (from 5.2 to 9.2 percent) over the next three decades. Interest payments on the national debt are mandatory, and so must be paid. Discretionary spending (including defense outlays) is only about 30 percent of the budget. Nondefense discretionary spending—that is, things like education funding, transportation spending, R&D, and infrastructure investment—is already at its lowest level as a share of the U.S. economy since 1950. (Defense spending, which won big increases in the current fiscal year, shouldn’t be immune from the scalpel.) That leaves the big mandatory spending programs, like Social Security, Medicare, Medicaid, and Affordable Care Act subsidies.
But with the population over age 65 continuing to rise in both absolute and relative terms, now is no time to cut eligibility and benefits for America’s popular pension and health care programs for senior citizens. (House Republicans seek to do exactly that, to the tune of $1.5 trillion over the next decade.) “By 2048,” CBO explained, “22 percent of the population would be age 65 or older, compared with 16 percent today.” Any shortfall in the Social Security Trust Fund can be addressed by simply raising the salary cap on the payroll tax back to the traditional level covering 90 percent of national earnings, or by eliminating the cap altogether. The challenge, as CBO shows in the chart below, is “excess cost growth”: per-person health care costs that rise faster than the economy itself.
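To see how the cap works, a simplified illustration (the 2018 wage cap of $128,400 and the 6.2 percent employee-side rate are the actual figures; the $500,000 salary is hypothetical):

$$0.062 \times \$128{,}400 \approx \$7{,}961 \quad \text{(what both a \$128,400 earner and a \$500,000 earner pay today)}$$

$$0.062 \times \$500{,}000 = \$31{,}000 \quad \text{(what the \$500,000 earner would pay with no cap)}$$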
And just how do we crack that nut? As Michael Hiltzik correctly summed it up when Jeff Bezos, Warren Buffett, and Jamie Dimon announced their much-hyped health care initiative in January:
Reducing healthcare costs doesn't require Bezos/Buffett/Dimon magic: Every other country already knows how.
Yes, with all due respect to the Oracle of Omaha, there is already an answer to this problem, one which America’s economic competitors discovered years ago and still share today. Whether in the nationalized system of the U.K., the single-payer systems of Canada’s provinces, the mandated health savings accounts in Singapore, or the universal coverage regimes nevertheless dependent on private insurers in France, Germany, Switzerland, and Japan, the solution for cost control and price transparency is the same. Whether negotiated directly or through a national association of insurers, the government sets the prices for prescription drugs, tests, treatments, hospital stays, and pretty much everything else.
It is precisely this “all-payer rate setting” (and not single-payer) which unites the health care systems of the leading “developed” nations. Conversely, its absence is what makes American health care sadly exceptional. As Austin Frakt and Aaron E. Carroll recently put it in the New York Times (“Why the U.S. Spends So Much More Than Other Nations on Health Care”), “Studies point to a simple reason, the prices, not to the amount of care.”
The United States spends almost twice as much on health care, as a percentage of its economy, as other advanced industrialized countries — totaling $3.3 trillion, or 17.9 percent of gross domestic product in 2016.
But a few decades ago American health care spending was much closer to that of peer nations.
What happened?
A large part of the answer can be found in the title of a 2003 paper in Health Affairs by the Princeton University health economist Uwe Reinhardt: “It’s the prices, stupid.”
(For more on that theme, see Sarah Kliff here.)
As Frakt and Carroll go on to explain, subsequent studies published by the Journal of the American Medical Association (JAMA) and other sources similarly show that it is uniquely high U.S. health care prices, and not the amount of care Americans consume or have delivered to them, that sets our nation apart:
Though the JAMA study could not separate care intensity and price, other research blames prices more. For example, one study found that the spending growth for treating patients between 2003 and 2007 is almost entirely because of a growth in prices, with little contribution from growth in the quantity of treatment services provided. Another study found that U.S. hospital prices are 60 percent higher than those in Europe. Other studies also point to prices as a major factor in American health care spending growth.
During their year-long effort to repeal the Affordable Care Act in 2017, Republicans turned to free-market mythology to explain away the special pain meted out to American health care “consumers.” House Freedom Caucus chairman Jim Jordan of Ohio argued that Obamacare supporters needed to talk to Adam Smith’s invisible hand:
“The point is we don't have a free market. I think Americans have forgotten what a free market looks like in health care…I do know that every other industry, every other area where you have markets, when you can shop for price and shop for value, prices come down over time. That's what we'd like to see more of in health care.”
But when Jeff Scott of Vox (like Ali Velshi of MSNBC before him) asked Jordan “is there a state or a country with the kind of health care system you're talking about that we should be trying to emulate here?” the Ohio Republican could manage only, “I’ve not seen that.”
Jordan hasn’t seen a successful “free market health care system” because it doesn’t exist. As Dr. Paul Krugman diagnosed the problem with Republicans’ vision in July 2009:
There are a number of successful health-care systems, at least as measured by pretty good care much cheaper than here, and they are quite different from each other. There are, however, no examples of successful health care based on the principles of the free market, for one simple reason: in health care, the free market just doesn't work. And people who say that the market is the answer are flying in the face of both theory and overwhelming evidence.
The theory explaining the failure of free market health care isn’t rocket science. Let’s start with the conservative free-market nirvana, where buyer and seller, each armed with perfect information, come together in a voluntary transaction. But from the get-go, the patient-as-consumer faces a knowledge asymmetry that is almost impossible to overcome. Americans’ general deference to physicians isn’t just a cultural trait; it reflects the expertise and training doctors possess, and their patients do not, regarding diagnoses, possible treatments, and likely outcomes. For some cases and for some conditions, the layman can narrow that yawning information gap. But WebMD or no, it can’t be eliminated. “Health” is not a commodity. Those who believe that choosing a health care product or service is no different from buying a car, television, or cell phone might feel differently after, say, developing colon cancer.
But even if the diagnoses, treatments, and cures for heart disease, diabetes, or depression could be purchased in a free market, in the United States the buyer simply doesn’t—or can’t—know what price he or she will pay. As Steven Brill documented in March 2013 (“Bitter Pill: Why Medical Bills Are Killing Us”), hospital prices for drugs, supplies, and procedures are completely opaque. The answer from the so-called “chargemaster” about what anything costs depends on whether the patient is insured or uninsured (the latter often forced to pay several times more than the former) and on who the insurer is. As it turns out, that mystery pricing is one of the hallmarks of the American model that spends more than $3 trillion a year (over 17 percent of GDP) on health care, more than Japan, Germany, France, China, the U.K., Italy, Canada, Brazil, Spain, and Australia combined.
So whether we're discussing colonoscopies, hip replacements, asthma inhalers, or ER visits, the only certainty is that the cost to Americans will be higher—sometimes orders of magnitude higher—than those faced by the citizens in just about any other major national economy.
American exceptionalism in health care is, as Sarah Kliff summed it up, “that the federal government does not regulate the prices that health-care providers can charge.”
During the 2016 campaign, Bernie Sanders’ Medicare-for-all plan (a single-payer government insurance system with fixed payment rates for providers, one which also eliminates deductibles and co-pays) and Hillary Clinton’s “Obamacare Plus” proposals (which would keep the ACA framework in place while targeting high prescription prices and growing out-of-pocket costs with additional tax credits for consumers and tighter regulation of insurers and pharmaceutical firms) each addressed the question of “Who pays?” But as Matthew Yglesias pointed out, neither clearly answered the question of “How much?”
The thing about saving money by having a single health care payer squeeze providers on reimbursement rates is that adopting a single-payer structure is neither necessary nor sufficient to achieve the gains. In other words, if the American political system wanted to cut doctors' payments, we could do that without moving to a single-payer system. Conversely, adopting a single-payer system does not on its own lead to low reimbursement rates -- that's a separate decision that the political system would have to make.
The term for regulating the fees charged by doctors, hospitals, and others in a multi-payer setting is called all-payer rate setting, and it's a pretty good idea.
As Kliff rightly highlighted, “France, Germany, Japan, the Netherlands, and Switzerland all use some version of all-payer rate setting.” Even with hundreds or thousands of private insurance plans, since 1980 all five countries have experienced much slower growth in health care spending than the United States (see chart above). All-payer rate setting is a powerful reason why. Under such a system, to take an illustrative example, a hospital can no longer bill one insurer $1,200, another insurer $2,400, and an uninsured patient $4,000 for the same MRI; every payer pays the same negotiated rate. As Kliff explains:
In all-payer rate setting, all of the insurers negotiate jointly with all of the health care providers, and settle on one specific price for each procedure...Single-payer health care systems save money in two ways: reducing administrative costs and increasing the bargaining power of health insurers. This is true of all-payer rate setting systems, too.
And to be sure, that rate setting extends to prescription drug prices as well. As Kliff, Austin Frakt, and Zeke Emanuel all detailed during the EpiPen and Daraprim pricing scandals, the U.K., Spain, Italy, Germany, the Netherlands, and Australia all use some form of “reference pricing.” These and other nations limit the kind of stratospheric price increases experienced in the United States by evaluating both the cost-effectiveness and the efficacy of new drugs in setting prices. As Kliff lamented at the time:
“EpiPen’s 400 percent price hike tells us a lot about what’s wrong with American health care. Forget the $500 EpiPen. The era of the $1,000 pill is already upon us.”
In contrast, the Republican vision for health care is a return to the Hobbesian war of all against all in the American health care ecosystem. And that means more pain for consumers, insured and uninsured alike. Or as Vox founder Ezra Klein put it in March:
“American health care can be free market or cheap. It can’t be both.”
Trends already underway in the American health care market will only exacerbate these pressures. For starters, the U.S. population is aging, and over the next 25 years the growth of Medicare will be the largest problem area for the federal budget. Expensive drugs for Hepatitis C and other diseases, new cancer cures, and the development of individual genetic therapies are likely to present growing cost challenges in the future. “We’re paying too much for prescription drugs,” Ezekiel Emanuel warned last year, noting that cancer drugs like Yervoy, Opdivo, and Keytruda routinely cost more than $120,000 a year, while Kalydeco for cystic fibrosis reaches $300,000 annually. Cerezyme for Gaucher disease runs about $300,000 per year—for life. “Despite representing about 1 percent of prescriptions in 2014,” Emanuel notes, “these types of high-cost drugs accounted for some 32 percent of all spending on pharmaceuticals.” As the Washington Post reported on January 3, new genetic cures for rare diseases are already projected to cost almost $1 million per patient. (Donald Trump’s supposed plan to contain drug costs will do nothing of the kind.)
Making matters worse, mergers among insurers and hospitals have increased at the same time that both have been busily acquiring physician practices. All the while, the erosion of employer-provided health insurance, cost-shifting to workers, and the rise of the so-called "Gig Economy"—the very factors that helped fuel the drive for Obamacare in the first place—mean families will still face the growing burden of health care costs themselves.
In the future, the United States simply must spread that burden across the entire health care ecosystem. With health care spending already crowding out other social spending in the United States (see chart above), the other constituents of the American health care marketplace must share the pain.
Economics, after all, is the study of the allocation of scarce resources. The economics of health care is certainly no exception. Given the competing and often contradictory demands across its ecosystem of patients, employers, physicians, drug stores, pharmaceutical firms, device manufacturers, clinics, hospitals, insurers, and government, the economics of health care might more accurately be described as the allocation of pain. In the face of the infinite “wants” for healthy citizens, financially secure families, well-compensated practitioners, and strong profits for private companies of all stripes, societies must choose how and why to distribute discomfort and dissatisfaction to some or all of the constituents.
Microsoft, Google, Apple, Amazon, J.P. Morgan, and Berkshire Hathaway notwithstanding, new technology, platforms, and systems can play only a small part in reducing health care spending from its current 18 percent of U.S. GDP to, say, 13 or 14 percent. The inescapable answer from decades of experience around the world since the end of World War II is that, in one form or another, government must set the rates for care. Insurers, hospitals, doctors, pharmaceutical companies, and every other segment of the health care food chain will have to make less. (In exchange for the reduction in their lifetime incomes, American doctors, like many of their colleagues in Europe, should have their medical education paid for, with a pro-rated system of compensation for those who more recently took on that debt burden.) All-payer rate setting solves the twin problems of opaque and inflated prices. The structure of the insurance system—whether single-payer, private, or some public/private hybrid—is less critical. To put it another way, who pays is less important than how much. At the end of the day, these are political questions, not algorithmic ones. And the best part is that we already know the answers.
Now, there is no shortage of proposals to provide universal health care coverage in the United States. Medicare for All, MidLife Medicare, and Medicaid for All are just a few, while the “Medicare X” blueprint from Democratic Sens. Michael Bennet of Colorado and Tim Kaine of Virginia would let individuals and, eventually, businesses buy into a public health insurance option. My own preference would be to combine Medicare with elements of the German and Swiss systems and extend it to all Americans (with the possible exception of the military). Individuals and employers could obtain 80/20 “Medicare Standard” coverage for the same price through the government or a private insurer. (For patients and providers, this would create a de facto single-payer system, with all of the administrative savings that enables, even though it would be financed and distributed through multiple sources.) Private insurers could then sell for-profit “Medicare Plus” plans to defray the deductible and out-of-pocket costs of the Standard package, cover new drugs or genetic treatments, and offer private rooms and other non-standard options. (Note that House Speaker Paul Ryan has proposed something very similar, but only for those over age 65.)
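For what that 80/20 split would mean in practice, a hypothetical example (the dollar figures are illustrative, not drawn from any actual plan):

$$\$10{,}000\ \text{hospital bill} = \underbrace{\$8{,}000}_{\text{“Medicare Standard” pays 80\%}} + \underbrace{\$2{,}000}_{\text{patient share, defrayable by “Medicare Plus”}}$$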
Once that framework is in place, the U.S. could switch the financing of its health care system from primarily employer-based to taxpayer-funded. But none of it is affordable for the nation—until and unless government rate setting comes with it.