Louis Johnston is a professor at the College of Saint Benedict and Saint John’s University who specializes in macroeconomics. He is a Policy Fellow at Growth & Justice, where he advises on economic issues.
Take out your cell phone. Look at the keyboard. Did you know that it uses the same layout we have typed on since the 1870s? Why is that? Why do we continue to use such old-fashioned technologies?
Paul David, in “Clio and the Economics of QWERTY,” asked this question over thirty years ago. His answer: many technological standards, ranging from typewriter keyboards to railroad gauges to electrical current, did not come about because they were the optimal methods for typing letters, setting railroad tracks, or delivering electricity. Rather, they were the product of a process known as path dependence: a situation in which the setup costs of a particular technology lead to an early lock-in on a popular standard, even if superior alternatives become available later. The basic idea is that the costs of switching technologies are higher than the benefits gained from the new alternative.
I have another name for this phenomenon: history matters. Where you start determines, in part, where you will end up. And this idea applies to public policy just as much as to technologies. It is important to keep this in mind as we tackle tough issues such as health care.
How many systems?
Health care reform has been a perennial issue in the United States since World War II. Before Congress and President Obama enacted the Affordable Care Act (ACA), we had four health care systems:
1. YOYO: You’re on your own
2. Employer-provided insurance
3. Single payer
4. Government-financed and government-produced
You probably recognize YOYO and employer-provided insurance, but you might argue that we don’t have single payer or government-financed and -produced health care in America. Let’s take a closer look.
Medicare, Medicaid, and the Children’s Health Insurance Program are all single-payer systems. To access them, you need to meet age or income requirements, but once you qualify the government acts as a single payer to cover almost all costs.
The Veterans Health Administration runs much like Britain’s National Health Service: the federal government owns the hospitals and clinics, hires the health care professionals, and all who qualify can get care at these facilities. It’s government-financed and government-produced health care.
We have these four health care systems because each piece developed organically over time, and rather than sweeping away existing institutions, Americans added new ones. Our health care system looks the way it does because of path dependence; again, history matters.
Path dependence: Government-financed and -produced care
The Veterans Health Administration, for example, grew out of Abraham Lincoln’s Second Inaugural Address, when he declared:
With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation’s wounds, **to care for him who shall have borne the battle and for his widow and his orphan**, to do all which may achieve and cherish a just and lasting peace among ourselves and with all nations (bold added).
Theda Skocpol, in Protecting Soldiers and Mothers: The Political Origins of Social Policy in the United States, traces how soldiers’ homes, pensions for Civil War veterans and their families, and a variety of other social policies evolved from the 1870s through the 1910s in the wake of Lincoln’s admonition. Gradually, Americans accepted these arrangements as part of the policy landscape and began taking them for granted.
Path dependence: Employer-provided insurance
Employer-provided health care is the result of price controls during World War II. As Melissa Thomasson of Miami University documents, limits on wages meant that employers needed alternative methods for attracting much-needed workers. Prepaid hospital plans (such as Blue Cross) and physician plans (such as Blue Shield) had already developed during the 1930s and during the war enterprising employers offered, as a fringe benefit, to pay some (or in some cases all) of these premiums for workers and their families.
After World War II, labor unions and companies negotiated contracts that included employer-provided health insurance. However, it was not clear that health insurance and other fringe benefits were deductible from corporate income taxes; wages and salaries were deductible, and employers argued that fringe benefits were simply another form of compensation and should also be deductible.
So, in the late 1940s and 1950s a patchwork system began to develop. Some employers offered health insurance and paid part or all of the premiums without deducting these costs from their corporate income taxes. Other companies offered health insurance, deducted the costs on their taxes, and waited to see whether the IRS would allow the deduction. Many firms simply held off until the IRS ruled on the issue.
In 1954, the IRS ruled that employers could deduct the full cost of employer-provided health insurance when estimating their corporate income taxes. Thomasson writes,
The tax subsidy increased the amount of health insurance demanded, and extended access to health care. By fostering an increase in the demand for group insurance relative to individual coverage, it also ensured that health insurance in the United States would evolve as a group, employment-based system (p. 1374).
Thus, early health insurance plans, a demand for war workers, and an IRS ruling all helped to shape our dominant health care sector.
Filling in the gaps
Throughout the 1940s and 1950s, there were obvious gaps in the health care system. Older Americans often lost their health insurance when they retired; low-income Americans often worked for companies that did not offer health insurance; and unemployed Americans could not afford to purchase their own insurance.
President Truman proposed a solution in 1948: a single-payer health care system for all Americans. Democratic congressmen and senators proposed variations on this idea throughout the 1950s, focusing on senior citizens and the poor, but none passed.
In 1965, President Johnson succeeded in enacting Medicare and Medicaid to cover seniors and low-income Americans. These single-payer systems supplemented, but did not replace, what already existed.
The Affordable Care Act and lessons for the future
President Obama followed this well-worn path of working with existing systems rather than bulldozing the whole works when he proposed what became the Affordable Care Act. The ACA sweetened the incentives for companies to provide health insurance, expanded Medicaid, and created state-level insurance markets so that individuals (and small businesses) in the YOYO system could purchase health insurance. Additionally, the ACA mandated that everyone be part of one of these systems in order to help maintain its economic solvency. The ACA did not mandate which system a person must join, only that every American had to choose one of them.
Today’s health care system, including the ACA, is the result of a path-dependent process. No single entity planned or created it. However, people and institutions are now accustomed to these systems. They have made important decisions, such as changing jobs or moving to another state, based on these particular systems being in place.
Furthermore, many Americans now see certain health standards as basic rights, such as coverage regardless of pre-existing conditions, keeping their children on their plans until age 26, and a variety of other benefits enacted under the ACA. It’s therefore not surprising that those who oppose the ACA and want to eliminate it are having a tough time.
In other words, to understand our health care debates, and public policy more generally, we need to start from an important premise: history matters.