History of Medicine
May 2008

The U.S. Health Care Non-System, 1908-2008

George B. Moseley III, JD, MBA
Virtual Mentor. 2008;10(5):324-331. doi: 10.1001/virtualmentor.2008.10.5.mhst1-0805.

 

One hundred years ago, in 1908, health care was virtually unregulated and health insurance, nonexistent. Physicians practiced and treated patients in the patients' homes. The few hospitals that existed provided minimal therapeutic care. Both physicians and hospitals were unregulated. When patients saw a physician, they paid the modest fees out-of-pocket; they were more concerned about the wages they would lose if illness kept them out of work than about the cost of their medical care.

Medical science and technology were primitive, and there was little that physicians could do to treat most illnesses. It had been only 40-50 years since the first understanding of bacteriology, antisepsis, and immunology; 21 years since the invention of a blood pressure measurement device; and 13 years since the discovery of X-ray technology. It would not be until 1910 that the first drug treatment to destroy disease—and not the patient—would emerge or that surgery would become common for conditions like tumors, infected tonsils, and appendicitis.

Commercial insurance companies did not write health insurance policies in 1908; they saw no way to avoid the risks of adverse selection (those who were sick would seek coverage, and those who were healthy would not) and moral hazard (coverage would encourage the insured to seek unnecessary services), and they lacked the means to calculate risks accurately and set appropriate premiums. Within the next 10 years, many European nations would adopt some form of compulsory national health insurance, but similar proposals in the U.S. were rejected because of lack of interest and resistance from physicians and commercial insurers [1].

Yet it was in the early 1900s that regulation and organization of health professions began to take hold. Membership in the American Medical Association (AMA) increased from 8,000 in 1900 to 70,000 in 1910 [2]. In 1904, the AMA formed the Council on Medical Education to establish physician licensure standards. The 1910 Flexner Report on medical education recommended stricter entrance requirements, better facilities, higher fees, and tougher standards for medical students [3]. By 1920, the cultural influence of the medical profession was growing as physicians' incomes and prestige increased.

During the 1920s, the cost of medical care rose due to growing demand and higher quality standards for physicians and hospitals. Families had more money to spend but less room in their homes to care for sick family members. Advances in medical technology, tougher licensing criteria, and the growing acceptance of medicine as a science led to the emergence of hospitals as credible centers for treatment. They were now modern scientific institutions that valued antiseptics and cleanliness and used medications for the relief of pain. When the American College of Surgeons was founded in 1913, it was the first body to accredit hospitals [4]. Of the 692 hospitals examined in 1918, only 13 percent received accreditation. By 1932, the percentage had grown to 93 percent of the 1,600 hospitals surveyed [5]. In 1929, the average American family had medical expenses of about $103—roughly 5 percent of the average annual income of $1,916. Typically 14 percent of these expenses were for hospital care [6].

In 1929, a group of Dallas school teachers contracted with Baylor University Hospital to receive up to 21 days of inpatient care a year for regular monthly payments of 50 cents [7]. Similar prepaid service plans, many involving more than one hospital, were formed during the Depression years. While they gave consumers an affordable way to pay for inpatient care, their primary purpose was to assure hospitals a steady income stream during a period of declining revenues. By 1937, there were 26 such plans with more than 600,000 members total. These combined under the auspices of the American Hospital Association (AHA) to form the Blue Cross network of plans, the first of which had been established in 1932 in Sacramento. The creation of these plans was facilitated by state legislation that allowed them to organize as nonprofit corporations, enjoy tax-exempt status, and avoid the onerous insurance regulations (particularly financial reserve requirements) that applied to commercial insurers.

In the 1930s, physicians became concerned about proposals for compulsory national health insurance and the threat of insurance competition from Blue Cross [8]. Specifically, doctors worried that third-party payers would lower their incomes by restricting their ability to set their own fees. In response, physicians established a network of their own insurance plans covering physician services. These plans, known as Blue Shield, preempted the hospital-oriented Blue Cross plans from entering into the primary care sector. Meanwhile, in 1935, the Social Security Act was passed without a health insurance component.

The success of the Blue Cross and Blue Shield plans showed commercial insurers that adverse selection could be overcome by focusing on insuring groups of young, healthy, employed workers. The commercial plans also benefited from a legal advantage: as nonprofit entities, the Blues had to "community rate" their policyholders, while the for-profit commercial plans (strictly regulated insurance companies) were free to engage in experience rating [9]. As a result, the market for health insurance of all kinds increased dramatically during the 1940s, from a total enrollment of 20,662,000 in 1940 to 142,334,000 in 1950.

Another spur to health insurance sales came during World War II, when wage and price controls prevented employers from using higher salaries to attract workers. They were, however, allowed to offer fringe benefits like health insurance for up to 5 percent of a worker's wages [10]. In addition, the National Labor Relations Board ruled that health insurance benefits were a legitimate subject of labor-management negotiations. Lastly, the IRS determined that employers could deduct the cost of employee health benefits from taxable business income, and employees did not have to include the value of those benefits in calculating their taxable income. The role of employers as the primary source of health insurance coverage was now firmly entrenched [11].

A New Way to Pay for Health Care

The Blue Cross and Blue Shield plans used a reimbursement methodology called "cost plus." In this payment scheme, physicians were compensated according to "reasonable and customary charges" that they themselves set, and hospitals were reimbursed on the basis of their actual costs plus a percentage of their working and equity capital. This allowed doctors to charge whatever they wanted and encouraged hospitals to increase costs so their cost-based income would be greater. This methodology was replicated by commercial insurers and the subsequent government health insurance programs, Medicare and Medicaid.
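The incentive can be seen in a small worked sketch. The rates and dollar figures below are invented purely for illustration (the article does not give the actual formula parameters), but the structure mirrors the cost-plus idea: payment rises automatically with reported costs.

# A minimal, hypothetical sketch of cost-plus hospital reimbursement.
# All rates and dollar amounts are invented for illustration only.

def cost_plus_payment(reported_costs, capital_base, capital_rate=0.02):
    """Pay the hospital its reported costs plus a percentage of its
    working and equity capital (the 2 percent rate is an assumption)."""
    return reported_costs + capital_rate * capital_base

# The more a hospital reports spending, the more it is paid, so there is
# no financial reward for holding costs down.
frugal = cost_plus_payment(reported_costs=1_000_000, capital_base=500_000)
lavish = cost_plus_payment(reported_costs=1_500_000, capital_base=500_000)

print(f"Frugal hospital is paid ${frugal:,.0f}")  # $1,010,000
print(f"Lavish hospital is paid ${lavish:,.0f}")  # $1,510,000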

As hospitals became the center of medical care delivery, it became apparent that many communities lacked adequate access to them. The Hill-Burton Act was passed in 1946 to provide loans and grants for the construction of new hospitals and improvements in the physical plants of existing ones [12].

Over the years many legislative proposals for different approaches to health insurance were introduced and failed. In 1944 President Roosevelt asked Congress for an "Economic Bill of Rights" that included a right to adequate medical care, but this request was never fulfilled. President Truman proposed a national health insurance program that would have created a system covering all Americans, but it was denounced by the AMA and called a "communist plot" by members of Congress [13]. By 1950, national health care expenditures equaled 4.5 percent of the GNP (gross national product) and were continuing to rise [14].

During the 1950s, the price of hospital care doubled, and medical breakthroughs were coming at a fast pace. Medications became available to treat infections and conditions like glaucoma and arthritis, and new vaccines were developed to prevent childhood diseases like polio. The first successful organ transplant was performed in 1954.

Entering the 1960s, the health care system was fiscally unrestrained. There were no external controls on the cost of medical therapies delivered or the resources consumed. There were, by then, more than 700 companies selling health insurance, yet people outside the workforce, such as the elderly, had difficulty affording it. Realizing that proposals for total reform of the system were going nowhere, advocates turned to a more incremental approach. In 1965, Congress created the Medicare and Medicaid programs to provide health care coverage to the elderly and the poor [15]. Overnight, the federal government became the largest single purchaser of health care services, but the two public programs adopted the same flawed reimbursement methods found in the private health insurance industry, accelerating the rate of health care price inflation.

During this same period, there was concern about a doctor shortage and the need for additional manpower in other health professions. One result was the enactment of the Health Professions Educational Assistance Act of 1963, which provided direct financial assistance to medical, dental, nursing, pharmacy, and other health professional schools and their students [16].

The Advent of HMOs and Other Payment Plans

In 1929, the Ross-Loos Medical Group had established a prepaid health plan that provided medical services to Los Angeles city and county employees for $1.50 a month [17]. In retrospect, this is considered to be the first HMO (health maintenance organization). In 1945, the Kaiser Foundation Health Plan was founded to provide prepaid health benefits to workers in Kaiser shipyards; it has come to be viewed as a model for HMOs. Yet from 1945 until the 1970s, these plans, which combined the financing and delivery functions of health care, were idiosyncratic players in the health care market.

In 1970, Paul Ellwood coined the phrase "health maintenance organization" to emphasize the clinical prevention role of plans like Kaiser's [18]. At a time of soaring health care costs, it was noticed that HMOs were able to reduce resource utilization rates, particularly hospital admissions and lengths of stay. The Health Maintenance Organization Act of 1973 was passed to encourage HMO growth in the marketplace [19]. This law provided grants and loans to start or expand HMOs, removed state restrictions on federally certified HMOs, and required employers with 25 or more employees to offer this type of plan as a benefit option in addition to indemnity (or fee-for-service) plans. In the 1970s, there were 26 such plans with about 3 million subscribers nationwide; by 1991 the numbers had grown to 556 plans with 35 million enrollees.

In 1983, Medicare instituted a prospective payment system (PPS) for reimbursing hospitals [20]. It paid hospitals for services on the basis of 475 diagnosis-related groups (DRGs) of illnesses. Like most price control systems, the PPS caused hospitals to shift the patient cost burden to activities not covered by the controls. In 1992, the system for calculating reimbursements to physicians for services covered by Medicare was switched to one based on the cost of resources consumed in delivering a particular clinical service.

During the late 1980s and early 1990s, health spending increased at an even more rapid pace. This has been attributed to expensive new medical technologies (estimated to account for an average of one-third of annual cost increases) and the curtailing of the ambitious HMO-promoting programs of the 1970s. Another attempt at national health care reform was made in 1993 through the failed Clinton "managed competition" proposal.

Traditional HMO and fiscal management practices, such as gatekeeping, capitation reimbursement, utilization review, clinical practice guidelines, and selective physician contracting [21], lumped together under the term "managed care," strengthened the power of the health care organizations that used them. Under these constraints, the growth in health care spending slowed noticeably in the mid-1990s, but the constraints also provoked resistance from patients and physicians, who saw treatment decisions being taken out of their hands and their clinical judgment second-guessed.

All payers, private and public, gradually backed away from some of their more severe managed care policies (like capitation and physician choice limits) but have not replaced them with anything more effective in controlling costs. Not surprisingly, health care cost inflation picked up again in the late 1990s.

Controlling Costs in the 21st Century

The current strategy for addressing the spending problems within the U.S. health care system is to introduce changes that will make it function more like a traditional "perfect market." This is based on the assumption that health care should be treated as a private consumable product rather than a public good. These changes are wrapped up in the "consumer-driven health care" movement. All consumers, including those under employer-based health plans, will assume greater responsibility for making decisions about many aspects of their health care: how much of their own money to spend on it, the type of insurance protection to buy, which providers (physicians and hospitals) to use, and what specific clinical procedures to receive. This initiative should be combined with greater transparency about the cost, quality, and other features of health care providers and products, much of it gathered through comprehensive electronic medical record and information systems.

There are no active proposals at the federal level for resolving the lack of access to health care experienced by 45 million uninsured Americans, 15 percent of the population. Ambitious efforts at universal coverage have been launched by a few individual states, namely Massachusetts, Maine, Hawaii, and California. Time will tell how well their approaches succeed. Encouragingly, physician attitudes toward national health insurance have evolved to the point that, in April 2008, 59 percent of them supported legislation to create such a program [22]. Certainly the next U.S. president and Congress will be under pressure to give greater attention to many aspects of the health care delivery and financing systems.

For the moment, the U.S. continues to spend 50 percent more on health care, as measured by its share of the GDP (gross domestic product), than any other developed country. In 2006, health care spending accounted for over 16 percent of the U.S. GDP [23]. At the same time, life expectancies are lower and infant mortality rates higher in the U.S. than in most of those other developed countries. The success of various approaches to systemic health care reform thus remains to be established.

References

  1. American Medical Association. Minutes of the House of Delegates. JAMA. 1920;74:1317-1328.

  2. Garceau O. The Political Life of the American Medical Association. Cambridge, MA: Harvard University Press; 1941:132.

  3. Flexner A. Medical Education in the United States and Canada: A Report to the Carnegie Foundation for the Advancement of Teaching. New York, NY: Carnegie Foundation for the Advancement of Teaching; 1910.

  4. Affeldt JE. Voluntary accreditation. Proc Acad Polit Sci. 1980;33(4):182-191.

  5. Shryock RH. The Development of Modern Medicine. Madison, WI: University of Wisconsin Press; 1979:348.

  6. Falk IS, Rorem CR, Ring MD. The Cost of Medical Care. Chicago, IL: University of Chicago Press; 1933:89.

  7. Beazley S. Eight Decades of Health Care. Chicago, IL: Hospital and Health Networks; 2007. Accessed April 10, 2008.

  8. Leland RG. Prepayment plans for hospital care. JAMA. 1933;100:113-117.

  9. Under community rating, an insurer charges the same premium to all policyholders in a particular group, without regard to any demographic characteristics or indicators of health status. Under experience rating, the insurer takes into account each individual policyholder's health status, prior experience with utilization of health care resources, or any other factors that might indicate their likelihood of requiring medical care and using their health insurance coverage.

  10. Klein J. For All These Rights: Business, Labor, and the Shaping of America's Public-Private Welfare State. Princeton, NJ: Princeton University Press; 2003:204-257.

  11. Scofea LA. The development and growth of employer-provided health insurance. Mon Labor Rev. 1994;117(3):3-10.

  12. Brinker PA, Burley W. The Hill-Burton Act: 1948-1954. Rev Econ Stat. 1962;44(2):208-212.

  13. Quadagno J. Why the United States has no national health insurance: stakeholder mobilization against the welfare state. J Health Soc Behav. 2004;45 Suppl:25-44.

  14. Kristein MM, Arnold CB, Wynder FL. Health economics and preventive care. Science. 1977;195(4277):457-462.

  15. Centers for Medicare and Medicaid Services. Key Milestones in CMS Programs. http://www.cms.hhs.gov/History/Downloads/CMSProgramKeyMilestones.pdf. Accessed April 10, 2008.

  16. Stevens R. American Medicine and the Public Interest. Los Angeles, CA: University of California Press; 1998.

  17. Field MJ, Gray BH, eds. Controlling Costs and Changing Patient Care? The Role of Utilization Management. Washington, DC: National Academies Press; 1989.

  18. Ellwood PM Jr, Anderson NN, Billings JE, Carlson RJ, Hoagberg EJ, McClure W. Health maintenance strategy. Med Care. 1971;9(3):291-298.

  19. Uylhara EE, Thomas MA. Health Maintenance Organization and the HMO Act of 1973. Document number P-5554. Washington, DC: RAND Corporation; 1975.

  20. Office of the Inspector General. Medicare Hospital Prospective Payment System, How DRG Rates Are Calculated and Updated. San Francisco, CA: Office of Inspector General, Office of Evaluation and Inspections, Region IX; 2001. http://oig.hhs.gov/oei/reports/oei-09-00-00200.pdf. Accessed April 10, 2008.

  21. Gatekeeping is a requirement that some health plans impose on their members. When first entering a plan facility with a medical problem, the member sees a "gatekeeper," typically a primary care physician or nurse practitioner, who assesses the problem and determines what additional services are called for. The gatekeeper may provide some care herself, make referrals to specialists, coordinate among numerous caregivers treating the member, and oversee the total program of care. The purpose of the gatekeeper is to ensure that the member is treated in the most expeditious manner possible without the excessive utilization of resources.

  Utilization review encompasses a variety of mechanisms designed to assure that the resources consumed in treating patients, primarily by physicians, are medically necessary. Under this scheme, physicians are required to get a health plan's permission before admitting a patient to a hospital, and to obtain further permission to keep the patient in the hospital beyond a predetermined length of stay. In some cases, if the plan later concludes that a treatment or service was not warranted, it will deny reimbursement.

  Capitation reimbursement was introduced in response to physician resentment at nonphysician micromanagement of clinical decision making. Physicians received advance payments of a fixed monthly amount per member under their ongoing care, whether a particular member was actually provided services in a given month or not. As long as the physician was responsible for a large enough pool of members, the total capitation payments (properly calculated) usually sufficed to cover the costs of treating the very few seriously ill and the modest number of more moderately ill among them. Proponents of capitation argue that it is an incentive for physicians to treat patients as efficiently as possible, using their own clinical judgment. Critics claim that it can result in the undertreatment of patients.

  22. Survey: More doctors now support national health insurance [press release]. Indianapolis, IN: Indiana University School of Medicine; April 1, 2008.

  23. Catlin A, Cowan C, Hartman M, Heffler S; the National Health Expenditure Accounts Team. National health spending in 2006: a year of change for prescription drugs. Health Aff. 2008;27(1):14-29. http://content.healthaffairs.org/cgi/content/full/27/1/14. Accessed April 10, 2008.


The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.