Health care industry Summary

  • Last updated on November 10, 2022

Health care has become an important business, and contributions for workers’ medical insurance are a major expense for many firms.

Medicines and medical practitioners have been part of human society since ancient times. Benjamin Rush, a signer of the Declaration of Independence, was a prominent Philadelphia physician. He persuaded many people that all diseases had a common cause and could be treated by draining blood from the patient–a bizarre notion that was discredited by the 1840’s. Many years would pass before the medical profession clearly did more good than harm.

The first hospital in the American colonies, Pennsylvania Hospital, opened in 1751. Numerous pharmacies in the colonies marketed a wide variety of purported remedies, some of which contained such dangerous components as mercury, alcohol, or opium. By 1800, the United States had four medical schools, associated with the University of Pennsylvania, Columbia, Harvard, and Dartmouth. The number of medical schools expanded from five in 1810 to fifty-two by 1850. Some physicians attended formal medical colleges, such as the prestigious Jefferson Medical College in Philadelphia, established in 1824. However, many medical practitioners learned their profession through apprenticeships or simply began practicing without any training. Massachusetts delegated licensing to its state medical society, established in 1781, and leaving the matter to state medical societies became commonplace elsewhere. Nurses and midwives were largely self-appointed. By 1850, there were about 41,000 physicians in the United States–about 176 for every 100,000 people, a very high proportion by historical standards. Becoming a doctor was relatively easy, and the pay could be very good.

Scientific Advances

Medical science did advance, and knowledge of advances spread rapidly. Edward Jenner developed a successful vaccine for smallpox around 1800 in England. Massachusetts was quick to promote vaccination against smallpox, and New Hampshire required it from 1835. Smallpox was virtually wiped out as a result. Quinine was successfully produced in 1822 and became the standard treatment for malaria. Beginning in the early nineteenth century, gases such as ether and nitrous oxide were used as anesthetics; however, they were seldom used during the numerous amputations resulting from battle wounds in the U.S. Civil War. The American Medical Association (AMA) was formed in 1847, after half a century of growth of state and local medical societies; some states and cities had such groups much earlier. Medical journals spread information on treatments.

The discovery of germs (bacteria) and their role in infection and disease by Louis Pasteur in France during the middle of the nineteenth century profoundly improved medical science. Deaths from major infectious diseases such as tuberculosis, diphtheria, and measles accounted for about half of all deaths in the United States before 1880. From that point on, the death rate from infectious diseases began a rapid decline. Techniques of cleansing and sterilization, along with anesthetics, revolutionized surgery during the late nineteenth century.

Public Health

Health improvements in the nineteenth century resulted not so much from the improvement in medical treatment as from a general improvement in nutrition and living standards, and the spread of public health measures. In colonial times, numerous government units created boards of health, concerned with sanitary conditions and contagious diseases. The obvious filth and stench developing in urban slums focused public attention on the need to upgrade water supplies and waste-disposal systems, partly for aesthetic reasons. Very few cities had sanitary sewers before 1880; most constructed them between then and 1910. A filtered water supply was virtually unknown in 1880, but such supplies reached more than 10 million people by 1910, and some areas had introduced chlorination. Major cities established boards of health. Water was inspected for bacteria. Pasteurization of milk became widespread. Following the example of Providence, Rhode Island, in 1880, public health laboratories became widespread by 1914. School districts instituted physical examinations and enforced compulsory vaccinations. The public schools increasingly spread information from the rising field of home economics, stressing the value of cleanliness, diet, and exercise. These public health measures led to the creation of numerous companies devoted to testing for contaminants and producing the equipment needed to improve sanitation.

The Early Twentieth Century

The muckraking literature around the beginning of the twentieth century heavily criticized the unhealthy conditions in urban slums (Jacob Riis) and unsanitary conditions in food preparation (Upton Sinclair). A groundbreaking study of American medical schools by Abraham Flexner (1910) found wide variation in quality. At the top, Johns Hopkins University had developed the first really modern medical school (1893). At the bottom, Flexner recommended that several schools be closed–and they were. State governments authorized their medical associations to approve medical schools and to examine and license physicians. These reforms greatly upgraded the quality of the medical profession; however, they also made medical education much longer and more expensive. There had been 162 medical schools in 1906; ten years later there were only 95, and the number fell further to 80 in 1923. Between 1900 and 1906, more than 5,000 students per year graduated from medical schools. After 1913, the number dropped below 4,000 and did not return to that level until 1927. In 1900-1906, there had been about 157 physicians for every 100,000 people, but this ratio dropped below 130 after 1923 and did not return to the previous level until the 1960’s. The shortfall in physician numbers was somewhat offset by the more rapid expansion in the number of professional nurses. Nurses numbered about 50,000 in 1910; this number doubled by 1920 and doubled again by 1930, reaching about 214,000.

Upton Sinclair’s novel The Jungle (1906), which exposed unsanitary conditions in the meatpacking industry, was a factor leading to passage of a law creating the federal Food and Drug Administration (FDA) in 1906. The same law required that products be accurately labeled and forbade certain dangerous ingredients. The importance of the FDA increased steadily in the following years.

For most of the first half of the twentieth century, American medical practice fell into a simple pattern. Most doctors were family doctors, operating as individual practitioners out of a small office, often in the doctor’s home, and seeing patients both in their office and at the patient’s home. Diagnostic instruments were simple–a stethoscope, a thermometer, perhaps a blood-pressure cuff, sometimes X-ray equipment.

Medical costs were not high. A visit to the doctor might cost $5. In 1929, Americans spent about $3 billion for medical care. Half of this went for physicians or dentists. About $400 million went to hospitals. The number of hospitals rose rapidly in the first quarter of the twentieth century, then leveled off for a long time at roughly 6,000.

In 1929, another $600 million went for medicines and other purchased medical items. The drugstore was a familiar Main Street establishment–there were about 58,000 of them during the 1920’s, often with a soda fountain, a prescription department, and many over-the-counter (OTC) medicines. Notable among these was aspirin, a proven pain reliever with near-miracle properties yet to be discovered. Miles Laboratories was a major supplier of OTC products, including Alka-Seltzer, whose comforting fizz promised relief for headaches or indigestion. Chain drugstores, such as Rexall, became widespread during the 1920’s.

World War II

The 1940’s represented a turning point in the American medical system. The discovery of penicillin and other antibiotics revolutionized the treatment of infections. New treatments greatly improved survival rates among soldiers wounded in military conflict. The pharmaceutical industry stepped up its efforts in research and development.

Within the federal government, the National Institutes of Health (NIH), which had been operating on a modest scale since the 1930’s, experienced a rapid rise in its budget. NIH research expenditures rose from $33 million in 1952 to $274 million in 1960 and $893 million in 1969. The federal government created a cabinet-level Department of Health, Education, and Welfare (HEW) in 1953.

The war also set off major changes in the financing of medical expenses. Employers discovered they could bypass wage controls and high income tax rates by paying medical insurance costs for their workers. In 1948, insurance paid about 6 percent of personal health care costs. The insurance share rose rapidly, reaching 27 percent in 1960. As a result, the ultimate consumers became less sensitive to costs. Prices of medical goods and services began to rise more rapidly than other prices. Between 1950 and 1970, consumer prices in general increased by 61 percent, but medical costs rose 125 percent.

Medicare and Medicaid

The federal government’s role in the medical world changed dramatically in 1965 with the creation of Medicare and Medicaid. Medicare was a system of medical-expense insurance for people aged sixty-five and older. People became eligible either by paying Social Security tax (to which a Medicare premium was added) or by paying premiums directly. The adoption of Medicare had no appreciable effect on the health indicators of the elderly but greatly improved their financial condition. Medicaid covered medical expenses of eligible low-income persons of any age. About half of the people below the poverty line qualified for Medicaid.

The new federal programs encouraged the spread of health maintenance organizations (HMOs). These offered basic medical services to members for a fixed annual premium. In many cases, the HMO would pay its participating physicians a flat amount for each client enrolled. The Health Maintenance Organization Act of 1973 helped expand the scope of HMOs, viewed as an effective method of controlling costs through “managed care.”

Several health-related federal agencies were created: the Occupational Safety and Health Administration (OSHA, 1970), the Environmental Protection Agency (EPA, 1970), and the Consumer Product Safety Commission (CPSC, 1972). A symbol of the growing federal role was the creation in 1980 of a new Department of Health and Human Services, spun off from HEW.

With these new programs, the share of personal medical care expenditures in gross domestic product (GDP) moved steadily upward, from about 3.4 percent in 1960 to 6.6 percent in 1980 and 10 percent at the end of the millennium. Rising demand brought a steady increase in the number of medical schools and their graduates. In 1956, 82 medical schools produced about 7,000 graduates. By 1970, 107 medical schools produced almost 9,000 graduates. However, the supply did not keep up with the demand. As a result, foreign-trained physicians immigrated to the United States, and some cost-conscious Americans went to other countries for treatment.

The continued rapid rise in medical costs drove up insurance premiums. Many employers stopped offering health insurance or shifted more costs to employees. The plight of the medically uninsured became a significant political issue. During Bill Clinton’s first term as president, his wife, Hillary, tried unsuccessfully to put together a program to expand medical insurance provided by the federal government. In 1997, Congress did create the State Children’s Health Insurance Program, which substantially enlarged insurance coverage for children. A complex prescription drug benefit was added to Medicare effective in 2006.

The New Millennium

By 2000, the United States had developed a very large and diverse health care system. State and federal governments provided public health facilities such as a safe water supply, waste disposal, and inspection of goods, services, housing, and workplaces. Total health service employment was 9.3 million in 1990, increasing to 12.7 million in 2000 and 14.9 million in 2006. The number of physicians increased from 615,000 in 1990 to 814,000 in 2000 and 902,000 in 2005. By 2005, one-fourth of all physicians had attended foreign medical schools. There were 420 HMOs, enrolling about 69 million people. Personal health care expenditures were about $1.7 trillion, of which 85 percent was covered by third-party (chiefly insurance) sources. Fifteen percent of the population was not covered by medical insurance. Medicare covered 42 million people and Medicaid 38 million. Government programs of all kinds accounted for $747 billion of personal health care expenditures, representing about 44 percent of the total.

Major indicators of health showed steady improvement. Life expectancy at birth, which was about forty-seven years in 1900, rose to seventy-four years in 1980 and seventy-seven years in 2003. These figures are strongly influenced by lifestyle factors such as smoking, automobile accidents, and violence. A better indicator of medical effectiveness is the number of years a sixty-year-old person is expected to live, which rose from fifteen years in 1900 to twenty years in 1980 and twenty-two years in 2003. Infant mortality, which was a shocking 100 per thousand in 1915, dropped to 13 in 1980 and 7 in 2003.

Further Reading
  • Coddington, Dean C., Elizabeth A. Fischer, Keith D. Moore, and Richard L. Clark. Beyond Managed Care: How Consumers and Technology Are Changing the Future of Health Care. San Francisco: Jossey-Bass, 2000. Analysis of managed health care focuses primarily on implications for changes in health care policy but also provides a clearly written historical overview of HMOs.
  • Feldstein, Paul J. Health Care Economics. 5th ed. Albany, N.Y.: Delmar Publishers, 1999. This respected textbook has gone through several editions since 1973; the updates give good coverage of developments since that date.
  • Henderson, James W. Health Economics and Policy. 2d ed. Mason, Ohio: South-Western Publishing, 2002. This college-level text provides good coverage of the policy changes and their economic effects in the health care industry.
  • Meeker, Edward. “Medicine and Public Health.” In Encyclopedia of American Economic History, edited by Glenn Porter. New York: Charles Scribner’s Sons, 1980. Especially good on the evolution and importance of public health programs.
  • Rejda, George. Social Insurance and Economic Security. 6th ed. Upper Saddle River, N.J.: Prentice-Hall, 1999. This text for college undergraduates covers health problems and policies in chapters 7-8.
  • Shafer, Henry Burnell. The American Medical Profession, 1783-1850. New York: Columbia University Press, 1936. Although it is an older work, it is an excellent scholarly study of the late eighteenth and early nineteenth centuries.
  • Stevens, Rosemary E., Charles E. Rosenberg, and Lawton R. Burns, eds. History and Health Policy in the United States: Putting the Past Back In. New Brunswick, N.J.: Rutgers University Press, 2006. Collection of scholarly essays on the history of American health care policy includes substantial information on the origins of managed care and the role of Nixon in the development of the HMO industry.

See Also

  • Chemical industries
  • Child product safety laws
  • Food and Drug Administration
  • HealthSouth scandal
  • Insurance industry
  • Henry J. Kaiser
  • Medicare and Medicaid
  • Muckraking journalism
  • Occupational Safety and Health Act
  • United Food and Commercial Workers
