How America Got So Sick

In the winter of 168 C.E., the famed Greek physician Galen arrived in Aquileia, an Italian city on the northern edge of the Adriatic. The city had grown large since its founding as a Roman colony, but during the 200-year Pax Romana, its fortifications had been allowed to deteriorate. After an armed group of migrating Germanic peoples had crossed the Danube a year earlier, the Roman co-emperors, Marcus Aurelius and Lucius Verus, had rushed to the city, raising two legions and rebuilding its defenses; they planned to use it as a base of operations against the invaders.

Galen had been summoned, however, to help fight a different kind of invader. A plague, likely an early variant of smallpox, had traveled to Aquileia with the troops, and held the city in its grip. The emperors fled, but Verus succumbed to the disease on the road to Rome. Galen tried to slow the wave of illness, but most of the people in Aquileia perished.

They represented just a sliver of the eventual victims of the Antonine Plague, also known as Galen’s Plague, which killed at least 1 million people throughout the Roman empire. It was possibly the world’s first true pandemic, and haunted the empire for the rest of the Pax Romana, which ended in 180 with Aurelius’s death. The details of the pandemic—the exact pathogen, the true number of victims—are subjects of debate, and might never be fully settled. But some research has cited the Antonine Plague as part of a vicious cycle that hastened Rome’s long fall. Food shortages, internal migrations, and overcrowding had already signaled a slippage in imperial power, and created a fertile environment for disease. The pandemic, in turn, spread panic and left behind mistrust, weakening faith in civic and religious authorities.

Men famously think about Rome every day, and political commentators have been nervously comparing Rome’s fall to a potential American collapse since before America even had a Constitution. But Rome’s example really does merit consideration in light of recent events. One of the better measures of a society’s vitality is its ability to protect its citizens from disease, and the two tend to move in tandem; a decline in one often brings a decline in the other.

Infectious disease is probably not an imminent threat to the United States’ survival. Still, after nearly a century of existence, the American public-health apparatus, which has driven some of the most remarkable advances in global longevity and quality of life in human history, is teetering. The country has lost much of its ability to keep microbes from invading its body politic, and progress in life expectancy and other metrics is slowing or even reversing.

It is tempting to lay all of these changes at the feet of President Trump and his current health secretary, Robert F. Kennedy Jr., who together have shredded America’s global-health organizations, pulled back public-health funding, fomented vaccine skepticism, and begun to dismantle child-vaccination programs. But the “Make America Healthy Again” moment is in some ways just another step in the long retreat of the civic trust and communitarian spirit that have enabled America’s disease-fighting efforts. If this retreat continues, the public-health era—the century-long period of unprecedented epidemiological safety that has been the foundation for so many other breakthroughs—will come to an end. And that end will have dire consequences for this republic and its future.

In January 2025, a hospital in West Texas began reporting that children were coming in sick with measles. The cases were initially clustered in a Mennonite community, where vaccination rates had been low in recent decades. But soon the outbreak spread around the state, and to others; the reported number of cases reached more than 1,800 by the year’s end. As of this writing, the outbreak is still ongoing, and America is in danger of having its measles-elimination status revoked by the World Health Organization.

On August 8, as the measles outbreak continued to make headlines, a man named Patrick Joseph White entered a CVS in northeast Atlanta and fired hundreds of rounds from a rifle into the CDC’s headquarters across the street. According to Georgia investigators, White had been suicidal, and believed that COVID‑19 vaccines were part of a conspiracy to sicken him and other Americans.

These were but two signs among many that something has broken within the systems that protect the population’s health. Despite all of our advantages, the coronavirus pandemic caused more confirmed deaths per capita in the United States than in any other Western country, and our mortality rate’s recovery has lagged behind others’. Life expectancy in the U.S. is lower than in other high-income nations, and the gulf is widening.

America is unique, and comparisons are difficult. The country easily outpaces the rest of the developed world in gun deaths and overdoses, both major mortality drivers here that have largely been accepted as the cost of being American. But even if you discount those peculiarities, plenty of other indicators are pointing the wrong way. Foodborne illnesses appear to be on the rise, including regular surges of norovirus. Deteriorating water-delivery and sewage systems have contributed to a growing number of outbreaks of Legionnaires’ disease. Cases of tetanus, whooping cough, and hepatitis A have also risen in recent years.

Many problems contribute to these shifts—insufficient investments in infrastructure, budget cuts in state and local health departments, the growing drug resistance of bacteria. Yet underlying all of the outbreaks, and even gun and opioid deaths, is a common theme: a declining sense of mutual responsibility among Americans. If the population could be analogized to a single human body, then its immune system would rely on a concert of action and purpose among its cells. When that concert stops, the body dies.

In 1946, the year the U.S. Public Health Service founded its Communicable Disease Center, American life expectancy at birth was about 66 years. Malaria was rampant in the South, and fever diseases, tuberculosis, syphilis, and polio killed tens of thousands of Americans annually. Thirty-four out of every 1,000 children born in 1946 were expected to die before their first birthday, many from communicable diseases. America was moving toward modernity, but the risks people faced were of a different order than they are today.

The CDC (since renamed the Centers for Disease Control and Prevention) inherited much of its early mandate from a U.S. military campaign to control infectious diseases among soldiers fighting in World War II. The scale of the war effort had necessitated the creation of a health infrastructure on American soil—spraying for mosquitoes near the front lines in the Pacific wouldn’t mean anything if soldiers caught malaria at home before deployment. Responses to outbreaks near bases needed to be big and fast enough to account for car travel beyond military jurisdictions. When the CDC took over, it extended this paradigm—of coordination across long distances and disparate communities—to the civilian population.

The same year the CDC was created, the influenza vaccine reached the public, and international organizations, supported by the U.S., began a global push to eliminate tuberculosis. The agency worked to promote mass vaccination. It began a national disease-surveillance program, and shared intelligence with cash-strapped county health departments and state agencies. Wartime campaigns to coax and chide Americans into doing their part to conserve resources and volunteer for the war effort translated easily into pushes for vaccination and sanitation.

Before 1946, conquering disease would have seemed as much a subject of science fiction as putting a man on the moon. But since 1950, global life expectancy has risen by roughly four years each decade. Smallpox has been eradicated, and polio and malaria cases have dramatically fallen. The past 80 years have perhaps seen more significant advances in human health than the previous 300,000 did.

On the home front, several generations have grown up on an American mainland without malaria, yellow fever, or typhoid fever; diseases like dysentery are medical rarities. Measles and polio, once routine scourges of childhood, were pushed back by millions of vaccinations. Life expectancy increased by more than a decade, to 78 in 2023. This was a public-health revolution, on equal footing with any of the great agricultural, industrial, or information revolutions that have punctuated the past few centuries.

Those other great revolutions are often considered to be the result of technological advances—the plow, steam power, fertilizers, the internet. And certainly, the development of vaccines, antibiotics, and other medicines has played a tremendous role in the advance of human health. But vaccines for smallpox and some other diseases had been around for at least half a century before the 1940s, and had failed to create widespread immunity. The real public-health revolution was first and foremost a change in the way people thought about themselves and their relationship to one another.

Epidemiology made a new kind of thinking necessary. Pathogens respect neither individuals nor borders. Vaccinations and other preventatives against ever-evolving germs do not on their own guarantee personal safety—only eradication can do that. And eradication, it came to be understood, can be achieved only through local and global cooperation.

In America, where capitalist and individualist ethics have always predominated, public health nonetheless managed to carve out a large cooperative space. Before the 1940s, the United States was still reporting a relatively high number of smallpox cases compared with other similarly industrialized nations; it achieved total elimination in 1949. At the insistence of a growing public-health apparatus, it became common practice to wash our hands, to cover our mouths, not to smoke indoors, and to get tested—not just for our own benefit, but for the sake of the people around us. Parents waited in long lines to have their children inoculated, and enterprising physicians went to rural clinics to reach the last isolated clusters of unvaccinated people.

That is not to say America’s particular system of public health was ever perfect. Owing partly to the legacy of segregation, the country never developed a universal health-insurance program, and maintains a fragmented health-care system in which both class and race still dictate much of a patient’s access to care. Many people on the margins who have wanted to get screened for certain diseases or vaccinated against them have not been able to do so, because they cannot afford to or because no doctor will serve them.

And yet, sometimes through the insistence of those same people that America live up to the tenets of public health, the system has come closer to the ideal. As much as any other institution—schools, libraries, churches—the public-health system has helped propagate the idea of a commons, often working against historical inertia to curb the excesses of American individualism. That work has always required energy and effort from the people. And so it has always been vulnerable, because that energy and effort could dissipate at any time.

There is ample evidence that this is exactly what is happening. According to the health-policy organization KFF, in the summer of 2025 just 83 percent of parents kept their children up to date on vaccines, down from 90 percent four years earlier. Cases are surging for several of the diseases covered by the national vaccine schedule. Tuberculosis cases are higher than they have been in a dozen years, and meningococcal disease is rising as well. Measles cases have trended upward for years too, even before 2025.

Over the past 50 years, American trust in the medical system has declined, as has trust in government, science, and expertise in general. The coronavirus pandemic exploded those trends, creating the world in which we now find ourselves. Public-health agencies did themselves no favors: They often gave out confusing and sometimes conflicting advice. Conspiracy theories grew quickly on social media, and measures such as masking became subject to partisan polarization. According to Gallup, a bare majority—just 51 percent—of Americans now favors government requirements for vaccines, down from 81 percent in 1991 and 62 percent in 2019. Most of the slippage has been among conservatives, and studies suggest that political ideology is perhaps the biggest predictor of vaccine rejection.

Medicine has kept moving forward, with some truly great results. Deaths in the U.S. from cardiovascular disease are plummeting, and might see further declines with the advance of GLP‑1 drugs. With the advent of better cancer-screening tools, survival rates are improving, and wonder-drug therapeutics for many conditions are now on the market. But personalized care of this sort is expensive, and does not keep us collectively safe from infectious disease.

Meanwhile, as viruses that once killed hundreds of thousands have receded from public memory, they have come to seem less fearsome. Owing to the near eradication of some diseases, there have been few real risks to the heretofore small share of people who refuse vaccines. In this landscape, organizations such as the CDC, which once stood as unimpeachable examples of government competence, have become victims of their own success, appearing to skeptics to be inert or irrelevant.

This was the system as Trump and Kennedy found it last year, vulnerable and stripped of the halo of public trust. Kennedy slashed agency budgets and stocked a key vaccine advisory committee with vaccine skeptics; then, this past January, he announced a new set of childhood-vaccine recommendations that excluded coverage for the rotavirus, influenza, and hepatitis A vaccines, none of which can now be administered to most patients without a doctor’s consultation.

Kennedy’s biggest threat to public health comes from what he symbolizes. The MAHA movement derides expertise, overemphasizes personal commitment and liberty, and has embraced pseudoscience. This stance, mingled with Trump world’s conspiratorial tendencies, has turned the CDC and other once-trusted institutions into targets. After the August shooting at CDC headquarters, hundreds of current and former Health and Human Services employees singled out Kennedy as a driver of the kind of rhetoric that had motivated Patrick Joseph White, referring to the secretary’s previous insinuations that the CDC itself was hiding information about the risks of COVID vaccines.

Marcus Aurelius, the surviving Roman emperor, is mostly famous in our time because of his Stoicism. His philosophy encouraged the embrace of duty, not because of the expectation of praise or other material benefits but because duty is in itself fulfillment of the human condition. In his Meditations, he offered a maxim: “Do your duty—whether shivering or warm, never mind; heavy-eyed, or with your fill of sleep; in evil report or in good report; dying or with other work in hand.”

It’s hard to psychoanalyze a guy who lived two millennia ago, but it’s easy to believe that this particular admonishment may have come from his time as a plague fighter. In the face of Galen’s “everlasting pestilence,” Marcus had to rally the public and improvise, stocking depleted armies with convicts and ordering the digging of mass graves. He saw that the state was held up not just by the military or territory, but by invisible webs of shared sacrifice and obligation. In the end, the fortifications that mattered most were those that strengthened Rome against the invaders that could not be seen.

If the American state disintegrates, future postmortems are unlikely to focus much on measles, or on rotavirus vaccination rates. But the ability to beat back our more routine pathological menaces is a good indicator of the country’s ability to take on bigger, more virulent threats. The thing about bacteria and viruses, our most ancient foes, is that they are always at the gates, waiting for lean times. Among them will be pathogens worse than the coronavirus.

In the main, the withering of public health might not anticipate a future apocalypse so much as it recalls a previous America, one where lives were cheaper and shorter, where good health was the province of a privileged few, and where epidemics regularly scoured the countryside and the city slums. What’s spurring the slide now isn’t a dearth of information or cutting-edge medicine. Rather, the precepts of a shared reality have been shattered, and with them the ability to act for a common cause.


This article appears in the March 2026 print edition with the headline “How America Got So Sick.”
