Here’s How We Know RFK Jr. Is Wrong About Vaccines


When I was taking German in college in the early years of this millennium, I once stumbled upon a word that appeared foreign even when translated into English: Diphtherie, or diphtheria. “What’s diphtheria?” I wondered, having never encountered a single soul afflicted by this disease.

Diphtheria, once known as the “strangling angel,” was a leading killer of children into the early 20th century. The bacterial infection destroys the lining of the throat, forming a layer of dead, leathery tissue that can cause death by suffocation. The disease left no corner of society untouched: Diphtheria killed Queen Victoria’s daughter, and the children of Presidents Lincoln, Garfield, and Cleveland. Parents used to speak of their first and second families, an elderly woman in Ottawa recalled, because diphtheria had swept through and all their children died.

Today, diphtheria has been so thoroughly forgotten that someone like me, born some 60 years after the invention of a diphtheria vaccine, might have no inkling of the fear it once inspired. If you have encountered diphtheria outside of the historical context, it’s likely because you have scrutinized a childhood immunization schedule: It is the “D” in the DTaP vaccine.

Vaccine breakthroughs over the past two centuries have cumulatively made the modern world a far more hospitable place to be born. For most of human history, half of all children died before reaching age 15; that number is down to just 4 percent worldwide, and far lower in developed countries, with vaccines one of the major drivers of improved life expectancy. “As a child,” the vaccine scientist Stanley Plotkin, now 92, told me, “I had several infectious diseases that almost killed me.” He ticked them off: pertussis, influenza, pneumococcal pneumonia—all of which children today are routinely vaccinated against.

But the success of vaccines has also allowed for a modern amnesia about the level of past human suffering. In a world where the ravages of polio or measles are remote, the risks of vaccines—whether imagined, or real but minute—can loom much larger in the minds of parents. This is the space exploited by Robert F. Kennedy Jr., one of the nation’s foremost anti-vaccine activists and now the nominee for secretary of Health and Human Services. It is a stunning reversal of fortune for a man relegated to the fringes of the Democratic Party just last year. And it is also a reversal for Donald Trump, who might have flirted with anti-vaccine rhetoric in the past but also presided over a record-breaking race to create a COVID vaccine. Kennedy has promised that he would not yank vaccines off the market, but his nomination normalizes and emboldens the anti-vaccine movement. The danger now is that diseases confined to the past become diseases of the future.


Walt Orenstein trained as a pediatrician in the 1970s, when he often saw children with meningitis—a dangerous infection of membranes around the brain—that can be caused by a bacterium called Haemophilus influenzae type b or Hib. (Despite the name, it is not related to the influenza virus.) “I remember doing loads of spinal taps,” he told me, to diagnose the disease. The advent of a Hib vaccine in the 1980s virtually wiped these infections out; babies are now routinely vaccinated in the first 15 months of life. “It’s amazing there are people today calling themselves pediatricians who have never seen a case of Hib,” he says. He remembers rotavirus, too, back when it used to cause about half of all hospitalizations for diarrhea in kids under 5. “People used to say, ‘Don’t get the infant ward during diarrhea season,’” Orenstein told me. But in the 2000s, the introduction of rotavirus vaccines for babies six months and younger sharply curtailed hospitalizations.

To Orenstein, it is important that the current rotavirus vaccine has proved effective but also safe. An older rotavirus vaccine was taken off the market in 1999 when regulators learned that it gave babies up to a one-in-10,000 chance of developing a serious but usually treatable bowel obstruction called intussusception. The benefits arguably still outweighed the risks—about one in 50 babies infected with rotavirus needs hospitalization—but the United States has a high bar for vaccine safety. Similarly, the U.S. switched from an oral polio vaccine containing live, weakened virus—which had a one-in-2.4-million chance of causing paralysis—to a more expensive but safer shot made with inactivated viruses that cannot cause disease. No vaccine is perfect, says Gregory Poland, a vaccinologist and the president of the Atria Academy of Science & Medicine, who himself developed severe tinnitus after getting the COVID vaccine. “There will always be risks,” he told me, and he acknowledges the need to speak candidly about them. But vaccine recommendations are based on benefits that are “overwhelming” compared with their risks, he said.

The success of childhood vaccination has the perverse effect of making the benefits of these vaccines invisible. Let’s put it this way: If everyone around me is vaccinated for diphtheria but I am not, I still have virtually no chance of contracting it. There is simply no one to give it to me. This protection is also known as “herd immunity” or “community protection.” But that logic falls apart when vaccination rates slip, and the bubble of protective immunity dissolves. The impact won’t be immediate. “If we stopped vaccinating today, we wouldn’t get outbreaks tomorrow,” Orenstein said. In time, though, all-but-forgotten diseases could once again find a foothold, sickening those who chose not to be vaccinated but also those who could not be vaccinated, such as people with certain medical conditions and newborns too young for shots. In aggregate, individual decisions to refuse vaccines end up having far-reaching consequences.
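For readers who want the arithmetic behind that “bubble of protective immunity,” here is a minimal illustrative sketch (not from the article) of the standard herd-immunity threshold, 1 − 1/R0: once the immune share of a population exceeds that fraction, each infection causes fewer than one new infection on average and chains of transmission fizzle out. The R0 figures below are rough, commonly cited values, used only to show how a few points of slipped coverage matter most for highly contagious diseases such as measles.

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune to block sustained spread."""
    return 1 - 1 / r0

def effective_r(r0: float, immune_fraction: float) -> float:
    """Average number of secondary infections when a given fraction is already immune."""
    return r0 * (1 - immune_fraction)

# Rough, commonly cited R0 values, used purely for illustration.
for disease, r0 in [("measles", 15.0), ("diphtheria", 6.0), ("polio", 5.0)]:
    print(f"{disease}: ~{herd_immunity_threshold(r0):.0%} of people must be immune; "
          f"at 85% coverage, each case causes ~{effective_r(r0, 0.85):.2f} new cases")
```

On these assumed numbers, 85 percent coverage keeps diphtheria and polio below the outbreak threshold but leaves measles spreading, which is why small declines in vaccination rates tend to show up first as measles outbreaks.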

Evolutionary biologists have argued that plague and pestilence rose in tandem with human civilization. Before humans built cities, back when we still lived in small bands of hunter-gatherers, a novel virus—say, from a bat—might tear through a group only to reach a dead end once everyone was immune or dead. With no one else to infect, such a virus will burn itself out. Only when humans started clustering in large cities could certain viruses keep finding new susceptible hosts—babies or new migrants with no immunity, people with waning immunity—and smolder on and on and on. Infectious disease, you might then say, is an unavoidable consequence of living in a society.

But human ingenuity has handed us a cheat code: Vaccines now allow us to enjoy the benefits of fellow humanity while preventing the constant exchange of deadly pathogens. And vaccines can, through the power of herd immunity, protect even those who are too young or too sick to be effectively vaccinated themselves. When we get vaccinated, or don’t, our decisions ricochet through the lives of others. Vaccines make us responsible for more than ourselves. And is that not what it means to live in a society?

