The Longevity Revolution
July 1977: A 105-degree afternoon in Phoenix. I’m 17 and making deliveries in an underpowered Chevette with “4-55” air-conditioning (four open windows at 55 miles per hour), so I welcome the long runs to Sun City, when I can let desert air and American Top 40 blast through the car. Arrival, though, always gives me the creeps. The world’s first “active retirement community” is city-size (it would eventually span more than 14 square miles and house more than 40,000 people). The concentric circles of almost-identical tract houses stretch as far as I can see. Signs and bulletin boards announce limitless options for entertainment, shopping, fitness, tennis, golf, shuffleboard—every kind of amenity.
Sun City is a retirement nirvana, a suburban dreamscape for a class of people who, only a generation before, were typically isolated, institutionalized, or crammed into their kids’ overcrowded apartments. But I drive for blocks without seeing anyone jumping rope or playing tag (no children live here). I see no street life, unless you count residents driving golf carts, the preferred form of local transportation. My teenage self wonders: Is this twilight zone my eventual destiny? Is this what it means to be old, to be retired, in America?
In its day, Sun City represented a breakthrough in American life. When it opened, in 1960, thousands of people lined up their cars along Grand Avenue to gawk at the model homes. Del Webb, the visionary developer, understood that the United States was ready to imagine a whole new stage of life—the golden years, as marketers proclaimed them.
A cultural revolution was in full swing. Social Security and private pensions had liberated tens of millions of older Americans from poverty and dependency; modern medicine had given them the health to enjoy what was then a new lifestyle: leisure. In 1965, Medicare ameliorated the old-age fear of medical bankruptcy. In 1972, President Richard Nixon and the Democratic Congress, outbidding each other for the senior vote, increased Social Security benefits by 20 percent and indexed them to keep up with inflation. With these two programs on fiscal autopilot, the entitlement state was born, and the elderly were its prime beneficiaries.
When I gazed at Sun City, I was seeing the embodiment of the U.S. government’s greatest 20th-century domestic achievement: the near elimination of destitution among the elderly. By 1977, the poverty rate among those 65 and older had fallen from almost 30 percent in the mid-1960s to half that level. In 2022, it was 10.9 percent, according to the Census Bureau, slightly below the poverty rate for those ages 18 to 64 (11.7 percent)—and very significantly below the poverty rate among children and youth (16.3 percent).
“The struggle chronicled in this book—the struggle to build a secure old age for all—has been in many ways successful,” James Chappel writes in Golden Years: How Americans Invented and Reinvented Old Age. For most seniors, life is “immeasurably better” than it was a century ago. But he and Andrew J. Scott, the author of The Longevity Imperative: How to Build a Healthier and More Productive Society to Support Our Longer Lives, agree that the ’60s model of retirement needs updating in the face of new demographic, fiscal, and social realities. What comes next?
For clues, Chappel, a historian at Duke University, looks to the past, tracing the 100-year evolution of Americans’ notions of aging. He proceeds from the clarifying premise that aging is as much a social phenomenon as it is a biological one—perhaps even more so. “There is no ‘natural’ way to age—we have to be taught, by our cultural and political and religious institutions, how to do it well.”
Today’s conceptions of old age and retirement are modern inventions. In 19th-century America, Chappel writes, “the presumption was that ‘old age’ was not a long phase of life that began at sixty-five, but a short one that was marked by disability and decline … Basically, older people were to seek contemplation and tranquility.” In the mid-1800s, the average 30-year-old could expect to live only about 30 more years. That began to change as the fruits of industrialization and science ripened. As more people lived to become old, social activists mobilized for pensions, led by Civil War veterans. Now forgotten, the National Ex-Slave Mutual Relief, Bounty, and Pension Association argued in the 1890s for pensions as a form of reparations for the formerly enslaved. No one today will be surprised to learn that this group was suppressed, its proposals were buried, and its leader, Callie House, found herself in jail on trumped-up charges.
Still, the movement to end dependency and penury in old age gathered force and triumphed with the enactment of Social Security in 1935, the crowning achievement of the New Deal. Although its initial design favored men over women, white people over Black, and industrial over agricultural workers, it laid the foundation for the concept of retirement that made 65 officially old. Senior citizen replaced aged in the lexicon, and seniors became a self-aware identity group. The decades that followed brought rapid expansion of elder benefits and programs, and with it a far-flung social infrastructure: senior centers and retirement communities; continuous-care and assisted-living facilities; educational and recreational opportunities, such as Osher Lifelong Learning Institutes and Elderhostel (now Road Scholar); and, not least, AARP (originally the American Association of Retired Persons), a marketing juggernaut and among the largest and most powerful lobbying groups ever.
But today, Chappel argues, progress toward a healthier, more secure, and more inclusive concept of old age has stalled, largely because the U.S. government has stalled. Though private activism and inventive experiments continue, “they will always be insufficient in the absence of aggressive state action.” A parsimonious Congress looks for budget cuts while conservatives push to privatize Social Security and Medicare. Just as worrisome, in Chappel’s view: Older Americans have embraced the idea that they are the same as younger people, except older—a vision that blurs the distinctive needs of elders and undercuts their identity-based activism. What’s called for, he suggests, is an ambitious expansion of the welfare state to cover unmet necessities, such as long-term care.
This raises some questions. For one, who will pay for expensive new government programs? Social Security and Medicare are rapidly headed for insolvency and already hold the rest of the federal budget in a tightening vise. “The entire long-term deficit growth is driven by Social Security, Medicare, and the interest cost of their shortfalls,” Brian Riedl, a budget analyst with the Manhattan Institute, a center-right think tank, has written. Chappel breezes past any such fiscal concerns.
Even more puzzling, he does not pause to consider why further subsidizing the elderly should be the country’s top public-policy priority. He notes in passing that children are poorer than seniors, but he waves away the subject of generational equity, saying that “security is not a scarce resource” and dismissing as “vicious” a 1988 New Republic article, by the late Henry Fairlie, arguing that to seriously address competing social priorities, “we must shake off the peculiar notion … that old age is a time in which people are entitled to be rewarded.”
Chappel is not a policy wonk; as history, his book is valuable and authoritative. Perhaps it is not a historian’s job to answer philosophical questions about generational equity, political questions about hard choices, or fiscal questions about affordability. Still, one wishes he had at least teed them up, because they are unavoidable. Fortunately, Scott addresses them in The Longevity Imperative. An economist at London Business School, he identifies two longevity revolutions. The first has already arrived and, for all its multifaceted implications, is simply stated: Most people grow old.
Of course, old age as such is not new, but until quite recently, comparatively few people lived to see it. Life expectancy at birth was 18 years in the early Bronze Age, 22 in the Roman empire, and 36 in Massachusetts in 1776. It’s 77.5 years in the U.S. today, according to the National Center for Health Statistics. Those averages include child mortality, which partly accounts for shorter lifespans in earlier epochs. Even excluding child mortality, though, the improvements in longevity are astounding. Since the 1880s, so-called best-practice life expectancy—how long you’ll live if you do everything right and receive good health care—has increased, on average, by two to three years every decade. By now, the average American 65-year-old can expect to live another 18.5 years. Eighty is the new 68, inasmuch as the mortality rate of 80-year-old American women in 2019 was the same as that of 68-year-old women in 1933. An American child born today has a better-than-even chance of living to age 95. The first person to live to age 150 may have already been born.
Yet that triumph poses a challenge. The first longevity revolution “was about getting the majority to reach old age; the second will be about changes in how we age,” Scott writes. Will those additional years be vigorous and healthy? Or will they be filled with chronic illness and frailty? Will society capture the creative and productive potential of its rapidly expanding older population? Or will ageism and archaic conventions waste that potential? Scott makes an optimistic case that the second longevity revolution presents an opportunity to “rethink the way we live our whole life. Right now, though, we are not set to reap the benefit of these longer lives.”
The core problem today, he argues, is that lifespan outruns health span. In other words, not all of the years we add are healthy ones. The time has come for an ambitious, all-of-society effort to close that gap. Health-care priorities should shift more toward prevention, which today receives only 3 percent of U.S. health-care spending. Public-health measures should help further reduce smoking, alcoholism, obesity, and social isolation. More research dollars should flow to slowing the biological aging process, as well as treating frailty and disease.
The second longevity revolution will also require new institutions, expectations, and attitudes. With millions of people living vigorously into their 80s and beyond, the very idea of “retirement”—the expectation that people will leave the workforce at an arbitrary age—makes no sense. In fact, out the window goes the whole three-stage structure of American life, with education crammed into the first couple of decades, work heaped in the middle, and leisure stuck at the end. Jobs need to be made friendlier to older workers (through measures as elaborate as shifting physical tasks to robots and as simple as providing different footwear and chairs); employers need to exploit age diversity (which improves team productivity by blending older workers’ experience and skill with younger workers’ creativity and drive); education and training need to be available and encouraged throughout life. “The key is to see aging as a state of flux involving us all and not an event or a state that segregates one group from another,” Scott writes. Accordingly, he rejects the entire premise of age-based entitlements: “Tax breaks and other benefits should not be distributed simply because people reach a certain age.” (Henry Fairlie, call your office!)
Some of these changes are expensive, complex, or controversial, but Scott is right to argue that the really big barrier lies in American culture’s relentless negativity about aging. “Debate about an aging society rarely goes beyond mention of spiraling health costs, a pensions crisis, dementia and care homes,” he writes. “It is never seen as exciting, challenging or interesting.”
Reading Scott’s book together with Chappel’s can be whiplash-inducing, because they are in many respects antithetical. Where Chappel seeks to reinforce the country’s commitment to retirement security, Scott challenges the very concept of retirement; where Chappel endorses age-based programs and politics, Scott wants to erase age boundaries and base policies on individuals’ needs and abilities; and where Chappel sounds downbeat about aging in the United States—emphasizing that “many older Americans are in trouble” as they juggle the costs of medicine, housing, and especially long-term care—Scott emphasizes the unprecedented opportunities that the longevity revolution affords.
There is truth in both authors’ views (as they would probably agree). Supporting a rapidly growing aging population poses some daunting challenges, most notably in improving the country’s fragmented provision of long-term care. Yet Scott’s perspective is, I think, closer to the mark. The Sun City idea of aging and retirement is no longer either affordable or desirable as a template; viewing “the elderly” as an identity category makes little sense at a time when living to 85 is commonplace and some 85-year-olds are as vigorous as many 65-year-olds. Now on the doorstep of routine 100-year lifespans, America needs to rethink the meaning of school, work, and retirement—and what it even means to be old.
I’ll propose, however, a friendly amendment to Scott. He envisions a world where boundaries in life are decoupled from age; what matters is what you can do, not how old you are. But the big conceptual categories of childhood, adolescence, middle age, and old age are too deeply rooted to toss aside. We could use a new category, one reflecting the fact that longevity is inserting one, two, or even three decades between middle age and old age.
As it happens, such a category is available: late adulthood. Associated with such thinkers as the sociologist Phyllis Moen, the psychologist Laura Carstensen, the social entrepreneurs Chip Conley and Marc Freedman, and the activist and writer Ashton Applewhite, the notion of late adulthood captures the reality of a new stage of life, in which many people are neither fully retired nor conventionally employed—a phase when people can seek new pursuits, take “not so hard” jobs, and give back to their communities, their families, and their God.
And no, this is not a pipe dream. Copious evidence shows that most of what people think they know about life after 50 is wrong. Aging per se (as distinct from sickness or frailty) is not a process of uniform decline. It brings gains, too: greater equanimity, more emotional resilience, and what Carstensen and others have called the positivity effect, a heightened appreciation of life’s blessings. Partly for that reason, the later decades of life are, on average, not the saddest but the happiest. Contrary to popular belief, aging does not bring mental stagnation. Older people can learn and create, although their styles of learning and creativity differ from those of younger years. Emotional development and maturation continue right through the end of life. And aging can bring wisdom—the ability to rise above self-centered viewpoints, master turbulent emotions, and solve life’s problems—a boon not only to the wise but to everyone around them.
Late adulthood is a time when the prospects for earning diminish but the potential for grandparenting, mentoring, and volunteering peaks. It is—or can be—a time of reorientation and relaunch, a time when zero-sum goals such as social competition and personal ambition yield to positive-sum pursuits such as building community and nurturing relationships.
If anything, Scott undersells the second longevity revolution. Right now, Americans are receiving more than a decade of additional time in the most satisfying and prosocial period of life. This is potentially the greatest gift any generation of humans has ever received. The question is whether we will grasp it.
This article appears in the January 2025 print edition with the headline “The Longevity Revolution.”