For most of the nation’s history, the idea that people over the age of 65 would voluntarily herd themselves into special communities built around their needs would have seemed absurd, even dystopian. Yet a largely voluntary movement toward segregating people by age has reached extreme levels in recent years — and without receiving much attention at all.

The coronavirus outbreak could put an end to it.

In 1850, nearly 70% of individuals age 65 or older lived with their adult children. Most of the rest lived close to their families. As a consequence, older people were more or less evenly distributed throughout the country.

This arrangement was highly functional: The elderly needed help as they aged, and children and grandchildren provided it. In return, the elderly took care of young children and otherwise pulled their weight around the house.

Home was not the only place where people of different ages mixed together in ways that are all too rare today. Before the 20th century, it was entirely normal to have a one-room schoolhouse catering to both teens and toddlers. When rural communities held quilting parties, everyone from young girls to elderly matrons participated side by side. Farmworkers of all ages toiled together, and armies in the Civil War threw together young boys, older men and everyone in between.

This was a world with very limited “age consciousness.” Almost no one drew attention to their age, even on their birthdays, a ritual that took off in the 20th century. As countries like the U.S. industrialized, new institutions began sorting citizens into different age buckets. Most important, schools began catering to discrete age cohorts — elementary, junior high, senior high.

As historian Howard Chudacoff has shown, much of this shift coincided with the invention of new terms to define and distinguish age groups. The idea of “middle-age,” for example, was a product of this shift, as was the invention of “pediatrics” as a field of medicine. It was perhaps inevitable that the elderly would get lumped into their own cohort, with a new field of medicine — geriatrics — invented to tend to their needs.

Several developments fueled this trend. The first was a growing belief that older people couldn’t keep up in the fast-paced, modern world of work. Mandatory retirement ages — often coupled with increasingly generous pension benefits — helped push workers out the door at a certain age. When Congress passed the Social Security Act in 1935, it elevated a new threshold to almost totemic significance: 65.

All of this took place alongside a gradual decline in the share of old people living with their children. By the 1930s, the percentage of elderly whites living with their children had declined to just under 40%; by century’s end, it had fallen all the way to 13%.

Why this happened is a matter of some debate. The mix of public and private retirement programs enabled some of the elderly to live on their own, but there’s evidence that in many cases, children moved away from their parents to pursue economic opportunities, effectively abandoning the older generation.

So the elderly, particularly those with retirement savings, embraced a new trend that burst onto the scene after World War II: the retirement community. In those prosperous decades, it became a symbol of the good life as potent as a suburban home with a white picket fence. The first was built in 1954 outside of Phoenix. Its name? Youngtown.

It offered a model for all the big retirement communities to come: Sun City, which opened nearby six years later; The Villages, a sprawling development in central Florida founded soon afterward; and many more. These gated communities deliberately excluded younger people (Youngtown did not allow children under 18 to live there for longer than 90 days), which meant residents had no need to pay taxes for schools.

This movement exploded in the succeeding decades. But not everyone was wealthy enough to afford such amenities. Others weren’t well enough. In 1965, Congress created Medicare and Medicaid, helping finance the creation of low-budget, state-run “nursing homes” that increasingly warehoused the elderly.

These developments left older generations living apart from everyone else. Though the same thing happened in other developed nations, the U.S. was particularly committed to the effort. By the 1990s, a growing number of facilities designed to bridge the gap between fun-filled retirement centers and grim nursing homes came into being: “assisted-living” facilities, “memory-care villages” and other housing for the elderly.

Ultimately, the U.S. became one of the most age-segregated nations in the world. Recent research indicates that a third of Americans older than 55 live exclusively among people in the same age cohort.

There are many reasons why this trend is problematic. A growing body of research suggests segregating people by age isn’t healthy for anyone, young or old, and that it has helped fuel divisions in the nation’s politics: When generations live apart, political polarization follows — the 2016 election comes to mind. But these concerns, rarely articulated, haven’t come close to raising societal alarms.

The pandemic may change that. Society’s most vulnerable members are concentrated in communities and institutions that, once infected, can easily turn into catastrophes. These are places where people live in close quarters, sharing meals, socializing, and otherwise living in ways that are apt to facilitate the spread of the virus.

Indeed, while much of our attention is now focused on large-scale outbreaks in cities like New York, the next wave may be dominated by smaller, but proportionally more lethal, outbreaks throughout the nation’s elderly enclaves. There are signs this is already happening as daily tolls begin to break out deaths at nursing homes.

Such a disaster may finally make us question why we ever thought it was a good idea to so thoroughly segregate ourselves by age. If so, something good may come of this pandemic yet.

Stephen Mihm is an associate professor of history at the University of Georgia.