Maybe living with grandma and grandpa isn’t such a bad thing after all. (svetikd/iStock)

A mom, a dad, 2.4 children, and an energetic but well-behaved dog compose what we long have recognized as the classic American household: a nuclear family nestled in a suburban bungalow, living on a street with similar houses that contain similar households. Grandma and Grandpa live on their own matching street somewhere over the river. When the kids reach adulthood, they will establish their own independent nuclear habitations.

We tend to see any deviation from that pattern as an unfortunate aberration, whether it’s the cohabitation of elderly grandparents who can no longer live independently or young-adult children experiencing a “failure to launch,” stuck in the basement. A Wall Street Journal headline recently rued that the “Percentage of Young Americans Living With Parents Rises to 75-Year High.” The New York Times fretted, “It’s Official: The Boomerang Kids Won’t Leave.” And the Fiscal Times warned: “The Kids Aren’t Alright.” When House Minority Leader Nancy Pelosi (D-Calif.) wanted to summon a worst-case scenario that could follow a repeal of President Barack Obama’s Affordable Care Act, she asked her fellow Americans: “You want Grandma living in the guest room?”

But is living in a household with grandparents, parents and kids really so terrible? Judging by the numbers, it can’t be all bad: More than 60 million Americans currently live in multigenerational households, the highest proportion since the Korean War. Demographic patterns indicate that share should continue to rise in the years ahead. And historical data suggests that the wholly independent nuclear-family household may be the aberration — that patterns of close familial support are the more natural arrangement.

Why are we so down on a practice that has been so common? Those sentiments flow from the peculiar history of postwar America, when nuclear-family households became the norm in spite of, well, everything.

Throughout the 20th century, several scholars claimed that nuclear households had been the historical standard, pre-dating postwar America. Indeed, according to University of Minnesota historian Steven Ruggles, “by the mid-1970s, the theory of long-run [nuclear] stability in Western family structure had found its way into every one of the basic sociology textbooks.”

But these days, that theory doesn’t seem to wash. Ruggles has found that multigenerational households were a nearly universal experience in mid-19th-century America and that “the great majority of families went through a multigenerational phase if the parents lived long enough.” It turns out that lifespans and large broods, not preferences, might explain why nuclear-family households appeared widespread before the postwar years: A parent of seven can, of course, reside in the home of only one adult child at a time; some of the adult children’s households would have been purely nuclear, but not necessarily by choice. And the demise of multigenerational households appears to have been less about shifting preferences than historical and political changes.

Most of the collapse in multigenerational arrangements took place in the four decades after World War II, when unique circumstances combined to transform the patterns of everyday American life. The tremendous demographic pressure built up by a generation raised during the Great Depression and then sent overseas to wage war was suddenly released, as servicemen returned home to settle down and seek quiet stability. The GI Bill sent many vets to college and provided housing subsidies that spurred construction of vast quantities of housing, quickly. The construction industry, which had been constrained by wartime supply rationing, was now encouraged by Federal Housing Administration programs and others that offered unprecedented subsidies and new, government-guaranteed 30-year fixed mortgages for single-family homes.

With a booming economy, Social Security in place and Medicare soon to come, Americans had incentives to follow an unusual pattern of generational segregation. The elderly became more financially independent, no longer needing their children to serve as filial retirement accounts, while the young-adult generation took the plentiful jobs available outside any family business. The wealth transfers that once kept children close to their inheritance were reversed, and families were quickly spread across fresh suburbs that offered cheap access to new wealth. As Ruggles reflects, “Material conditions, family behavior, and attitudes were changing simultaneously, and it is likely that the changes were mutually reinforcing.” Multigenerational living reached its nadir in 1980, when only 15 percent of older Americans lived with their children, and only 12 percent of households overall contained multiple adult generations.

In the postwar years, new local ordinances also reinforced nuclear-family households. These laws didn’t intentionally target multigenerational arrangements, but the growth of rules built around one model of living crowded out others. Zoning codes initially written to keep industrial factories out of residential areas increasingly dictated what residences could be built in a neighborhood, and grew the distances between houses and between activities. This made neighborhoods less walkable — and thus less friendly to the youngest and oldest — and moved families farther apart.

The town of Urbana, Ill., illustrates how this unfolded, as the University of Chicago’s Emily Talen recounts in her book “City Rules.” Urbana’s first zoning ordinance, in 1936, allowed great flexibility. The 1950 code included six districts, two each for residential, business and industrial. In 1979, the town’s zoning code expanded to 16 districts and two overlay codes, banned apartments in single-family areas, and introduced minimum lot sizes and floor-area ratios that crowded out small additional living spaces. The multiplication of rules to keep renters out and enshrine the privileges of single-family homeowners gradually blocked multigenerational-friendly add-ons and neighborhood patterns.

Where once a non-driving resident, whether a child or an elder, could help around the house by walking to the corner store for a gallon of milk before supper, development patterns that expanded over the course of the 20th century increasingly separated a home from the supermarket with an interstate highway.

Today, those same legal frameworks constrain families seeking homes for their multigenerational households primarily by forbidding the construction of houses with dedicated spaces for live-in relatives.

For as much as some families want to live together, they recognize the need for a certain degree of separation and privacy to stay, well, sane. Amenities to help achieve that might take the form of a separate entrance or an extra kitchen. But this versatility is just what many local codes are written to prevent. That such “accessory units” will probably be occupied by people at a different stage of life or income level than the owners of the surrounding houses is seen as a threat to neighborhood stability and home values, rather than an opportunity.

Neighbors and planners complain about the potential of such units to degrade an area’s single-family character, to induce unsightly crowding or to overwhelm street parking, even though none of these concerns has a sound empirical foundation. “Granny flat” expert Martin John Brown found that parking was the most frequently raised objection in public deliberations, despite a total absence of evidence. And Portland, Ore., builder and recent Harvard Loeb fellow Eli Spevak notes that his city, by far the leader in American accessory-unit construction, has seen no public backlash against such units since citywide approval was granted. Nevertheless, as the Wall Street Journal has reported, the growing number of Americans seeking to build accessory units onto their homes, even for family use, frequently are told by builders that they’ll never get the permits approved.

Some companies, such as industry giant Lennar, are beginning to specialize in constructing homes to serve multigenerational demand, but even these experienced builders frequently are refused permission unless they twist their designs into a conforming shape. The Journal recounts stories of builders being required to disguise a two-unit property behind a single front door or to strip out ovens or other kitchen appliances. Even cities that do allow such units, such as Columbia, Mo., often charge their owners large utility-impact fees equivalent to those levied on a whole new house. And as some homeowners have discovered, neighbors often are at the front of the line to file complaints about the perceived problems with aesthetics, parking or overcrowding that an accessory apartment could inflict on their single-family street.

And that’s a shame. There are good reasons to live multigenerationally, from the mundanity of sharing costs and chores to the sublimity of shared witness to each milestone of a child’s growth. At a time when parents are working longer hours and grandparents are seeing their retirement extend, the oldest sharing economy offers benefits to both. But thanks to our postwar legal and cultural heritage, many people still seem to fear Granny or Junior sticking around. They shouldn’t: Living with your parents, it seems, isn’t an aberration. It’s downright natural.
