President Richard M. Nixon signs the Constitution's 26th Amendment, which guarantees 18-year-olds the right to vote, on July 4, 1971, at the White House. (Charles Tasnadi/AP)
Holly N.S. White is assistant editor at the Omohundro Institute of Early American History and Culture and author of "Negotiating American Youth: Age, Law, and Culture in the Early Nineteenth Century" (forthcoming, University of Virginia Press).

Does it make sense to tie rights and privileges — to vote, to marry, to drink — to age?

Age is a biological reality. Science has shown that certain kinds of growth, whether skeletal, reproductive or mental, occur around predictable ages. For example, the carpal bones in the wrist begin to overlap at age 5 for girls and 7 for boys; most girls complete puberty by 14 and most boys by 16; wisdom teeth appear by 21; and our brains reach full maturity at 25.

But environmental factors, nutrition and psychological traumas have been shown to stunt or accelerate a person’s physical and mental development. And historians, alongside sociologists and psychologists, have consistently observed that the significance of a person’s age is a social and cultural construct informed by race, gender, class, religion and geographical affiliation.

So why, despite an abundance of evidence showing that it is unwise to use age to define maturity, does our age-based legal system persist?

Because assumptions about age are a legal pillar of American society reaching back almost 250 years to our nation’s founding. Age laws were a tool to advance ideas about equality and fairness, and they persist because, on the surface, they seem to work well. It’s only when we stop to consider the science behind age, or how race, class and gender intersect with the application of these laws, that we see how troubling their existence is in 2019.

At 18, Americans can claim legal independence from their parents or guardians and vote in federal elections; at 21, they can drink alcohol. These age laws operate as rites of passage for American youths as they become adults. But they reflect 18th-century definitions of maturity and age, rather than scientific understandings of how age and growth actually work.

Americans’ earliest legal-age-related concerns revolved around when a person should be permitted by law to marry, enter into contracts, vote and testify in court, as well as when they became culpable for crimes. Using age to set these boundaries was an explicit rejection of the British system, in which inherited status governed privileges. Americans wanted to build a system governed by experience and informed consent, one with seemingly neutral markers for attaining the rights of citizenship. While not everyone could acquire land or an education, theoretically, everyone could reach the age of 18 or 21.

This system might have achieved such equality had it applied only to white men, the people whose rights were paramount in the minds of the framers of these new laws. But when applied to the entire population, it fell woefully short. The rigid racial and gender hierarchies that prevailed for much of U.S. history interacted with this system in complicated ways that to this day result in racially biased juvenile criminal sentencing and an outrageously high number of girls marrying before they reach adulthood, to give just two examples.

During the 19th century, new ideas and scientific understandings about childhood and its fundamental differences from adulthood emerged. By the early 20th century, states began to turn these ideas into new laws meant to shield children from the growing demands of an industrial society. Laws regulating workplace hours and conditions, requiring children to attend school and protecting them from abuse and neglect all shifted the responsibility of children’s socialization and safety from the family to the state. These evolved understandings also resulted in the formation of new institutions like juvenile courts.

Unlike early U.S. age laws, which created benchmarks that, on their face, provided a uniform standard of legal maturity, these laws were about social control.

But these Progressive-era laws reveal how little the age system actually produced uniformity. Each state differed in how it defined expectations of age, maturity and rights. For example, in the 1920s, both Rhode Island and New York set new minimum age limits for marriage. But while Rhode Island banned marriage before 21, New York chose 14 as a legal minimum, which remained the state’s law until 2017.

These reformers, like the Founders’ generation, continued to turn a blind eye to the fact that legal age laws only provided the perception of equality and fairness. Within the court system, age continued to intersect with social, racial and gender hierarchies to reinforce inequalities.

And this trend of turning a blind eye to the reality of age laws continues today, even as science has proved that their foundation — that at 18 we can assume people are sufficiently developed to make reasoned decisions and offer informed consent — is wrong. The prefrontal cortex, the part of our brains that aids in decision-making, is not done developing until 25, which means we’re giving people the right to make decisions based on the expectation of full cognitive maturity seven years too early — and in some cases many more than that.

But this advancement in scientific understanding has done little to dislodge the hodgepodge of state laws granting rights, privileges and responsibilities at ages ranging from 8 to 21. For example, 33 states currently have no minimum age of criminal responsibility. In other words, in those states a 5-year-old or a 10-year-old (or even a baby) could be legally charged as an adult if a judge so desired. For a federal crime, the legal age of criminal responsibility is 11 years old. So while in many states a 12-year-old cannot legally be left at home alone, such a child could be charged as an adult with a federal crime.

And race is a big factor in who experiences these out-of-date standards: It is Latino, Native American and African American youths who are disproportionately tried as adults, while white youths receive much lighter sentences.

Antiquated age laws also have a disproportionately negative impact on younger girls. In 2017, Frontline released a report on child marriage, revealing that between 2000 and 2015, in the 41 states surveyed, more than 200,000 minors married in the United States, the youngest of whom was 12, and the majority of whom were female. The report sparked public outrage, as many Americans learned for the first time how shockingly out of date our age laws are. In response, many states, including New York, updated their marriage laws to better reflect modern sensibilities about childhood.

Unfortunately, employing modern sensibilities did not mean using the best scientific knowledge to set new standards. If we know a person’s brain does not fully mature until age 25, even new consent laws that raise the age for marriage, as Arkansas recently did from 16 to 17, do little to protect youths from predators and poor decisions.

Age is not a perfect measure of ability or maturity. But if we’re going to continue to use it to define law, then we need to try to standardize our measurement of psychological maturity at the federal level and sync it with the best science of development. Instead of asking at what age someone should be allowed to drive or drink or babysit, we need to think: At the age of 12, what should we be expecting young Americans to be doing? If it’s not getting married or being held to the same standard for behavior as an adult, we need to eliminate those possibilities.