To prove it, University of Chicago Booth School of Business economists Marianne Bertrand and Emir Kamenica taught machines to guess a person’s income, political ideology, race, education and gender based on their media habits, their consumer behavior, their social and political beliefs, or even how they spent their time. Their results were released in a new working paper from the National Bureau of Economic Research.
The duo trained their algorithms to detect patterns in decades of responses to three long-running surveys, each with between 669 and 22,033 responses per year. The surveys were tuned and filtered to be consistent over time, which allowed Bertrand and Kamenica to measure how America’s cultural divides have evolved.
To determine how accurately cultural factors predicted a person’s race, education or income tier, the duo tested their algorithms on subsets of the data that the programs had never seen. To keep it fair, they omitted variables that would have been a dead giveaway — if they were predicting whether someone was liberal or conservative, for example, they wouldn’t allow the algorithms to consider the answer to “Which political party do you support?”
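The evaluation idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual pipeline: the column names are hypothetical, the data is random, and the point is only the mechanics of holding out unseen rows and dropping a "dead giveaway" column before training.

```python
# Sketch of held-out evaluation with a giveaway variable excluded.
# All column names are illustrative, not from the actual surveys.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "watches_fox_news": rng.integers(0, 2, 500),
    "owns_fishing_gear": rng.integers(0, 2, 500),
    "party": rng.integers(0, 2, 500),            # dead giveaway -> excluded
    "is_conservative": rng.integers(0, 2, 500),  # label to predict
})

# Omit the label itself and the giveaway variable from the features.
X = df.drop(columns=["is_conservative", "party"])
y = df["is_conservative"]

# Hold out a test set the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Accuracy is measured only on the unseen rows.
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

Because the toy labels here are random, accuracy hovers near chance; on real survey data, the gap above 50 percent is what measures how divided two groups are.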
Some results are obvious, which indirectly confirms that their approach can detect real divides. Spending predicts gender with almost perfect accuracy, for example, because men don’t buy nearly as much mascara as women do, and women buy much less aftershave and cologne than men do. But others are revelatory: White people and black people are almost as different in their spending habits as rich people and poor people are, for example.
Differences in social attitudes between liberals and conservatives have widened over time, Bertrand and Kamenica found. The gap in social attitudes between whites and nonwhites has fallen slightly, but the difference in consumer behavior between races has grown.
In the world of television in 2016, some of the top 10 predictors of whiteness were watching “Rudolph the Red-Nosed Reindeer,” “American Pickers,” “The Big Bang Theory” and the Kentucky Derby. If we’re looking at specific brand names, the top 10 included Thomas’ English muffins, Sweet Baby Ray’s barbecue sauce and Stove Top stuffing.
More generally, in consumer products, the best predictor of whiteness was whether someone owned a pet — followed closely by whether they owned a flashlight. Many of the differences appear to be correlated with wealth and homeownership, areas in which America suffers from vast racial disparities.
The Federal Reserve has found that the median net worth of a white household in 2016 was 9.7 times greater than that of a black one.
Each analysis is binary, meaning that although the authors frame everything in terms of predicting whether someone is white, or high income, or male, the direct opposite is equally true. In other words, “doesn’t own a pet” predicts that someone isn’t white just as strongly as “owns a pet” predicts that someone is white.
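This symmetry is easy to verify with a toy calculation. The numbers below are made up for illustration: for any binary rule, the accuracy of predicting the label from a trait is identical to the accuracy of predicting the flipped label from the flipped trait, because exactly the same rows match in both directions.

```python
# Toy check of binary-prediction symmetry (made-up data).
owns_pet = [1, 1, 0, 0, 1, 0]
is_white = [1, 0, 0, 1, 1, 0]

# Rule: "owns a pet" predicts "is white".
acc_rule = sum(p == w for p, w in zip(owns_pet, is_white)) / len(is_white)

# Flipped rule: "doesn't own a pet" predicts "isn't white".
acc_flip = sum((1 - p) == (1 - w)
               for p, w in zip(owns_pet, is_white)) / len(is_white)

assert acc_rule == acc_flip  # the two directions always agree
print(round(acc_rule, 3))  # → 0.667
```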
To maintain statistical integrity, the authors were able to break the population into only two categories, “white” and “nonwhite,” which may hide differences across a large and diverse population.
Within the surveys they analyzed, social attitudes and media habits were almost as closely linked to race as consumer behavior was. The racial differences in social attitudes were particularly notable.
On other issues, the gap has closed. “In 1976,” the authors write, “one could correctly predict race based on views towards government spending 74 percent of the time but by 2016 this number was down to 56 percent.”
Attitudes toward police violence are only a few percentage points less effective at predicting high income (the top 25 percent) than at predicting whiteness. The overlap shows how closely related race and income are, probably because of historical disparities and continuing problems with racial bias.
Race aside, consumer behavior is strongly linked to income level. In 1992, Grey Poupon mustard predicted income better than any other brand. By 2016, its place as the key signifier of the country’s economic and cultural divide had been taken by Apple’s iPhone — which the researchers found to be a much clearer signifier of income than the condiment had been.
Because of limitations in the media and consumer components of the survey they used, researchers couldn’t get reliable data on the liberal-conservative split that was more recent than 2009. But differences up to that time include some of the most interesting findings in the survey.
They start with superficial differences: If someone went to Arby’s or Applebee’s or used Jif peanut butter, you might guess they were conservative. If they didn’t own fishing gear or use ranch dressing, but drank alcohol and bought novels? Probably a liberal.
The researchers find that, across almost every dimension, America’s cultural divide has remained constant. Yes, high-income households buy different things from low-income ones, and white Americans and black Americans watch different television programs and movies. We’re divided. But we always have been and, despite popular narratives to the contrary, it’s not getting worse.
“What’s really striking to me,” Kamenica said, “is how constant cultural divisions have been as the world has changed.”
But there’s one exception. And it’s a big one. The ideological difference between conservatives and liberals is wide and growing.
“This is not a new phenomenon,” Kamenica said. “For the past 40 years, liberals and conservatives are disagreeing more each year. On every topic, liberals and conservatives are disagreeing more than they used to.”
And their analysis of television-watching habits indicates the nature of America’s media divide may be changing, even if its size isn’t. In 2001, you could predict that someone was conservative if he or she hadn’t watched the Academy Awards or “Will and Grace.” By 2009, those cultural signifiers had been replaced by three major Fox News programs: “The O’Reilly Factor,” “Fox and Friends” and “Hannity & Colmes.”
According to Bertrand, cultural factors such as television and movies matter because of how they enable (or disable) conversation and exchange between neighbors of different backgrounds and viewpoints.
“We feel this sharing of culture is important,” Bertrand said. “There is a lot of focus in economics on human capital and social capital. We must also think about cultural capital and the importance it has in our ability to get along.”
The researchers' machine-learning approach combined elastic net, regression tree and random forest analysis. As input, they used specific responses to three major surveys:
- The American Heritage Time Use Study, now conducted by the Labor Department, asked between 669 and 10,210 U.S. residents per year about the time they spend on activities such as sleeping, working, yardwork and video games.
- The General Social Survey, conducted by NORC at the University of Chicago, polls between 1,093 and 3,735 people per year on their attitudes toward myriad social issues, such as trusting others, legalizing marijuana and approving of police striking a man.
- Mediamark Research Intelligence queries between 15,352 and 22,033 adults a year about thousands of consumer behaviors, such as owning a dishwasher, reading “Architectural Digest” and watching the “Teenage Mutant Ninja Turtles” movie.
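The three model families named above can be sketched side by side. This is a hedged illustration, not the paper's code: the synthetic binary "survey answers" stand in for the real survey responses, and the elastic net is expressed as a logistic regression with a mixed L1/L2 penalty, which is the standard classification form of that estimator.

```python
# Sketch of the three model families the paper combines, fit to
# synthetic survey-style data (20 binary answers, 1 binary label).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(400, 20))  # 20 binary survey answers
y = rng.integers(0, 2, size=400)        # binary demographic label

models = {
    # Elastic net: logistic regression with a mixed L1/L2 penalty.
    "elastic net": LogisticRegression(penalty="elasticnet", l1_ratio=0.5,
                                      solver="saga", max_iter=5000),
    "decision tree": DecisionTreeClassifier(max_depth=5),
    "random forest": RandomForestClassifier(n_estimators=200),
}
for name, m in models.items():
    acc = cross_val_score(m, X, y, cv=5).mean()  # 5-fold cross-validation
    print(f"{name}: {acc:.2f}")
```

On random labels all three hover near 50 percent; on the real surveys, the margin above chance is the paper's measure of a cultural divide.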
Correction: A previous version of this story misstated the upper end of the range of annual responses in the surveys analyzed by Bertrand and Kamenica.