Researchers from the nonpartisan Pew Research Center used a carefully calibrated and monitored artificial-intelligence algorithm to calculate gender representation in the top Google image search results for more than 100 occupations. To gauge how online representation matched each job’s real-world gender breakdown, they compared the results to Labor Department statistics.
Beyond overall representation, the researchers found that images of men typically ranked higher in search results: the first picture of a man generally appeared by the second image on the page, while the first woman often wouldn’t show up until the fourth.
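As a rough illustration of how such a rank comparison works (the labels and data here are hypothetical, not Pew’s), one can find the first position at which each gender appears in an ordered result list:

```python
# Hypothetical gender labels for the top image results of one search,
# ordered by rank (position 1 first).
results = ["man", "man", "woman", "man", "woman"]

def first_rank(labels, gender):
    """Return the 1-based rank of the first image labeled `gender`,
    or None if that gender never appears in the results."""
    for rank, label in enumerate(labels, start=1):
        if label == gender:
            return rank
    return None

print(first_rank(results, "man"))    # 1
print(first_rank(results, "woman"))  # 3
```

Repeating this over many searches and taking the typical first-appearance rank for each gender yields the kind of comparison the researchers describe.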
Some of the jobs where women are underrepresented in images appeared to reflect stereotypes about masculinity, intimidation and force: Most bill collectors, probation officers and bartenders are women, yet Google search results paint an overwhelmingly male picture of each occupation.
Google also significantly underrepresents women as managers and chief executives. Only 28 percent of chief executives are women, but the share of women who appear as images in the top 100 Google results is even lower, just 10 percent. For general managers, those figures are 34 percent of the labor force and 15 percent of Google image searches.
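The size of that underrepresentation is just the gap between the labor-force share and the image share. Using the article’s figures:

```python
# Article figures: (labor-force share of women, share of women in the
# top 100 Google image results), both in percent.
figures = {
    "chief executive": (28, 10),
    "general manager": (34, 15),
}

for job, (labor_pct, image_pct) in figures.items():
    gap = labor_pct - image_pct
    print(f"{job}: women underrepresented by {gap} percentage points")
# chief executive: women underrepresented by 18 percentage points
# general manager: women underrepresented by 19 percentage points
```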
That’s most troubling for what it says about “how people perceive women’s role in society,” said Haley Swenson, a fellow at New America, a left-leaning think tank.
“We know that what people see affects what they perceive to be normal, and that in turn affects behavior,” Swenson said.
We can’t bust open Google’s black box and dissect its ranking system, but we can assume image search results broadly reflect supply and demand: supply, because the results reflect the images that media, corporations and users choose to represent professions; demand, because Google prioritizes images users link to and share.
“When we learn more about how gender is being portrayed in that new information landscape, it might give us some understanding about how this could be shaping public perceptions of women’s roles in the workplace,” said Kim Parker, Pew’s social trends research director.
Meanwhile, search results overrepresent images of women in a minority of occupations.
Many are traditionally female occupations, such as singer, model and flight attendant, but image searches also return an unusually high share of women in male-dominated occupations, such as mechanic, computer programmer and police officer. That may be driven by the laudable impulse to signal that women are equally able to perform such jobs, Swenson said, but it, too, can create problems.
“In an effort to showcase diversity, people might be underestimating how sharp the divides are in the workplace,” she said, adding that Google might lead people to underestimate the extent of “occupational segregation” in the United States.
Google image searches and media coverage might be the only preview of a job for many underrepresented candidates, since they’re less likely to know women employed in these fields.
Research supports the idea that skewed representation can shape whether members of the next generation see themselves doing a particular job.
“How young people choose their career — a lot of that is shaped by perceived norms,” Swenson said. “If you don’t think it’s normal for a person like you to do a certain job, then you just don’t do that.”
Lopsided representation also affects the people already doing those jobs. If a job is portrayed as overwhelmingly male, that reinforces the idea that women don’t belong there and emboldens discrimination and harassment.
“In heavily male-dominated workplaces, one of the things that drives sexual harassment is men in those jobs perceive women as infringing in a space that’s supposed to be theirs,” Swenson said.
An earlier Pew survey found that 32 percent of women who work in mostly female workplaces say sexual harassment is a problem there; in mostly male workplaces, about 49 percent say so.
Pew researchers Onyi Lam, Brian Broderick, Stefan Wojcik and Adam Hughes began by compiling a list of more than 200 occupations. They narrowed it down to occupations with more than 100,000 workers nationwide for which the Labor Department released gender breakdowns.
They eliminated professions that didn’t have enough image-search results on Google or enough images of people performing the job. When multiple people were shown performing a job, they analyzed each person separately. When an occupation contained several job names in the description, such as “models and demonstrators,” they treated each as a separate occupation. They also simplified some Labor Department jargon. “Postsecondary teacher” became “professor,” for example.
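A minimal sketch of that filtering logic (the data structures, field names and example records here are assumptions for illustration, not Pew’s actual code or data):

```python
# Hypothetical occupation records: name, nationwide worker count, and
# whether the Labor Department publishes a gender breakdown for it.
occupations = [
    {"name": "postsecondary teacher", "workers": 1_300_000, "has_gender_stats": True},
    {"name": "small specialty trade", "workers": 40_000,    "has_gender_stats": True},
    {"name": "misc. occupation",      "workers": 500_000,   "has_gender_stats": False},
]

MIN_WORKERS = 100_000  # threshold described in the article

# Keep occupations with more than 100,000 workers and published stats.
eligible = [
    o["name"] for o in occupations
    if o["workers"] > MIN_WORKERS and o["has_gender_stats"]
]

# Simplify Labor Department jargon, as the researchers did.
renames = {"postsecondary teacher": "professor"}
eligible = [renames.get(name, name) for name in eligible]

print(eligible)  # ['professor']
```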
The 10,000-plus images analyzed were collected between July 7 and Sept. 13, 2018.
Pew trained and tested its algorithm on a separate data set of about 25,000 images classified by gender and other characteristics. Because such algorithms have been criticized for being trained primarily on images of people of European ancestry, Pew’s researchers created their own, more diverse training data set. When the algorithm was fed images it hadn’t seen before, it classified them with 95 percent accuracy, and it showed no gender bias.
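Evaluating a classifier on held-out images with known labels can be sketched as follows; this is a generic accuracy and per-gender error check, not Pew’s actual code, and the sample data is invented:

```python
# Held-out evaluation: compare predicted labels against known labels,
# reporting overall accuracy plus per-gender accuracy — a simple check
# for gender bias in the classifier's error rate.
def accuracy(pairs):
    """pairs: list of (true_label, predicted_label) tuples."""
    correct = sum(1 for true, pred in pairs if true == pred)
    return correct / len(pairs)

held_out = [
    ("woman", "woman"), ("woman", "woman"), ("woman", "man"),
    ("man", "man"), ("man", "man"), ("man", "woman"),
]

overall = accuracy(held_out)
by_gender = {
    g: accuracy([p for p in held_out if p[0] == g])
    for g in ("woman", "man")
}
print(round(overall, 2))  # 0.67
print(by_gender)          # equal error rates for both genders here
```

An unbiased classifier is one where the per-gender accuracies are close to each other, as in this toy example.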