JD Schramm is the MBA Class of 1978 Lecturer in Organizational Behavior at Stanford’s Graduate School of Business.

Computers are more accurate than humans at visually detecting sexual orientation, according to an article in this month’s Journal of Personality and Social Psychology. That’s not really news to me — I’ve always been bad with “gaydar.” I was once on a third date with a guy before I knew he was gay. Okay, I did not realize our first two coffees were actually dates in his mind, and the friends who introduced us failed to tell me it was a romantic setup. But the point’s the same: I’ve not been gifted with strong gaydar and would welcome any help I can get.

But the implications of this research are much more serious than my romantic musings. If this technology is capable of doing what my Stanford colleagues Michal Kosinski and Yilun Wang warn it can, then in the wrong hands it could pose a very real threat not just to the privacy but also to the safety of the gay community.


Forty years ago, gay rights pioneer Harvey Milk called for every gay person to come out; this is much easier today, with same-sex marriage (many of us now call it just “marriage”), open service in the military, welcoming churches, and affinity groups at hundreds of thousands of employers. Even so, many of my LGBTQ peers choose to remain hidden; in fact, the Human Rights Campaign estimates the percentage of closeted LGBTQ workers at 53 percent.

This may soon change — and possibly without the consent, or even the knowledge, of gays and lesbians. Researchers Kosinski and Wang report that advances in facial recognition programs make it possible for a computer to distinguish between photographs of gay and straight men in 81 percent of cases and between straight women and lesbians in 71 percent. When the program was presented with five images of the same person, its accuracy increased to 91 percent and 83 percent, respectively. By comparison, human judges were right only 61 percent of the time for men and 54 percent for women.

When Kosinski and Wang initially offered this warning last fall, their research and conclusions were fiercely criticized by LGBTQ organizations that I respect. I watched the debates online with careful attention and began to engage with the frightening implications of their findings.


The explosive growth of public and private surveillance camera installations over the past several years means that our image is captured all the time, often without our knowledge. In recent months, we’ve learned of breaches of personal data at the major credit reporting agencies, Yahoo and our own university. So much of our lives exists in data files held on servers more vulnerable than we like to believe. It should come as no shock that others could piece together anyone’s sexual identity (or a compelling case for one) from a variety of sources.

Kosinski and Wang’s research looked only at gays and lesbians. But I fear the greater risk may be to the most vulnerable in our community: the transgender individuals who may well carry identifiable facial features from the gender they were assigned at birth — and who are already at high risk for hate crimes.

To be sure, there are clear limitations to the research. Kosinski openly admits that the sample is small and leans heavily toward white men on dating websites, that the study does not delve into other dimensions of sexuality such as bisexuality, and that it is a first effort at this sort of analysis. However, that does not mean the work should be ignored, as some have suggested: Acting as if the information does not exist does not diminish its threat.


So assuming the worst, what can be done? I cannot imagine what level of technological intervention or legislation could prevent the type of “cyber-outing” the researchers predict. Rather, I think our best response is to reach back to Milk’s challenge to the gay community — and to try to make the world more accepting of people who choose to come out. Let’s embrace the important work of minimizing or even eliminating the risks of being outed. If just over half of LGBTQ individuals are unwilling to come out at work, there’s clearly room for all of us to do more than we are.

Now more than ever, we need to further the crucial work of establishing fair and equitable housing, job opportunities and protection from discrimination. We must fight against all forms of injustice facing those who are marginalized. The advances in AI and machine learning make it increasingly difficult to hide such intimate traits as sexual orientation, political and religious affiliations, and even intelligence level. The post-privacy future Kosinski examines in his research is upon us. Never has the work of eliminating discrimination been so urgent.
