This article exists as part of the online archive for HuffPost Australia, which closed in 2021.

Queer Groups Condemn Study Claiming Computers Can Tell If You're Gay From Photos

Queer Groups Condemn Study Claiming Computers Can Tell If You're Gay From Photos
A new study used “deep neural networks to extract features” from more than 35,000 facial images that appeared to determine a person's sexuality with remarkable accuracy.
Petrovich9 via Getty Images

LGBTQ advocacy groups have denounced a “dangerous and flawed” Stanford University study that claims to have used artificial intelligence to determine a person’s sexuality from facial images with remarkable accuracy.

The study, which is in draft form but has been accepted by the Journal of Personality and Social Psychology, reportedly used “deep neural networks to extract features” from more than 35,000 facial images that men and women had posted on an unidentified dating website.

Researchers Michal Kosinski and Yilun Wang said they used an algorithm that could correctly identify gay men 81 percent of the time, The Economist reported Friday. Similarly, they claimed the tool was accurate for 74 percent of the women it tested.

It’s unclear what benefits, if any, the research would provide. Instead, Kosinski and Wang said their work was a “preventative measure” that highlights the safety risks many LGBTQ people could face if such technology becomes widely available. In keeping with that mindset, they chose not to disclose the dating website used in their research in an effort to discourage copycats.

“Tech companies and government agencies are well aware of the potential of computer vision algorithm tools,” the men wrote. “In some cases, losing the privacy of one’s sexual orientation can be life-threatening. The members of the LGBTQ community still suffer physical and psychological abuse at the hands of governments, neighbors, and even their own families.”

The research, which was first revealed by The Economist on Friday, made global headlines, with Newsweek, The Guardian, MIT Technology Review and other publications each running their own takes. Officials at GLAAD and the Human Rights Campaign (HRC) quickly condemned the research, noting that the study relied on “myriad flaws” and “had not been peer reviewed.”

The report, these officials noted, took online information at face value, and also hinged on the assumption that a person’s sexuality does not differ from the sexual activity they choose to engage in. They also pointed out that the study did not look at any non-white individuals.

Jim Halloran, GLAAD’s Chief Digital Officer, said the technology used in Kosinski and Wang’s study had simply shown that “a small subset of out white gay and lesbian people on dating sites” look similar.

“This research isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites,” he said. “At a time where minority groups are being targeted, these reckless findings could serve as a weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous.”

Equally troubled by the media’s attention to the report was Ashland Johnson, the HRC’s Director of Public Education and Research. The “dangerously bad” research was “likely to be taken out of context,” and could potentially “threaten the safety and privacy of LGBTQ and non-LGBTQ people alike,” he said.

“Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime’s efforts to identify and/or persecute people they believed to be gay,” he said. “Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world ― and this case, millions of people’s lives ― worse and less safe than before.”

Read more on the study here.

Catch the latest in LGBTQ news by subscribing to the Queer Voices newsletter.
