How Artificial Intelligence And Big Data Can Be Used To Out Queer People

Updated on May 28, 2018

Queer people have found refuge in safe online spaces since the early days of the internet.

Invaluable support and often life-saving advice have long been available even to those in the most conservative and isolated parts of the world, thanks to strong online LGBTQ communities. But as the pace of technological progress accelerates, some are poised to capitalize on the worst aspects of that innovation.

Rudimentary “gaydar” devices are nothing new. A 2011 Android app asked twenty stereotypical questions, like “Does your son like to dress well and pay close attention to his outfits and brands?”, to guess sexual orientation.

But tools being developed now are much more sophisticated and simply work better.

From a French spyware company offering a product that can “find out if your son is gay” to researchers at Stanford University developing an artificial intelligence system said to identify a person’s sexuality with high accuracy from a single photo, outing by machine is far from an abstract fear.

Stanford’s Michal Kosinski and Yilun Wang authored a controversial study titled “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images” that is set to be published in the Journal of Personality and Social Psychology. It claims that when a computer algorithm was shown a pair of images, one of a gay person and one of a straight person, it correctly identified which subject was gay 91 percent of the time for men and 83 percent of the time for women.
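To make those figures concrete: the study’s metric is pairwise. The algorithm was handed two photos, one known to show a gay person and one a straight person, and only had to say which was which. Below is a minimal sketch of that style of evaluation, using entirely made-up classifier scores; nothing in it comes from the study itself.

```python
import random

def pairwise_accuracy(scores_a, scores_b):
    """Fraction of (a, b) pairs where group a's photo scores higher.
    This is the pairwise metric behind the 91 percent claim; the
    scores here are hypothetical, not the study's."""
    correct = sum(1 for a in scores_a for b in scores_b if a > b)
    return correct / (len(scores_a) * len(scores_b))

# Toy scores: a coin-flip model would land near 0.5, not 0.91.
random.seed(0)
group_a = [random.gauss(0.7, 0.2) for _ in range(200)]
group_b = [random.gauss(0.3, 0.2) for _ in range(200)]
print(f"pairwise accuracy: {pairwise_accuracy(group_a, group_b):.2f}")
```

Crucially, this pairwise setup is far easier than labeling a single photo drawn from the general population, where most people are straight; that gap between the headline number and real-world use is one of the critics’ central objections.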

LGBTQ advocacy groups GLAAD and HRC quickly denounced the study, saying that artificial intelligence technology that can reliably identify a person’s sexual orientation has yet to be seen. The study has a number of limitations, some of which its authors acknowledge as legitimate, including a sample restricted to white people who self-reported their sexuality, but Kosinski and Wang stand by their finding that facial recognition technology can reveal sexual orientation.

Among those critics is Jim Halloran, Chief Digital Officer at GLAAD.

“The misleading claims by some researchers are not only sensationalized, they are potentially harmful to LGBTQ people, especially LGBTQ people who aren’t out,” Halloran tells INTO, “or live in places where their identities are criminalized, and who rely on technology as a lifeline.”

Halloran believes that artificial intelligence solutions that expose LGBTQ communities to harm shouldn’t be at the top of the research agenda.

“Why not use the amazing power of technology to solve real-world problems like homophobia and transphobia? We encourage the institutions supporting this type of research to revisit their ethical standards,” he adds.

It should be noted that Kosinski and Wang didn’t develop any new technology or data manipulation methods; instead, they used widely available off-the-shelf tools and publicly available data. They contend that policymakers and LGBTQ communities alike should be aware of the risks posed by computer vision algorithms and have the opportunity to take preventive measures.
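That low barrier to entry is the point. The recipe their warning rests on is simply precomputed features from a pretrained model fed into a stock classifier. Here is a rough sketch of that pattern, with synthetic stand-ins for the face embeddings and labels; it shows the general shape, not Kosinski and Wang’s actual pipeline.

```python
# Illustrative only: the "off-the-shelf tools, public data" pattern.
# The embeddings and labels are random stand-ins, not real data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Pretend these vectors came from a pretrained face-recognition model
# applied to publicly scraped profile photos.
n_people, dim = 1000, 128
embeddings = rng.normal(size=(n_people, dim))
labels = rng.integers(0, 2, size=n_people)  # synthetic self-reported labels

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.2, random_state=0
)

# No new technology required: a stock linear classifier on stock features.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")  # near 0.5 on noise
```

Anyone with modest programming skill could assemble something like this, which is exactly the risk the authors say policymakers should prepare for.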

The near-future consequences of these technologies are easy to imagine, and horrifying all the same.

What if Chechnya’s notoriously anti-gay government decides to create its own “gaydar” to find and punish LGBTQ people? Or employers begin to screen out LGBTQ candidates from jobs?

“Given the unwarranted trust we place in technology I can imagine governments in places like Chechnya or North Carolina trying to use AI to detect gay people,” says Greggor Mattson, an associate professor of sociology at Oberlin College.

These eventualities are becoming increasingly possible. Kuwait’s Director of Public Health Yousuf Mindkar said in 2013: “We will take stricter measures that will help us detect gays who will then be barred from entering Kuwait or any of the Gulf Cooperation Countries member states.”

Raids targeting LGBTQ people living in Kuwait are nothing new, with a recent moral crackdown on homosexuality resulting in 76 men being deported.

Although the idea of enhanced checks to detect gay people was later abandoned, due in part to a backlash from human rights groups that called for the cancellation of the 2022 World Cup in Qatar, the fact that a country could even consider introducing such a test shows how dangerous a workable “gaydar” would be.

“Existing research shows that the technology can effectively predict self-reported sexual orientation. I wouldn’t be surprised if the technology got better at predicting over time,” says Jenny Davis, Lecturer at The Australian National University School of Sociology.

However, Davis argues that the idea of predicting sexual orientation rests on the faulty premise that sexuality is constant and rigid.

“Sexuality researchers have known for a long time that sexuality is fluid and that sexuality changes within individuals over the life course,” she tells INTO. “Those who have built ‘detection’ technologies tout them as a way of capturing someone’s essence through technology.”

“Instead, the technology merely reads someone’s momentary performance,” she adds.

Unscrupulous regimes are unlikely to care about the intricacies of human sexuality and can be expected to use these technologies to out any non-heterosexual or non-heteronormative person. But there are even more damaging methods already being used to detect sexuality.

According to a 2016 Human Rights Watch report, at least eight countries still perform extremely invasive forced anal examinations on men suspected of being gay. These so-called “gay tests” usually involve doctors inserting their fingers or other objects into a suspect’s anus, despite the science behind them being long discredited.

Irrespective of the actual accuracy of the technology, the end result of any “gaydar” solution will be the unwarranted outing of at-risk LGBTQ people. “Regardless of the technologies’ flaws, a mechanism that assigns labels can and will be used as an objective measure. This is problematic at its base, but especially troubling when applied by authoritarian regimes,” says Davis.

LGBTQ rights groups need to be aware that there is an appetite for artificial intelligence-based outing methods, no matter the moral objections to this technology. Kosinski and Wang’s findings may be incomplete, but they are a clear sign that artificially intelligent “gaydar” is no longer confined to science fiction.

Mattson, who argued in a recent blog post that Kosinski and Wang’s “stereotypes about men and women profoundly taint their research, as does their confusion of sex, gender, sexuality, and cultural practices,” makes the point that some members of the LGBTQ community are already disproportionately victimized for their sexual expression, without the need for AI technologies.

“We don’t need technology to detect those of us who are out and proud or who are gender non-binary,” Mattson says. “People who don’t fit in are already targeted, and technology is not going to make that worse: bigotry does, and there’s no technological solution for prejudice.”
