Psychologist Claims to Have Invented Gaydar Using AI

Michal Kosinski is a psychologist who works in the field of psychographic profiling. Basically, he builds profiles of types of people and identifies which profile a given person likely fits based on certain traits. Psychographics are used in everything from designing board games to targeting political advertising, and Kosinski's work is essentially what Cambridge Analytica used to target Facebook ads during the 2016 presidential election.

Now, Kosinski has applied his psychographic expertise to facial recognition, and he says that AI can now determine whether you're gay or straight simply by looking at you. The abstract of the paper he published explains how it works, and it's genuinely fascinating.

We show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain. We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style). Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles. Prediction models aimed at gender alone allowed for detecting gay males with 57% accuracy and gay females with 58% accuracy. Those findings advance our understanding of the origins of sexual orientation and the limits of human perception. Additionally, given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women.

See, my guess would have been it simply looked at whether or not someone had a moustache without a beard, because that’s how I do it.
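If you squint past the numbers, the pipeline the abstract describes is pretty simple: a pretrained deep network turns each photo into a feature vector, and a plain logistic regression is trained on those vectors. Here's a minimal sketch of that shape in Python; the embeddings and labels are random stand-ins rather than anyone's data (the paper reportedly used features from a face-recognition network), so this shows the plumbing, not the result.

```python
# Minimal sketch of the abstract's pipeline: deep-net face embeddings
# fed into a logistic regression. Embeddings and labels are random
# stand-ins; a real run would use features from a face network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_images, embedding_dim = 1000, 512             # dimensions assumed
X = rng.normal(size=(n_images, embedding_dim))  # stand-in CNN features
y = rng.integers(0, 2, size=n_images)           # stand-in binary labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")  # ~0.50 on noise

# The "five images per person" boost amounts to averaging the model's
# probabilities across a person's photos before thresholding.
avg_score = clf.predict_proba(X_test[:5])[:, 1].mean()
print(f"score averaged over five images: {avg_score:.2f}")
```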

So this sounds a bit like phrenology or physiognomy, early pseudoscientific fields that purported to read someone's personality from skull shape, brow ridge, and the like. In an interview with The Guardian, Kosinski dismissed those fields as "racism disguised as science," but said that links between the face and personality may exist, and that machine learning, unlike biased human judges, can find them. I'm skeptical of that, though a 91% success rate isn't bad.
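One caveat worth spelling out (mine, not the paper's): accuracy on a balanced test set says little about how the tool would behave pointed at a real population, where gay people are a minority. A back-of-the-envelope check, generously reading that 91% as per-person sensitivity and specificity and assuming roughly 7% prevalence:

```python
# Base-rate check under stated assumptions: treat "91%" as both the
# true-positive and true-negative rate per person, with 7% prevalence.
prevalence = 0.07    # assumed share of the population
sensitivity = 0.91   # P(flagged | gay), assumed
specificity = 0.91   # P(not flagged | straight), assumed

true_pos = sensitivity * prevalence
false_pos = (1 - specificity) * (1 - prevalence)
precision = true_pos / (true_pos + false_pos)
print(f"share of 'gay' flags that are correct: {precision:.0%}")  # ~43%
```

In other words, even at the paper's headline number, most of the people such a system flagged would be straight.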

Kosinski said that creating an AI gaydar was never his goal, but that he wanted to warn people it might be possible and that a bad actor could already be developing something similar: "This is the inherent paradox of warning people against potentially dangerous technology. I stumbled upon those results, and I was actually close to putting them in a drawer and not publishing – because I had a very good life without this paper being out. But then a colleague asked me if I would be able to look myself in the mirror if, one day, a company or a government deployed a similar technique to hurt people."

Of course, there’s also the possibility that the AI isn’t using your face to determine whether you’re straight or gay at all. Kosinski’s critics say that the computer could be analyzing any number of non-inherent factors in the photographs to make a determination.

One of the most vocal critics is Princeton professor Alexander Todorov, who has conducted some of the most widely cited research into faces and psychology. He argues that Kosinski's methods are deeply flawed: the patterns picked up by an algorithm comparing thousands of photographs may have little to do with facial characteristics. In a mocking critique posted online, Todorov and two AI researchers at Google argued that Kosinski's algorithm could have been responding to patterns in people's makeup, beards, or glasses, or even the angle at which they held the camera. Self-posted photos on dating websites, Todorov points out, project a number of non-facial clues.
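Todorov's confound worry is easy to demonstrate in miniature. In the toy simulation below (my sketch, not his analysis), the "face" features are pure noise and one superficial cue, think glasses or camera angle, happens to track the label 80% of the time; the classifier scores well anyway by leaning entirely on that cue.

```python
# Toy demo of a confound: the "face" features carry no signal, but one
# superficial cue (glasses, camera angle, ...) agrees with the label
# ~80% of the time. Accuracy looks good without reading faces at all.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
y = rng.integers(0, 2, size=n)

confound = (y ^ (rng.random(n) < 0.2)).astype(float)  # flips y 20% of the time
noise = rng.normal(size=(n, 50))                      # uninformative "face"
X = np.column_stack([confound, noise])

clf = LogisticRegression(max_iter=1000).fit(X[:1500], y[:1500])
print(f"held-out accuracy: {clf.score(X[1500:], y[1500:]):.2f}")  # ~0.80
print(f"weight on the confound: {clf.coef_[0][0]:.2f}")  # dwarfs the noise
```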

So maybe your face isn't saying as much about your sexual orientation as the way you take the photo of it is. Or maybe the fact that you posted it on Grindr is a good indication that you're gay. I'm not sure I believe Kosinski's findings; Todorov's explanations are very persuasive. But it's nice to know we can use AI for things other than putting Maisie Williams's face into porn movies.
