
A computer tells your government you’re 91% gay. Now what?

A fascinating and horrifying new AI algorithm is able to predict your sexual orientation with 91% accuracy from five photographs of your face. According to the researchers, the human brain isn’t wired to read this signal from a face, yet the results show that the signal is there, and an AI can detect it. This raises a bigger issue: who will have access to AI in the future, and what will they use it for?

The article in The Guardian is fascinating and scary. It describes new research that is able to predict with 91% accuracy whether a man is homosexual, based on just five photographs of his face. Similarly, it achieves an 83% accuracy in predicting homosexuality in women. This makes the AI leaps and bounds better than its human counterparts, who got the answers 61% and 54% correct, respectively: more or less a coin toss, useless as a measure. The researchers describe how the human brain apparently isn’t wired to detect signs that are demonstrably present in an individual’s face.

Normally, this would just be a curiosity, akin to “computer is able to detect subject’s eye color using camera equipment”. But this particular detection has very special, and severe, repercussions. In too many countries, all of which we consider underdeveloped, this particular eye color, this sexual orientation, happens to be illegal. If you were born this way, you’re a criminal. Yes, it’s ridiculous. They don’t care. The punishments go all the way up to the death penalty.

So what happens when a misanthropic ruler finds this AI and decides to run it against the passport and driver’s license photo databases?

What happens when the bureaucracy in such a country decides you’re 91% gay, based on an unaccountable machine, regardless of what you think?

This highlights a problem much bigger than the AIs themselves: what happens when despotic governments get access to superintelligence. It was discussed briefly on Twitter the other day, in a completely different context:

“Too many worry what Artificial Intelligence — as some independent entity — will do to humankind. Too few worry what people in power will do with Artificial Intelligence.”

Now, a 91% indicator is not enough to convict somebody of this “crime” in any justice system meeting a reasonable standard of proof, especially once you account for false positives across an entire population (see the sketch below). But it doesn’t have to be a reasonable standard.
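To see why, run the numbers. Here is a minimal back-of-the-envelope Bayes sketch, with assumptions that are mine, not the study’s: treat the 91% figure as both the per-person true-positive and true-negative rate (more generous than what the research actually measured) and assume, for illustration, that 5% of men are gay.

```python
# Back-of-the-envelope Bayes calculation: what does a "91% accurate"
# flag mean when run against an entire population?
# Assumptions (illustrative, not from the study): 91% true-positive
# rate, 91% true-negative rate, and a 5% base rate of gay men.

base_rate = 0.05            # assumed share of gay men in the population
true_positive_rate = 0.91   # assumed P(flagged | gay)
false_positive_rate = 0.09  # assumed P(flagged | straight) = 1 - 0.91

# Total probability of being flagged:
# P(flagged) = P(flagged|gay)P(gay) + P(flagged|straight)P(straight)
p_flagged = (true_positive_rate * base_rate
             + false_positive_rate * (1 - base_rate))

# Bayes' theorem: probability a flagged person is actually gay
p_gay_given_flagged = true_positive_rate * base_rate / p_flagged

print(f"P(flagged)       = {p_flagged:.3f}")            # ~0.131
print(f"P(gay | flagged) = {p_gay_given_flagged:.3f}")  # ~0.347
```

Under these assumptions, roughly two out of three people the machine flags would be straight. A regime that acts on the flag anyway does not care, and that is exactly the problem.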

If you want an idea of what could happen, well within the realm of horrifying possibility, consider the McCarthy era in the United States, where anybody remotely suspected of being a communist was shut out from society: denied jobs, denied housing, denied a social context.

What would have happened if a computer of the time, based on some similar inexplicable magic, decided that a small number of people were 91% likely to be communist?

They would not have gotten housing, they would not have gotten jobs, and they would have lost many if not all of their friends. All because some machine determined that they possibly, maybe, maybe not, probably (according to the machine’s builders) belonged to a risk group of the time.

We need to start talking about what governments are allowed to do with data like this.

Sadly, the governments which need such a discussion the most are also the governments which will allow and heed such a discussion the least.

Privacy really remains your own responsibility.
