
‘There is no standard’: investigation finds AI algorithms objectify women’s bodies

Guardian exclusive: AI tools rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved

This story was produced in partnership with the Pulitzer Center’s AI Accountability Network

Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women’s bodies.

These AI tools, developed by large technology companies, including Google and Microsoft, are meant to protect users by identifying violent or pornographic visuals so that social media companies can block such content before anyone sees it. The companies claim that their AI tools can also detect “raciness”, or how sexually suggestive an image is. With this classification, platforms – including Instagram and LinkedIn – may suppress contentious imagery.
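The article does not name a specific endpoint, but Google Cloud Vision’s SafeSearch feature is one publicly documented example of the kind of classifier described: it returns a likelihood rating for several categories, including “racy”. A minimal sketch of querying it, assuming the google-cloud-vision Python client, valid application credentials, and an illustrative image path:

```python
# A minimal sketch: ask Google Cloud Vision's SafeSearch detection
# how "racy" it considers an image. Requires the google-cloud-vision
# package and Google Cloud credentials; "photo.jpg" is illustrative.
from google.cloud import vision


def rate_raciness(path: str) -> str:
    """Return the 'racy' likelihood label Vision assigns to an image."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.safe_search_detection(image=image)
    annotation = response.safe_search_annotation
    # The rating is an enum, not a raw score: VERY_UNLIKELY, UNLIKELY,
    # POSSIBLE, LIKELY, or VERY_LIKELY (plus UNKNOWN).
    return vision.Likelihood(annotation.racy).name


print(rate_raciness("photo.jpg"))
```

A platform relying on such a rating would typically pick a threshold (say, LIKELY or above) at which to downrank or hide an image, which is where the bias the investigation describes would take effect.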



