
UK Police's Porn-Spotting AI Keeps Mistaking Desert Pics for Nudes

An anonymous reader quotes Gizmodo: London's Metropolitan Police believes that its artificial intelligence software will be up to the task of detecting images of child abuse in the next "two to three years." But, in its current state, the system can't tell the difference between a photo of a desert and a photo of a naked body... "Sometimes it comes up with a desert and it thinks it's an indecent image or pornography," Mark Stokes, the department's head of digital and electronics forensics, recently told The Telegraph. "For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour." The article concludes that the London police software "has yet to prove that it can successfully differentiate the human body from arid landscapes."
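The failure mode described here is easy to reproduce with a naive skin-colour heuristic, since sand and skin occupy overlapping colour ranges. Below is a minimal Python sketch illustrating that overlap; the colour bounds, threshold, and function names are hypothetical assumptions for demonstration only, not the Met's actual system.

```python
# Illustrative sketch only: a naive skin-colour heuristic of the kind that
# misfires on desert photos. The colour bounds and threshold here are
# made-up assumptions, not the Metropolitan Police's actual method.
from PIL import Image
import numpy as np

def skin_pixel_ratio(path: str) -> float:
    """Return the fraction of pixels whose RGB values fall in a rough
    'skin tone' range. Sand and skin overlap heavily in this range,
    which is why such heuristics flag desert screen-savers."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Very rough skin-tone rule: reddish-tan pixels with R > G > B.
    skin = (r > 95) & (g > 40) & (b > 20) & (r > g) & (g > b) & ((r - b) > 15)
    return float(skin.mean())

def looks_like_nudity(path: str, threshold: float = 0.4) -> bool:
    # A desert photo is mostly sand-coloured, so it can easily clear
    # a "lots of skin-coloured pixels" threshold like this one.
    return skin_pixel_ratio(path) > threshold
```

Run against a typical desert wallpaper, a heuristic like this returns a high skin-pixel ratio and a false positive, which is the behaviour the article describes.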

Read more of this story at Slashdot.



