To say that Facebook has some egg on its face right now would be an understatement. The social network not only failed to take down some sexualized images of children, but also reported the BBC when the broadcaster drew those images to its attention. However, the company now says it has turned a corner. Facebook’s Simon Milner tells the UK’s Home Affairs Committee that the incident showed the company’s moderation system “was not working.” The offending photos have since been taken down, he says, adding that the process should now be fixed.
It’s not clear just what that fix entails, or just how much of an improvement Facebook has made. The internet giant has been accused of simultaneously under- and overreacting to content issues, either by leaving material up despite known abuse or by taking down material that isn’t at all controversial. While it would be difficult or impossible for Facebook to catch absolutely every violation, it’ll have to show it can make incidents like the BBC investigation a thing of the past.
If there’s any consolation for Facebook, it’s that it isn’t the only one on the hot seat. Also grilled were Google (specifically, YouTube) and Twitter over their own troubles fighting online hate speech. Both admitted that they had to do more to keep hate off their services. Google wasn’t specific about its solutions, but Twitter acknowledged that it had to be more communicative when users file abuse reports. These kinds of issues are relatively common among internet giants, in other words — it’s just that Facebook’s latest crisis was more embarrassing than most.
The post Facebook Admits and blames “moderation system” over sexualized images of children appeared first on Innovation Village.