An anonymous reader quotes the New York Times: Riots and lynchings around the world have been linked to misinformation and hate speech on Facebook, which pushes whatever content keeps users on the site longest -- a potentially damaging practice in countries with weak institutions and histories of social instability. Time and again, communal hatreds overrun the newsfeed unchecked as local media are displaced by Facebook and governments find themselves with little leverage over the company. Some users, energized by hate speech and misinformation, plot real-world attacks.

A reconstruction of Sri Lanka's descent into violence, based on interviews with officials, victims and ordinary users caught up in online anger, found that Facebook's newsfeed played a central role in nearly every step from rumor to killing. Sri Lankan officials say Facebook ignored repeated warnings of the potential for violence, resisting pressure to hire moderators or establish emergency points of contact... Sri Lankans say they see little evidence of change. And in other countries, as Facebook expands, analysts and activists worry that those countries, too, may see violence.

A Facebook spokeswoman countered that "we remove such content as soon as we're made aware of it," and said the company is now trying to expand its moderation teams and investing in "technology and local language expertise to help us swiftly remove hate content." But one anti-hate group told the Times that Facebook's reporting tools are too slow and ineffective. "Though they and government officials had repeatedly asked Facebook to establish direct lines, the company had insisted this tool would be sufficient, they said. But nearly every report got the same response: the content did not violate Facebook's standards."
Read more of this story at Slashdot.