
Facebook Is Testing A Downvote System For Flagging Comments

Facebook is currently testing a feature that allows users to "downvote" comments. In a statement to TechCrunch, Facebook said the downvote button is being offered on a limited number of public Page post comments and is visible only to a small set of people in the U.S. The test is currently running for just 5% of Android users with English set as their default language, and it will not appear on Group posts or on the Pages of public figures.


Facebook said the purpose of the downvote button is to make it easier for users to signal that a comment is “inappropriate, uncivil, or misleading.” Facebook also pointed out that this is not a “dislike” button; it is simply a way to collect responses and feedback about comments on public Pages.

After tapping the downvote button, users are asked whether the comment is “Offensive,” “Misleading” or “Off Topic.” The test is short-term and will not affect the ranking of the comment, the post or the Page. The feature does not publicly display how many downvotes a comment receives; it is just a way for Facebook to collect feedback internally.

Facebook already offers a “Hide” button in its comments system for every user, but it is not as intuitive as a downvote button. Facebook CEO Mark Zuckerberg has said in the past that the company does not intend to add a system where people can vote posts up or down; instead, the company launched a set of emoji called “Reactions” in February 2016. The Reactions set includes Love, Wow, Haha, Sad and Angry.

Back in December, Facebook also announced it was implementing two changes to fight fake news. The company said it would no longer use “Disputed Flags” to identify fake news; Related Articles are displayed instead to give people more context. Facebook also said it was starting a new initiative to better understand how people decide whether information is accurate based on the news sources they follow. Facebook removed the Disputed Flags because academic research on correcting misinformation found that strong imagery, such as red flags, can actually entrench deeply held beliefs, the opposite of the intended effect. By studying the feedback received from downvotes, Facebook should be able to better understand what sort of content resonates with its users as it combats fake news.

