In a bid to stop content that promotes crime and violence from running unchecked on its platform, Facebook is adding 3,000 new members to its operations team. The new hires will screen posts and videos that violate Facebook’s policies.
The new members are in addition to the 4,500-strong team already working on the issue. While we do not know whether these new employees are full-time staff or contractors, they will be a significant addition to the workforce holding the problem in check.
Speaking on the topic, Facebook CEO Mark Zuckerberg said:
If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.
This also suggests that Facebook is looking down multiple roads in its attempt to curb such content. The past couple of months have been spent investing in technology and making it easier for people to report anything they see that is unhealthy for the community. For instance, it is now easier for people to seek help if they fear someone has suicidal tendencies, or even if they themselves are afflicted.
However, despite all this, there have been instances where a large gap passed between the time a post was reported and the time it was taken down. Clearly, the company has yet to perfect the algorithms it has put on the job, and that is why it is bringing more manpower to bear.
Zuckerberg added:

We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.