In the wake of last month's gruesome homicide, in which a man dubbed the "Facebook Live killer" selected a random stranger to kill and posted graphic video of the act on Facebook, the social media giant has announced plans to prevent a recurrence.
Facebook co-founder and CEO Mark Zuckerberg wrote on Facebook Wednesday that the company plans to hire 3,000 more content moderators for its operations team to perform tasks such as monitoring live videos and quickly responding to user issues. The company currently has 4,500 people working in that role.
"Over the last few weeks, we've seen people hurting themselves and others on Facebook — either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," Zuckerberg said.
"If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner — whether that's responding quickly when someone needs help or taking a post down," he wrote. "Over the next year, we'll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly."
Zuckerberg also noted that Facebook plans to make it simpler for users to report issues, so that the entire process, from flagging a problem to having it reviewed, is quick and seamless:
In addition to investing in more people, we're also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.
Zuckerberg also announced Facebook's intention to more efficiently curb what he called "hate speech," along with other violations of its terms of service.
Facebook recently added a suicide prevention tool to its live video feature after a 14-year-old girl in Miami live-streamed her suicide in January. The tool lets users report a live broadcaster who appears to be at risk of suicide.