Facebook Hires 3,000 To Spot Self-Harm Videos
Facebook is to hire 3,000 additional moderators to help detect hate speech, child exploitation and self-harm being broadcast on the social network.
Chief executive Mark Zuckerberg said it had been “heartbreaking” to see people “hurting themselves and others” in videos streamed live on Facebook.
He added that the company would make it easier to report problematic videos.
The move follows cases of murder and suicide being broadcast live on the social network.
In April, a man was killed in a video streamed live on Facebook. Later in the same month, a Thai man killed his baby daughter and then himself in a live stream.
Mr Zuckerberg said the additional staff, joining the 4,500 existing moderators, would help the company respond more quickly when content was reported.
In a post on his Facebook profile, he said the company would develop new tools to manage the millions of content reports it received every week.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” he said.
The post suggested Facebook’s moderators would contact law enforcement, rather than reaching out directly to members at risk of harm.
“Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself,” said Mr Zuckerberg.