Facebook to hire thousands following criticism of controversial content
The decision follows the recent live broadcast by users of a murder and a suicide, while critics have previously cited the appearance of terror-related propaganda on the platform as a troubling phenomenon.
04 May, 2017
The social media giant is set to hire 3,000 additional staff [AFP]
The social networking giant Facebook is set to hire 3,000 people around the world to monitor video content and posts uploaded to the popular website for violent or criminal acts.

The decision follows criticism of the company after a murder and a suicide were recently broadcast live by users on the site.

The new employees will join the 4,500 people already working on Facebook's content moderation team.

"If we're going to build a safe community, we need to respond quickly," Facebook CEO and founder Mark Zuckerberg wrote in a Facebook post on Wednesday.

"We're working to make these videos easier to report so we can take the right action sooner - whether that's responding quickly when someone needs help or taking a post down."

The controversy over Facebook Live

In 2016, Facebook made its live-streaming feature, Facebook Live, available to the platform's nearly two billion users.

Recordings of a number of controversial and violent acts involving US police, including the aftermaths of the deaths of Philando Castile and Keith Lamont Scott, were broadcast through the app.

A disturbing phenomenon of people live-streaming suicide attempts through the app has also emerged.

Previously, critics have also called on Facebook to establish better measures to prevent the propagation of terrorist content on its website.

Last year Facebook announced that it was teaming up with Twitter, YouTube, and Microsoft in order to create a database of digital "fingerprints" designed to identify violent terrorist imagery or recruitment videos.

Preventing terror-related propaganda

The site has hosted more than videos of violent crimes and executions carried out by the Islamic State group and other terror organisations: in May 2016, a number of media organisations, including The Independent, reported that Islamic State fighters appeared to be attempting to sell sex slaves on Facebook.

"There is no place for content that promotes terrorism on our hosted consumer services," said a joint statement posted to Facebook's online newsroom, following the announcement in June 2016.

"We hope this collaboration will lead to greater efficiency as we continue to enforce our policies to help curb the pressing global issue of terrorist online content."

However, Facebook continues to face criticism that propaganda promoted by Islamic State group supporters and other terrorist organisations is still making its way onto the site.

Notably, in the aftermath of the March 2017 lone-wolf attack targeting the Houses of Parliament in London's Westminster, posts praising perpetrator Khalid Masood's actions appeared on the social networking site.

Facebook has also faced criticism this year from British MPs, who have alleged the presence of "illegal and dangerous content", including neo-Nazi, white supremacist, and Islamophobic propaganda, on its website.