
Facebook Is Hiring 3,000 More Employees To Monitor The Network For Violent Videos

Image: Facebook Live (source: True Pundit)

It has only been a year since Facebook Live launched. The feature allows live video streaming from anywhere in the world, and while it has proven popular, the flood of violent content on live streams has raised serious questions. According to the Wall Street Journal, at least 50 criminal incidents have been broadcast live on Facebook in just one year.

In April, a father in Thailand streamed the murder of his child on Facebook Live, and the killing of a pensioner in Cleveland, Ohio, was also broadcast on the platform. The NSPCC (National Society for the Prevention of Cruelty to Children) has released a study whose results suggest that social media companies have a responsibility to keep such content away from children. Facebook CEO Mark Zuckerberg addressed the incidents in a blog post, saying:

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook – either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community. If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”

Individually monitoring every post on a platform with billions of users is as difficult as it sounds. The company already employs 4,500 people to review reports submitted by users, which can number in the millions in a single week. These reviewers examine every post that has been reported for violating the terms of service. Facebook is now adding 3,000 more employees to this team to monitor all content, including live videos. The workers will ensure that any content depicting murder, assault, suicide, hate speech, or child exploitation is taken down as quickly as possible.

Dr. Sarah Roberts, assistant professor of information studies at the University of California, Los Angeles, says that the employees responsible for reviewing distressing content work under tough conditions that could lead to post-traumatic stress disorder (PTSD).

Zuckerberg has repeatedly said that Facebook will employ artificial intelligence to scan the platform for pornography, violent content, and other material that violates the terms of service. He concedes, however, that it will be many more years before AI can detect violations reliably, so expanding the content moderation staff is an interim solution.

Facebook is not the only platform under criticism for publishing user content without sufficient moderation. Michigan State University surveyed 14,000 internet users with a focus on fake news, and the results suggest that users were still able to access quality information from numerous sources. Professor William Dutton, an internet studies expert at the university, said, “These findings should caution governments, business, and the public from over-reacting to alarmist panics.”

Whatever the case, no one deserves the stress of watching murders, suicides, or other violence of the sort. It is reassuring that the company is taking immediate measures to handle the situation.
