Commentary

Hiring 3,000 More Workers Won’t Fix Facebook’s Violent Video Crisis

By Shontavia Johnson and Bethany Cianciolo
May 9, 2017, 1:41 PM ET
SAO PAULO, BRAZIL - MARCH 16: Audience members stream the Tig fashion show live on Facebook and Snapchat at Sao Paulo Fashion Week Fall/Winter 2017 on March 16, 2017 in Sao Paulo. Mauricio Santana—Getty Images

Last week, Facebook announced that it is hiring 3,000 new people to monitor and remove inappropriate posts such as graphic and violent videos. This makes sense, given the many issues Facebook (FB) has faced since making video content its overwhelming priority over the past few years. The company has seen an influx of violent videos depicting murders, suicides, and rapes posted on the site, causing internal struggles over how to address this disturbing trend and external struggles with negative press and public backlash.

Last week’s hiring announcement builds on CEO Mark Zuckerberg’s earlier admissions that the company must be more responsive when inappropriate content is shared on the site. Hiring 3,000 new employees isn’t a bad start, but these future hires will not eliminate Facebook’s video problems. And it isn’t clear yet whether anything will.

Setting aside the broader public issue of eliminating societal violence, at the root of Facebook’s problem is its rush to roll out video services as quickly as possible. By many accounts, Facebook was much slower to integrate video into its platform than social media competitors like YouTube, Periscope, and Snapchat (SNAP). Because the company started at a competitive disadvantage, tools like Facebook Live were seemingly rushed to market without serious consideration of their potential negative consequences.

In one exclusive interview days before Facebook Live launched, Zuckerberg touted the product as “a great medium” for “raw and visceral content” created by users, without discussing its potential dangers at all. Another person familiar with Facebook Live’s development told The Wall Street Journal that the company “didn’t grasp the gravity of the medium” during its hurried, two-month rollout. Given this history, it isn’t surprising that Facebook has appeared unprepared to address the problems of violent videos and other inappropriate content.

In this vein, Facebook’s public commitment to hire more people, coupled with its earlier promises to improve technology and review reporting processes, seems somewhat hollow without more transparency about how these measures will be implemented. There are also questions about whether they will work at all, even if implemented perfectly.

Hiring more people will provide more “eyeballs on the ground,” but Facebook’s user base comprises nearly 2 billion people (roughly 25% of all human beings on the planet). The 3,000 new hires, in addition to the 4,500 people already on Facebook’s community operations team, can only scratch the surface of the billions of videos viewed each day on the platform. While human judgment is certainly helpful in determining which videos violate Facebook’s standards, the platform’s sheer scale makes it untenable for these new employees to completely solve the problem. Even Zuckerberg himself acknowledged the impossibility of the task last week, telling investors, “No matter how many people we have on the team, we’ll never be able to look at everything.”

Compounding Facebook’s violent video problem is the process by which videos reach these human reviewers. Facebook currently relies heavily on users to flag inappropriate content. On one hand, the number of reports can be enormous. On the other, some violent videos have gone hours without being reported at all. In addition, reports depend heavily on users’ definitions of “inappropriate”; some reported videos may not violate community standards at all.

This, of course, is why Facebook has emphasized its use of improved technology like computer algorithms and artificial intelligence. Such technology could at some point be capable of identifying content Facebook has predefined as inappropriate and removing it from the site. The technology has not reached that point yet, however, as Facebook learned in August of last year when it fired its entire Trending staff after allegations that some staffers engaged in political bias when selecting trending items. Instead of relying on humans, Facebook began to use mostly algorithms to select trending items. Within days, fake news stories began trending. It is clear that the technology is not yet reliable enough to eliminate the need for human engagement in the process.

Finally, it may be that self-policing is inadequate for addressing some inappropriate content, like violent and graphic videos. Unlike other communication media, such as television and radio, which are regulated by the federal government, Facebook and other social media platforms have largely been left to decide for themselves what communication is appropriate to share publicly. It may be time to rethink the approach of allowing companies like Facebook to regulate themselves and the content they provide.

Today’s limits on both human capacity and technology put Facebook in a position where no solution will be perfect. Even so, the company has both corporate and ethical obligations to pursue acceptable solutions to its violent video problem. Perhaps the only workable solution is one that incorporates human judgment, technology, and government intervention.

Shontavia Johnson serves as the Kern Family Chair in Intellectual Property Law and directs the Intellectual Property Law Center at Drake University Law School. She curates content related to law, innovation, and policy.
