A mom of three who fosters rescued dogs, she helps drive the conversation about digital parenting as VP of Consumer Marketing for Content Watch.
May 24, 2017
Facebook’s motto is “Move Fast and Break Things,” but after the leaked moderator policy guidelines, it might want to consider “Move Fast and Fix Things.”
Facebook was under enormous political and user pressure in Europe and the US to formulate guidelines and integrate human moderators to monitor online abuse. But is adding 3,000 moderators enough – and are Facebook’s guidelines too general to be effective?
Leaked documents obtained by the Guardian reveal that Facebook handles nearly 54,000 reported cases of revenge pornography, violence and sextortion on the site in a single month.
Facebook depends on users to report most abusive content, so the problem could be even larger. The challenge for moderators, who already say they are overwhelmed by the volume of work, is to interpret the guidelines and make a snap decision on each content complaint in a matter of seconds.
“Facebook cannot keep control of its content,” said one source. “It has grown too big, too quickly.”
Many moderators are said to be concerned about inconsistencies in the guidelines. In particular, the guidelines on sexual content are viewed as complex and confusing. The Guardian created the video below to highlight just a few of the guidelines challenging moderators.
Monika Bickert, Facebook’s head of global policy management, said that with almost 2 billion users, it is difficult for Facebook to reach a consensus on what to allow. “We have a really diverse global community and people are going to have very different ideas about what is okay to share. No matter where you draw the line, there are always going to be some grey areas. For instance, the line between satire, humor and inappropriate content is sometimes very grey. It is very difficult to decide whether some things belong on the site or not,” she said.
I recognize the enormous task at hand to moderate user-generated content, but if you create a platform for users to express their views, shouldn’t safeguards to monitor that content be part of Facebook’s growth plan?
Some critics in the US and Europe want Facebook to be regulated in the same ways as mainstream broadcasters and publishers. In a report released this month, British Members of Parliament said, “the biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal or dangerous content, to implement proper community standards or to keep their users safe.”
Facebook’s response to this criticism was that it is “a new kind of company. It’s not a traditional technology company. It’s not a traditional media company. We build technology, and we feel responsible for how it’s used. We don’t write the news that people read on the platform,” said Bickert.
While you can’t fully trust any social media platform to remove images that may feature your child, there are steps you can take to have revenge porn or other photos of your child that were shared without consent taken down.
Other sharing platforms such as Twitter, Instagram and Flickr also have processes to remove images posted without consent – view our guide on Fighting Revenge Porn. As a first line of defense, consider installing Net Nanny parental controls on your family devices, so that when inappropriate language or searches are detected, parents receive a warning. In this new digital age, where children share everything, good and bad, families need to stay vigilant about keeping their kids safe online.